Dec 05 11:48:34 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 05 11:48:34 crc restorecon[4698]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 11:48:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 11:48:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 11:48:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 11:48:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 11:48:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 11:48:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 11:48:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 11:48:35 crc restorecon[4698]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 
11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 11:48:35 crc 
restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 
11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 11:48:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 11:48:35 crc restorecon[4698]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 11:48:35 crc restorecon[4698]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 05 11:48:35 crc kubenswrapper[4763]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 05 11:48:35 crc kubenswrapper[4763]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 05 11:48:35 crc kubenswrapper[4763]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 05 11:48:35 crc kubenswrapper[4763]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 05 11:48:35 crc kubenswrapper[4763]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 05 11:48:35 crc kubenswrapper[4763]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.630068 4763 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632566 4763 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632585 4763 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632590 4763 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632595 4763 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632604 4763 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632609 4763 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632622 4763 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632627 4763 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632631 4763 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632636 4763 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632640 4763 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632644 4763 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632648 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632652 4763 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632655 4763 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632659 4763 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632662 4763 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632667 4763 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632672 4763 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632676 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632680 4763 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632685 4763 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632689 4763 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632693 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632697 4763 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632700 4763 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632704 4763 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632709 4763 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632712 4763 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632716 4763 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632720 4763 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632724 4763 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632727 4763 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632731 4763 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632735 4763 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632739 4763 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632742 4763 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632746 4763 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632750 4763 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632754 4763 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632776 4763 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632780 4763 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632783 4763 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632787 4763 
feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632790 4763 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632794 4763 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632797 4763 feature_gate.go:330] unrecognized feature gate: Example Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632801 4763 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632804 4763 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632808 4763 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632811 4763 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632816 4763 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632821 4763 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632828 4763 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632832 4763 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632836 4763 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632840 4763 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632844 4763 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632848 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632851 4763 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632856 4763 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632860 4763 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632864 4763 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632869 4763 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632873 4763 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632876 4763 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632880 4763 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632885 4763 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632889 4763 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632893 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.632896 4763 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633146 4763 flags.go:64] FLAG: --address="0.0.0.0" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633157 4763 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633164 4763 flags.go:64] FLAG: --anonymous-auth="true" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633170 4763 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633175 4763 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633181 4763 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633186 4763 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633192 4763 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633196 4763 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633200 4763 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633205 4763 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633209 4763 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633213 4763 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633217 4763 flags.go:64] FLAG: --cgroup-root="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633221 4763 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633225 4763 flags.go:64] FLAG: --client-ca-file="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633229 4763 flags.go:64] FLAG: --cloud-config="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633233 4763 flags.go:64] FLAG: --cloud-provider="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633237 4763 flags.go:64] FLAG: --cluster-dns="[]" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633242 4763 flags.go:64] FLAG: --cluster-domain="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 
11:48:35.633246 4763 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633250 4763 flags.go:64] FLAG: --config-dir="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633254 4763 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633259 4763 flags.go:64] FLAG: --container-log-max-files="5" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633264 4763 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633268 4763 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633272 4763 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633276 4763 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633280 4763 flags.go:64] FLAG: --contention-profiling="false" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633285 4763 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633289 4763 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633293 4763 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633297 4763 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633303 4763 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633307 4763 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633311 4763 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633315 4763 flags.go:64] FLAG: --enable-load-reader="false" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633319 4763 flags.go:64] FLAG: --enable-server="true" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633323 4763 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633329 4763 flags.go:64] FLAG: --event-burst="100" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633333 4763 flags.go:64] FLAG: --event-qps="50" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633338 4763 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633342 4763 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633346 4763 flags.go:64] FLAG: --eviction-hard="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633360 4763 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633365 4763 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633369 4763 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633373 4763 flags.go:64] FLAG: --eviction-soft="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633377 4763 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633381 4763 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 05 
11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633385 4763 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633390 4763 flags.go:64] FLAG: --experimental-mounter-path="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633393 4763 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633397 4763 flags.go:64] FLAG: --fail-swap-on="true" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633401 4763 flags.go:64] FLAG: --feature-gates="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633406 4763 flags.go:64] FLAG: --file-check-frequency="20s" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633410 4763 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633415 4763 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633419 4763 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633423 4763 flags.go:64] FLAG: --healthz-port="10248" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633427 4763 flags.go:64] FLAG: --help="false" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633432 4763 flags.go:64] FLAG: --hostname-override="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633436 4763 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633440 4763 flags.go:64] FLAG: --http-check-frequency="20s" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633444 4763 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633449 4763 flags.go:64] FLAG: --image-credential-provider-config="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633453 4763 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633457 4763 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633461 4763 flags.go:64] FLAG: --image-service-endpoint="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633464 4763 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633468 4763 flags.go:64] FLAG: --kube-api-burst="100" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633472 4763 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633477 4763 flags.go:64] FLAG: --kube-api-qps="50" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633481 4763 flags.go:64] FLAG: --kube-reserved="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633485 4763 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633488 4763 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633493 4763 flags.go:64] FLAG: --kubelet-cgroups="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633497 4763 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633501 4763 flags.go:64] FLAG: --lock-file="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633505 4763 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 05 11:48:35 crc kubenswrapper[4763]: 
I1205 11:48:35.633509 4763 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633513 4763 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633519 4763 flags.go:64] FLAG: --log-json-split-stream="false" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633523 4763 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633527 4763 flags.go:64] FLAG: --log-text-split-stream="false" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633531 4763 flags.go:64] FLAG: --logging-format="text" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633535 4763 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633543 4763 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633546 4763 flags.go:64] FLAG: --manifest-url="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633550 4763 flags.go:64] FLAG: --manifest-url-header="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633556 4763 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633560 4763 flags.go:64] FLAG: --max-open-files="1000000" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633565 4763 flags.go:64] FLAG: --max-pods="110" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633569 4763 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633573 4763 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633577 4763 flags.go:64] FLAG: --memory-manager-policy="None" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633581 4763 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633587 4763 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633592 4763 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633596 4763 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633605 4763 flags.go:64] FLAG: --node-status-max-images="50" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633610 4763 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633614 4763 flags.go:64] FLAG: --oom-score-adj="-999" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633618 4763 flags.go:64] FLAG: --pod-cidr="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633622 4763 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633627 4763 flags.go:64] FLAG: --pod-manifest-path="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633632 4763 flags.go:64] FLAG: --pod-max-pids="-1" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633636 4763 flags.go:64] FLAG: --pods-per-core="0" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633639 4763 flags.go:64] FLAG: --port="10250" Dec 05 11:48:35 crc 
kubenswrapper[4763]: I1205 11:48:35.633650 4763 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633654 4763 flags.go:64] FLAG: --provider-id="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633658 4763 flags.go:64] FLAG: --qos-reserved="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633662 4763 flags.go:64] FLAG: --read-only-port="10255" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633667 4763 flags.go:64] FLAG: --register-node="true" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633671 4763 flags.go:64] FLAG: --register-schedulable="true" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633675 4763 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633683 4763 flags.go:64] FLAG: --registry-burst="10" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633688 4763 flags.go:64] FLAG: --registry-qps="5" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633692 4763 flags.go:64] FLAG: --reserved-cpus="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633698 4763 flags.go:64] FLAG: --reserved-memory="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633703 4763 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633708 4763 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633713 4763 flags.go:64] FLAG: --rotate-certificates="false" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633718 4763 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633722 4763 flags.go:64] FLAG: --runonce="false" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633726 4763 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633731 4763 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633736 4763 flags.go:64] FLAG: --seccomp-default="false" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633740 4763 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633745 4763 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633750 4763 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633754 4763 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633774 4763 flags.go:64] FLAG: --storage-driver-password="root" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633778 4763 flags.go:64] FLAG: --storage-driver-secure="false" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633782 4763 flags.go:64] FLAG: --storage-driver-table="stats" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633787 4763 flags.go:64] FLAG: --storage-driver-user="root" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633791 4763 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633795 4763 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633800 4763 flags.go:64] FLAG: --system-cgroups="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 
11:48:35.633804 4763 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633811 4763 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633815 4763 flags.go:64] FLAG: --tls-cert-file="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633819 4763 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633825 4763 flags.go:64] FLAG: --tls-min-version="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633829 4763 flags.go:64] FLAG: --tls-private-key-file="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633834 4763 flags.go:64] FLAG: --topology-manager-policy="none" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633838 4763 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633842 4763 flags.go:64] FLAG: --topology-manager-scope="container" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633846 4763 flags.go:64] FLAG: --v="2" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633852 4763 flags.go:64] FLAG: --version="false" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633858 4763 flags.go:64] FLAG: --vmodule="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633867 4763 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.633872 4763 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634292 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634299 4763 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634304 4763 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634308 4763 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634312 4763 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634316 4763 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634320 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634323 4763 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634327 4763 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634331 4763 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634336 4763 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
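[editor's note] A note on reading the flags.go:64 dump that ends above: it records each command-line flag's value before the --config file is merged, so the values shown are not necessarily what the kubelet runs with. For example, --rotate-certificates="false" appears in the dump, yet "Client rotation is on" is logged further down, implying the config file overrides it; likewise --cgroup-driver="cgroupfs" is displaced by the systemd driver that the CRI runtime reports below. The sketch that follows shows config-file fields that typically supply the values left empty on the command line; the values here are hypothetical placeholders, not taken from this log.

    # Sketch with assumed values -- only the field names are certain.
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    clusterDomain: cluster.local   # --cluster-domain="" in the dump
    clusterDNS:                    # --cluster-dns="[]" in the dump
    - 10.0.0.10                    # hypothetical cluster DNS service IP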
Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634341 4763 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634345 4763 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634349 4763 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634353 4763 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634358 4763 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634362 4763 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634366 4763 feature_gate.go:330] unrecognized feature gate: Example Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634369 4763 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634373 4763 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634377 4763 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634381 4763 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634384 4763 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634388 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634392 4763 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634395 4763 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634399 4763 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634404 4763 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634408 4763 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634411 4763 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634416 4763 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634420 4763 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634424 4763 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634429 4763 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634433 4763 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634437 4763 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634442 4763 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634446 4763 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634450 4763 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634454 4763 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634459 4763 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634463 4763 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634468 4763 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634473 4763 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634477 4763 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634481 4763 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634486 4763 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634491 4763 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634495 4763 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634504 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634524 4763 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634528 4763 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634532 4763 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634536 4763 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634541 4763 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634545 4763 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634550 4763 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634554 4763 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634558 4763 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634562 4763 feature_gate.go:330] unrecognized feature gate: 
InsightsOnDemandDataGather Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634566 4763 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634571 4763 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634576 4763 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634586 4763 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634591 4763 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634595 4763 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634599 4763 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634605 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634611 4763 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634616 4763 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.634621 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.634668 4763 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.646818 4763 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.646875 4763 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.646972 4763 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.646983 4763 feature_gate.go:330] unrecognized feature gate: Example Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.646989 4763 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
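[editor's note] The feature_gate.go:386 line just above is the reconciled result of the preceding warnings: this kubelet is v1.31.5, so only gate names known to upstream Kubernetes 1.31's registry survive, while the OpenShift cluster-level names (GatewayAPI, PinnedImages, and so on) are dropped with an "unrecognized feature gate" warning. In config-file form the surviving map corresponds to a featureGates stanza like the abridged sketch below, mirroring the map as printed.

    # Sketch: featureGates equivalent of the reconciled map logged above
    # (abridged). The OpenShift-only names cannot be set here; the
    # upstream parser rejects them, as the warnings show.
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    featureGates:
      CloudDualStackNodeIPs: true
      DisableKubeletCloudCredentialProviders: true
      KMSv1: true
      ValidatingAdmissionPolicy: true
      DynamicResourceAllocation: false
      EventedPLEG: false
      NodeSwap: false
      UserNamespacesSupport: false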
Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.646999 4763 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647003 4763 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647008 4763 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647013 4763 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647018 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647023 4763 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647028 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647032 4763 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647038 4763 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647045 4763 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647052 4763 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647056 4763 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647063 4763 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647068 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647073 4763 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647078 4763 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647083 4763 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647087 4763 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647091 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647095 4763 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647100 4763 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647104 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647113 4763 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647117 4763 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647121 4763 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647124 4763 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647129 4763 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647133 4763 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647137 4763 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647141 4763 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647145 4763 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647150 4763 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647155 4763 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647158 4763 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647162 4763 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647166 4763 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647169 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647174 4763 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647179 4763 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647183 4763 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647187 4763 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647191 4763 feature_gate.go:330] 
unrecognized feature gate: InsightsConfig Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647195 4763 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647198 4763 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647202 4763 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647205 4763 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647209 4763 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647213 4763 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647216 4763 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647220 4763 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647224 4763 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647228 4763 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647232 4763 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647235 4763 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647239 4763 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647244 4763 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647247 4763 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647251 4763 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647254 4763 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647258 4763 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647262 4763 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647266 4763 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647269 4763 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647273 4763 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647277 4763 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647282 4763 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647288 4763 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647292 4763 
feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.647302 4763 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647440 4763 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647452 4763 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647456 4763 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647461 4763 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647464 4763 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647468 4763 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647472 4763 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647476 4763 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647480 4763 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647484 4763 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647488 4763 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647491 4763 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647495 4763 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647500 4763 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647505 4763 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647510 4763 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647514 4763 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647518 4763 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647522 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647526 4763 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647530 4763 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647534 4763 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647538 4763 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647542 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647545 4763 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647549 4763 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647553 4763 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647558 4763 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647562 4763 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647566 4763 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647570 4763 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647574 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647578 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647582 4763 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647586 4763 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647590 4763 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647594 4763 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647599 4763 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647603 4763 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647607 4763 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647612 4763 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647616 4763 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647620 4763 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647624 4763 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647628 4763 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647632 4763 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647635 4763 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647639 4763 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647642 4763 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647645 4763 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647649 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647653 4763 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647656 4763 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647661 4763 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647665 4763 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647669 4763 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 
11:48:35.647673 4763 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647677 4763 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647681 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647685 4763 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647689 4763 feature_gate.go:330] unrecognized feature gate: Example Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647692 4763 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647696 4763 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647700 4763 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647704 4763 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647708 4763 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647713 4763 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647717 4763 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647721 4763 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647724 4763 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.647728 4763 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.647735 4763 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.648600 4763 server.go:940] "Client rotation is on, will bootstrap in background" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.651612 4763 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.651722 4763 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
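[editor's note] The bootstrap.go:85 and certificate_store.go:130 lines above trace the standard kubelet client-TLS flow: if the kubeconfig at --kubeconfig still holds a valid client certificate, no bootstrap is needed; otherwise the --bootstrap-kubeconfig credentials would be used to request a fresh one, with the rotating pair kept under --cert-dir as kubelet-client-current.pem. The rotation reported in the next lines is governed by a single config-file field, sketched below; it is the config-file analogue of --rotate-certificates, which the FLAG dump showed at its false default, so the assumption is that kubelet.conf sets it.

    # Sketch: enables the client-certificate rotation logged by
    # certificate_manager.go below (assumed to be set in kubelet.conf,
    # since the command-line default was false).
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    rotateCertificates: true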
Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.652288 4763 server.go:997] "Starting client certificate rotation" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.652313 4763 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.652550 4763 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-19 20:40:07.880563759 +0000 UTC Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.652709 4763 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 344h51m32.227863461s for next certificate rotation Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.657354 4763 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.659855 4763 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.673712 4763 log.go:25] "Validated CRI v1 runtime API" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.690820 4763 log.go:25] "Validated CRI v1 image API" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.693030 4763 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.696311 4763 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-05-11-43-13-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.696365 4763 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.715632 4763 manager.go:217] Machine: {Timestamp:2025-12-05 11:48:35.714162784 +0000 UTC m=+0.206877547 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654132736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:0f54ec5d-9350-4cf9-9a1e-213de5460351 BootID:424a6449-0727-4c68-964f-19010b8ff35b Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730829824 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 
DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827068416 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:b1:4e:32 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:b1:4e:32 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:f8:4e:5f Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:2f:91:86 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:c4:f1:29 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:42:6f:78 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:1e:94:e3:c5:f9:65 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:d6:f1:1e:cd:4c:bd Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654132736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] 
SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.715951 4763 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.716436 4763 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.717260 4763 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.717449 4763 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.717485 4763 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.717721 4763 topology_manager.go:138] "Creating topology manager with none policy" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.717734 4763 container_manager_linux.go:303] "Creating device plugin manager" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.717969 4763 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.718001 4763 server.go:66] "Creating device plugin registration server" version="v1beta1" 
socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.718236 4763 state_mem.go:36] "Initialized new in-memory state store" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.718329 4763 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.719118 4763 kubelet.go:418] "Attempting to sync node with API server" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.719140 4763 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.719166 4763 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.719181 4763 kubelet.go:324] "Adding apiserver pod source" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.719194 4763 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.721350 4763 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.722021 4763 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.723956 4763 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.724010 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.724043 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused Dec 05 11:48:35 crc kubenswrapper[4763]: E1205 11:48:35.724162 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError" Dec 05 11:48:35 crc kubenswrapper[4763]: E1205 11:48:35.724166 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.724535 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.724575 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.724589 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.724601 4763 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/host-path" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.724622 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.724634 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.724646 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.724664 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.724689 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.724701 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.724719 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.724734 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.724992 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.725655 4763 server.go:1280] "Started kubelet" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.726285 4763 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.726791 4763 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.726524 4763 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 05 11:48:35 crc systemd[1]: Started Kubernetes Kubelet. 
Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.727981 4763 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 05 11:48:35 crc kubenswrapper[4763]: E1205 11:48:35.727660 4763 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.136:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187e4f51915dad72 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 11:48:35.72559397 +0000 UTC m=+0.218308703,LastTimestamp:2025-12-05 11:48:35.72559397 +0000 UTC m=+0.218308703,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.729098 4763 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.729142 4763 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.729266 4763 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 10:20:37.930712505 +0000 UTC Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.729386 4763 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.729411 4763 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 05 11:48:35 crc kubenswrapper[4763]: E1205 11:48:35.729485 4763 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.729595 4763 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 05 11:48:35 crc kubenswrapper[4763]: E1205 11:48:35.734236 4763 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="200ms" Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.734438 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused Dec 05 11:48:35 crc kubenswrapper[4763]: E1205 11:48:35.734514 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.735032 4763 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 05 11:48:35 crc 
kubenswrapper[4763]: I1205 11:48:35.735424 4763 factory.go:55] Registering systemd factory Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.735463 4763 factory.go:221] Registration of the systemd container factory successfully Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.736030 4763 factory.go:153] Registering CRI-O factory Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.736086 4763 factory.go:221] Registration of the crio container factory successfully Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.736125 4763 factory.go:103] Registering Raw factory Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.736145 4763 manager.go:1196] Started watching for new ooms in manager Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.736199 4763 server.go:460] "Adding debug handlers to kubelet server" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.738209 4763 manager.go:319] Starting recovery of all containers Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741177 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741222 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741234 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741245 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741256 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741266 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741277 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741287 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 05 11:48:35 crc 
kubenswrapper[4763]: I1205 11:48:35.741300 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741310 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741322 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741333 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741347 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741360 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741377 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741388 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741398 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741408 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741418 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741429 
4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741439 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741450 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741461 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741472 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741489 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741505 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741522 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741538 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741558 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741571 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741583 4763 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741596 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741610 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741622 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741636 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741648 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741662 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741675 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741703 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741717 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741730 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741744 4763 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741758 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741805 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741819 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741832 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741844 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741857 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741870 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741882 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741895 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741907 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741927 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741941 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741961 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741973 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741985 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.741995 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.742006 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.742017 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.742027 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.742037 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.742673 4763 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 05 11:48:35 crc 
kubenswrapper[4763]: I1205 11:48:35.742753 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.742786 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.742803 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.742819 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.742832 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.742848 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.742860 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.742871 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.742882 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.742893 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.742907 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: 
I1205 11:48:35.742921 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.742934 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.742946 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.742958 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.742966 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.742979 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.742992 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743004 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743017 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743031 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743044 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 
11:48:35.743058 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743071 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743085 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743099 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743113 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743128 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743143 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743157 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743171 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743217 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743234 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743246 4763 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743261 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743275 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743290 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743304 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743318 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743331 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743345 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743358 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743379 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743420 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743435 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743451 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743465 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743479 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743492 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743508 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743523 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743540 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743555 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743572 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743587 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743610 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743624 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743638 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743652 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743668 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743681 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743696 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743710 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743730 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743742 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743754 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743784 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743797 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743811 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743826 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743837 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743850 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743863 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743877 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743891 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743905 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743918 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743932 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743947 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743963 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743977 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.743990 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744004 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744019 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744034 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744052 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744066 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744081 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744097 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744111 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744126 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744141 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744157 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744170 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744184 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744202 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744215 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744227 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744240 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744252 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744264 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744277 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744292 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744305 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744319 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744333 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744347 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744359 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744374 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744387 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744404 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744415 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744430 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744442 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744454 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744466 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744480 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744495 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744507 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744520 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744535 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744551 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744565 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744578 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744590 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744603 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744616 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744630 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744644 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744658 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744671 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744686 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744699 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744718 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744731 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744745 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744801 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744817 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744830 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744845 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744860 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744875 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744890 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744904 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744917 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744929 4763 reconstruct.go:97] "Volume reconstruction finished" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.744938 4763 reconciler.go:26] "Reconciler: start to sync state" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.761629 4763 manager.go:324] Recovery completed Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.770581 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.772633 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.772691 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.772703 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.774443 4763 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.774461 4763 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.774484 4763 state_mem.go:36] "Initialized new in-memory state store" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.780536 4763 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.782641 4763 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.782690 4763 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.782720 4763 kubelet.go:2335] "Starting kubelet main sync loop" Dec 05 11:48:35 crc kubenswrapper[4763]: E1205 11:48:35.782861 4763 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 05 11:48:35 crc kubenswrapper[4763]: W1205 11:48:35.819726 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused Dec 05 11:48:35 crc kubenswrapper[4763]: E1205 11:48:35.819828 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.828067 4763 policy_none.go:49] "None policy: Start" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.829479 4763 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.829504 4763 state_mem.go:35] "Initializing new in-memory state store" Dec 05 11:48:35 crc kubenswrapper[4763]: E1205 11:48:35.829589 4763 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 05 11:48:35 crc kubenswrapper[4763]: E1205 11:48:35.883010 4763 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.905263 4763 manager.go:334] "Starting Device Plugin manager" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.905356 4763 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.905373 4763 server.go:79] "Starting device plugin registration server" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.905941 4763 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.905967 4763 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.906143 4763 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.906299 4763 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 05 11:48:35 crc kubenswrapper[4763]: I1205 11:48:35.906315 4763 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 05 11:48:35 crc kubenswrapper[4763]: E1205 11:48:35.921902 4763 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 05 11:48:35 crc kubenswrapper[4763]: E1205 11:48:35.935152 4763 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="400ms" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.006346 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.007869 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.007919 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.007941 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.007987 4763 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 11:48:36 crc kubenswrapper[4763]: E1205 11:48:36.008605 4763 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.136:6443: connect: connection refused" node="crc" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.083723 4763 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.083930 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.085671 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.085732 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.085755 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.086037 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.086452 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.086529 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.087331 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.087382 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.087407 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.087660 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.087888 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.087952 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.088583 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.088631 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.088652 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.089563 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.089607 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.089626 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.089746 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.089850 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.089876 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.090073 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.090235 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.090307 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.091572 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.091624 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.091649 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.092200 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.092251 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.092271 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.092450 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.092645 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.092710 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.093948 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.093978 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.093997 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.094015 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.094016 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.094124 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.094385 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.094430 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.095940 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.095990 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.096007 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.149838 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.149936 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.149988 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.150034 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.150080 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.150124 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.150186 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 
11:48:36.150233 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.150351 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.150416 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.150488 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.150522 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.150551 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.150580 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.150655 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.209481 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.210656 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.210721 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 
11:48:36.210746 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.210820 4763 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 11:48:36 crc kubenswrapper[4763]: E1205 11:48:36.211309 4763 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.136:6443: connect: connection refused" node="crc" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.252217 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.252270 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.252302 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.252336 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.252381 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.252410 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.252458 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.252422 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.252487 4763 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.252487 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.252538 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.252538 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.252552 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.252585 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.252588 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.252633 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.252664 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.252709 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.252753 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.252839 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.252878 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.252919 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.252960 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.252972 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.253065 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.252757 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.253063 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.253129 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.253137 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.253008 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 05 11:48:36 crc kubenswrapper[4763]: E1205 11:48:36.336944 4763 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="800ms"
Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.429447 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.458102 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Dec 05 11:48:36 crc kubenswrapper[4763]: W1205 11:48:36.466816 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-1b3ed1e503de74950d9ac506ff63cceafeaea3ac49f7a584e84c0eb098932b0b WatchSource:0}: Error finding container 1b3ed1e503de74950d9ac506ff63cceafeaea3ac49f7a584e84c0eb098932b0b: Status 404 returned error can't find the container with id 1b3ed1e503de74950d9ac506ff63cceafeaea3ac49f7a584e84c0eb098932b0b
Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.472737 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 11:48:36 crc kubenswrapper[4763]: W1205 11:48:36.483011 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-e344251e13e674da276b3c0c6f4e4b49ce722841f5dfebc94392c1b531ea2999 WatchSource:0}: Error finding container e344251e13e674da276b3c0c6f4e4b49ce722841f5dfebc94392c1b531ea2999: Status 404 returned error can't find the container with id e344251e13e674da276b3c0c6f4e4b49ce722841f5dfebc94392c1b531ea2999
Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.493808 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 05 11:48:36 crc kubenswrapper[4763]: W1205 11:48:36.500231 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-970388bc2480fa025da33ae0ca44521c41b6fd2fbc4ab8ec2838a0b10cda7b0f WatchSource:0}: Error finding container 970388bc2480fa025da33ae0ca44521c41b6fd2fbc4ab8ec2838a0b10cda7b0f: Status 404 returned error can't find the container with id 970388bc2480fa025da33ae0ca44521c41b6fd2fbc4ab8ec2838a0b10cda7b0f
Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.500831 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 05 11:48:36 crc kubenswrapper[4763]: W1205 11:48:36.513169 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-ea5617953a1d7479dc8600ccc9ebfe0f008ac534bc7b2e09035c9e9dad753a4d WatchSource:0}: Error finding container ea5617953a1d7479dc8600ccc9ebfe0f008ac534bc7b2e09035c9e9dad753a4d: Status 404 returned error can't find the container with id ea5617953a1d7479dc8600ccc9ebfe0f008ac534bc7b2e09035c9e9dad753a4d
Dec 05 11:48:36 crc kubenswrapper[4763]: W1205 11:48:36.524940 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-8d75424acfe7e2dcb6224a169c3d27fa3b8354d774fa903d92b5ce198782c091 WatchSource:0}: Error finding container 8d75424acfe7e2dcb6224a169c3d27fa3b8354d774fa903d92b5ce198782c091: Status 404 returned error can't find the container with id 8d75424acfe7e2dcb6224a169c3d27fa3b8354d774fa903d92b5ce198782c091
Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.611937 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.614067 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.614122 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.614136 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.614182 4763 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 05 11:48:36 crc kubenswrapper[4763]: E1205 11:48:36.614901 4763 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.136:6443: connect: connection refused" node="crc"
Dec 05 11:48:36 crc kubenswrapper[4763]: W1205 11:48:36.642108 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused
Dec 05 11:48:36 crc kubenswrapper[4763]: E1205 11:48:36.642240 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError"
Dec 05 11:48:36 crc kubenswrapper[4763]: W1205 11:48:36.660880 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused
Dec 05 11:48:36 crc kubenswrapper[4763]: E1205 11:48:36.661017 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError"
Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.728463 4763 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused
Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.729424 4763 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 21:47:18.690211206 +0000 UTC
Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.787039 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"8d75424acfe7e2dcb6224a169c3d27fa3b8354d774fa903d92b5ce198782c091"}
Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.788188 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ea5617953a1d7479dc8600ccc9ebfe0f008ac534bc7b2e09035c9e9dad753a4d"}
Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.789793 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"970388bc2480fa025da33ae0ca44521c41b6fd2fbc4ab8ec2838a0b10cda7b0f"}
Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.790534 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e344251e13e674da276b3c0c6f4e4b49ce722841f5dfebc94392c1b531ea2999"}
Dec 05 11:48:36 crc kubenswrapper[4763]: I1205 11:48:36.792292 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"1b3ed1e503de74950d9ac506ff63cceafeaea3ac49f7a584e84c0eb098932b0b"}
Dec 05 11:48:36 crc kubenswrapper[4763]: W1205 11:48:36.830267 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused
Dec 05 11:48:36 crc kubenswrapper[4763]: E1205 11:48:36.830342 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError"
Dec 05 11:48:37 crc kubenswrapper[4763]: E1205 11:48:37.138913 4763 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="1.6s"
Dec 05 11:48:37 crc kubenswrapper[4763]: W1205 11:48:37.327203 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused
Dec 05 11:48:37 crc kubenswrapper[4763]: E1205 11:48:37.327280 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError"
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.415137 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.416292 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.416316 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.416325 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.416344 4763 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 05 11:48:37 crc kubenswrapper[4763]: E1205 11:48:37.416812 4763 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.136:6443: connect: connection refused" node="crc"
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.728358 4763 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.730073 4763 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 11:53:17.046762772 +0000 UTC
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.730127 4763 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 216h4m39.316640871s for next certificate rotation
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.797547 4763 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="f54939667d3da9326369dc17207cec7eb95f69fe46040d3dd06d10dd3df8cce0" exitCode=0
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.797678 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.797741 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"f54939667d3da9326369dc17207cec7eb95f69fe46040d3dd06d10dd3df8cce0"}
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.799340 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.799378 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.799392 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.802169 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e5a3a0d8531f8696597e2389d8cd7b50130de93ac8be572b329ebe2eff8c6b48"}
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.802235 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2f1951ae154cb3c61f4cf3cadec35e19591efe1f768993d1e51713195175547b"}
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.802264 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"452ef64735a1911b994bd2bfbf1702e3ff68648cd0b834c254dd24f0b17cd9e1"}
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.802275 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.802288 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0370cc7bfd0ce95636924667b4e82dfffd0eee7f07b78d683b386bfee5dc47b3"}
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.804592 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.804698 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.804737 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.805015 4763 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec" exitCode=0
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.805251 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec"}
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.805171 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.808637 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.808669 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.808689 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.810103 4763 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9" exitCode=0
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.810298 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9"}
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.810458 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.812298 4763 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="decc233b0a3fa473b6f29afd1dc70853f594567111d7270cc8784d3fe8c37ffe" exitCode=0
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.812334 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"decc233b0a3fa473b6f29afd1dc70853f594567111d7270cc8784d3fe8c37ffe"}
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.812410 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.812431 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.812441 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.812817 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.813242 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.814282 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.814327 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.814344 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.814453 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.814500 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.814522 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:48:37 crc kubenswrapper[4763]: I1205 11:48:37.951184 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 05 11:48:38 crc kubenswrapper[4763]: E1205 11:48:38.092330 4763 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.136:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187e4f51915dad72 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 11:48:35.72559397 +0000 UTC m=+0.218308703,LastTimestamp:2025-12-05 11:48:35.72559397 +0000 UTC m=+0.218308703,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 05 11:48:38 crc kubenswrapper[4763]: W1205 11:48:38.373418 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused
Dec 05 11:48:38 crc kubenswrapper[4763]: E1205 11:48:38.373540 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError"
Dec 05 11:48:38 crc kubenswrapper[4763]: I1205 11:48:38.817142 4763 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756" exitCode=0
Dec 05 11:48:38 crc kubenswrapper[4763]: I1205 11:48:38.817241 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756"}
Dec 05 11:48:38 crc kubenswrapper[4763]: I1205 11:48:38.817279 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 11:48:38 crc kubenswrapper[4763]: I1205 11:48:38.818340 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:48:38 crc kubenswrapper[4763]: I1205 11:48:38.818364 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:48:38 crc kubenswrapper[4763]: I1205 11:48:38.818375 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:48:38 crc kubenswrapper[4763]: I1205 11:48:38.820608 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"48b017a073e89efb0646e4f13b985691b1d30366859803685d2b82b245ff7abe"}
Dec 05 11:48:38 crc kubenswrapper[4763]: I1205 11:48:38.820688 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 11:48:38 crc kubenswrapper[4763]: I1205 11:48:38.821628 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:48:38 crc kubenswrapper[4763]: I1205 11:48:38.821647 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:48:38 crc kubenswrapper[4763]: I1205 11:48:38.821658 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:48:38 crc kubenswrapper[4763]: I1205 11:48:38.824994 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 11:48:38 crc kubenswrapper[4763]: I1205 11:48:38.825020 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b9316ee7cdcfff2c1c600186568802e8c4d39ddcb029e17d8f109bb9f813e8d1"}
Dec 05 11:48:38 crc kubenswrapper[4763]: I1205 11:48:38.825076 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"36ff65480597f13016cbd96d14661a9cd428eff4ca644c88451ef9c28b1e015f"}
Dec 05 11:48:38 crc kubenswrapper[4763]: I1205 11:48:38.825091 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"745228c1f753278be19f5bb7197fd2dd01178d667213f59c8933c18a5d666c25"}
Dec 05 11:48:38 crc kubenswrapper[4763]: I1205 11:48:38.825907 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:48:38 crc kubenswrapper[4763]: I1205 11:48:38.825939 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:48:38 crc kubenswrapper[4763]: I1205 11:48:38.825949 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:48:38 crc kubenswrapper[4763]: I1205 11:48:38.829583 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 11:48:38 crc kubenswrapper[4763]: I1205 11:48:38.829983 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c"}
Dec 05 11:48:38 crc kubenswrapper[4763]: I1205 11:48:38.830012 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7"}
Dec 05 11:48:38 crc kubenswrapper[4763]: I1205 11:48:38.830026 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3"}
Dec 05 11:48:38 crc kubenswrapper[4763]: I1205 11:48:38.830037 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4"}
Dec 05 11:48:38 crc kubenswrapper[4763]: I1205 11:48:38.830516 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:48:38 crc kubenswrapper[4763]: I1205 11:48:38.830539 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:48:38 crc kubenswrapper[4763]: I1205 11:48:38.830553 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:48:39 crc kubenswrapper[4763]: I1205 11:48:39.016981 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 11:48:39 crc kubenswrapper[4763]: I1205 11:48:39.018211 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:48:39 crc kubenswrapper[4763]: I1205 11:48:39.018251 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:48:39 crc kubenswrapper[4763]: I1205 11:48:39.018261 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:48:39 crc kubenswrapper[4763]: I1205 11:48:39.018287 4763 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 05 11:48:39 crc kubenswrapper[4763]: I1205 11:48:39.837383 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd"}
Dec 05 11:48:39 crc kubenswrapper[4763]: I1205 11:48:39.837497 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 11:48:39 crc kubenswrapper[4763]: I1205 11:48:39.839051 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:48:39 crc kubenswrapper[4763]: I1205 11:48:39.839129 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:48:39 crc kubenswrapper[4763]: I1205 11:48:39.839157 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:48:39 crc kubenswrapper[4763]: I1205 11:48:39.842665 4763 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a" exitCode=0
Dec 05 11:48:39 crc kubenswrapper[4763]: I1205 11:48:39.842796 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 11:48:39 crc kubenswrapper[4763]: I1205 11:48:39.842899 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 05 11:48:39 crc kubenswrapper[4763]: I1205 11:48:39.842914 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a"}
Dec 05 11:48:39 crc kubenswrapper[4763]: I1205 11:48:39.842930 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 11:48:39 crc kubenswrapper[4763]: I1205 11:48:39.842987 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 11:48:39 crc kubenswrapper[4763]: I1205 11:48:39.843502 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 11:48:39 crc kubenswrapper[4763]: I1205 11:48:39.844492 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:48:39 crc kubenswrapper[4763]: I1205 11:48:39.844581 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:48:39 crc kubenswrapper[4763]: I1205 11:48:39.844632 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:48:39 crc kubenswrapper[4763]: I1205 11:48:39.844666 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:48:39 crc kubenswrapper[4763]: I1205 11:48:39.844685 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:48:39 crc kubenswrapper[4763]: I1205 11:48:39.844635 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:48:39 crc kubenswrapper[4763]: I1205 11:48:39.844549 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:48:39 crc kubenswrapper[4763]: I1205 11:48:39.844896 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:48:39 crc kubenswrapper[4763]: I1205 11:48:39.844905 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:48:39 crc kubenswrapper[4763]: I1205 11:48:39.844916 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:48:39 crc kubenswrapper[4763]: I1205 11:48:39.844927 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:48:39 crc kubenswrapper[4763]: I1205 11:48:39.844933 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:48:40 crc kubenswrapper[4763]: I1205 11:48:40.128131 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 11:48:40 crc kubenswrapper[4763]: I1205 11:48:40.329452 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 11:48:40 crc kubenswrapper[4763]: I1205 11:48:40.850804 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"25460658bdce8262dc2a0269161e8377dcb37f440c09c8eea9f7b397c9d3a5f2"}
Dec 05 11:48:40 crc kubenswrapper[4763]: I1205 11:48:40.850872 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 05 11:48:40 crc kubenswrapper[4763]: I1205 11:48:40.850888 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 11:48:40 crc kubenswrapper[4763]: I1205 11:48:40.850934 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 11:48:40 crc kubenswrapper[4763]: I1205 11:48:40.850885 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"12b24f5f9e73a1f4871d8d666bf6aabef79ec707d3df055e3e33a1a158dce464"}
Dec 05 11:48:40 crc kubenswrapper[4763]: I1205 11:48:40.850983 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"cd875ae389a8e48a2f958f87ef711a4053a54da61f415ef9a380f8b3de9d7012"}
Dec 05 11:48:40 crc kubenswrapper[4763]: I1205 11:48:40.851004 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"10b874a765fa9a075a2662b615419d246c610bf2eea128d4bf53c16bda49f38f"}
Dec 05 11:48:40 crc kubenswrapper[4763]: I1205 11:48:40.851056 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3d8aab7caa6c02035af6faae1817d239b12bc2afcb11c7e14c4aea1df60372dc"}
Dec 05 11:48:40 crc kubenswrapper[4763]: I1205 11:48:40.852062 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:48:40 crc kubenswrapper[4763]: I1205 11:48:40.852112 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:48:40 crc kubenswrapper[4763]: I1205 11:48:40.852141 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:48:40 crc kubenswrapper[4763]: I1205 11:48:40.852155 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:48:40 crc kubenswrapper[4763]: I1205 11:48:40.852163 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:48:40 crc kubenswrapper[4763]: I1205 11:48:40.852167 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:48:40 crc kubenswrapper[4763]: I1205 11:48:40.863930 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 05 11:48:40 crc kubenswrapper[4763]: I1205 11:48:40.864066 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 11:48:40 crc kubenswrapper[4763]: I1205 11:48:40.865025 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:48:40 crc kubenswrapper[4763]: I1205 11:48:40.865060 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:48:40 crc kubenswrapper[4763]: I1205 11:48:40.865074 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:48:40 crc kubenswrapper[4763]: I1205 11:48:40.869552 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 05 11:48:40 crc kubenswrapper[4763]: I1205 11:48:40.951808 4763 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 05 11:48:40 crc kubenswrapper[4763]: I1205 11:48:40.951933 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 05 11:48:41 crc kubenswrapper[4763]: I1205 11:48:41.854158 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 05 11:48:41 crc kubenswrapper[4763]: I1205 11:48:41.854269 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 11:48:41 crc kubenswrapper[4763]: I1205 11:48:41.854321 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 05 11:48:41 crc kubenswrapper[4763]: I1205 11:48:41.854269 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 11:48:41 crc kubenswrapper[4763]: I1205 11:48:41.854388 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 11:48:41 crc kubenswrapper[4763]: I1205 11:48:41.856420 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:48:41 crc kubenswrapper[4763]: I1205 11:48:41.856447 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:48:41 crc kubenswrapper[4763]: I1205 11:48:41.856736 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:48:41 crc kubenswrapper[4763]: I1205 11:48:41.856795 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:48:41 crc kubenswrapper[4763]: I1205 11:48:41.856687 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:48:41 crc kubenswrapper[4763]: I1205 11:48:41.856487 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:48:41 crc kubenswrapper[4763]: I1205 11:48:41.856916 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:48:41 crc kubenswrapper[4763]: I1205 11:48:41.856942 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:48:41 crc kubenswrapper[4763]: I1205 11:48:41.856856 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:48:41 crc kubenswrapper[4763]: I1205 11:48:41.995969 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 11:48:42 crc kubenswrapper[4763]: I1205 11:48:42.648120 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Dec 05 11:48:42 crc kubenswrapper[4763]: I1205 11:48:42.857621 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 11:48:42 crc kubenswrapper[4763]: I1205 11:48:42.857621 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 11:48:42 crc kubenswrapper[4763]: I1205 11:48:42.857636 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 11:48:42 crc kubenswrapper[4763]: I1205 11:48:42.859270 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:48:42 crc kubenswrapper[4763]: I1205 11:48:42.859312 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:48:42 crc kubenswrapper[4763]: I1205 11:48:42.859315 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:48:42 crc kubenswrapper[4763]: I1205 11:48:42.859350 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:48:42 crc kubenswrapper[4763]: I1205 11:48:42.859367 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:48:42 crc kubenswrapper[4763]: I1205 11:48:42.859321 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:48:42 crc kubenswrapper[4763]: I1205 11:48:42.859316 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:48:42 crc kubenswrapper[4763]: I1205 11:48:42.859528 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:48:42 crc kubenswrapper[4763]: I1205 11:48:42.859539 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:48:44 crc kubenswrapper[4763]: I1205 11:48:44.374240 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 05 11:48:44 crc kubenswrapper[4763]: I1205 11:48:44.374508 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 11:48:44 crc kubenswrapper[4763]: I1205 11:48:44.376143 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:48:44 crc kubenswrapper[4763]: I1205 11:48:44.376198 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:48:44 crc kubenswrapper[4763]: I1205 11:48:44.376210 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:48:45 crc kubenswrapper[4763]: I1205 11:48:45.795821 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 05 11:48:45 crc kubenswrapper[4763]: I1205 11:48:45.796131 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 11:48:45 crc kubenswrapper[4763]: I1205 11:48:45.797499 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:48:45 crc kubenswrapper[4763]: I1205 11:48:45.797552 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:48:45 crc kubenswrapper[4763]: I1205 11:48:45.797571 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:48:45 crc kubenswrapper[4763]: E1205 11:48:45.922016 4763 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Dec 05 11:48:47 crc kubenswrapper[4763]: I1205 11:48:47.043231 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 05 11:48:47 crc kubenswrapper[4763]: I1205 11:48:47.043391 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 11:48:47 crc kubenswrapper[4763]: I1205 11:48:47.044705 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:48:47 crc kubenswrapper[4763]: I1205 11:48:47.044750 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:48:47 crc kubenswrapper[4763]: I1205 11:48:47.044786 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:48:47 crc kubenswrapper[4763]: I1205 11:48:47.326094 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Dec 05 11:48:47 crc kubenswrapper[4763]: I1205 11:48:47.326280 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 11:48:47 crc kubenswrapper[4763]: I1205 11:48:47.327253 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:48:47 crc kubenswrapper[4763]: I1205 11:48:47.327284 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:48:47 crc kubenswrapper[4763]: I1205 11:48:47.327292 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:48:48 crc kubenswrapper[4763]: I1205 11:48:48.728678 4763 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Dec 05 11:48:48 crc kubenswrapper[4763]: E1205 11:48:48.745036 4763 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="3.2s"
Dec 05 11:48:48 crc kubenswrapper[4763]: W1205 11:48:48.864413 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Dec 05 11:48:48 crc kubenswrapper[4763]: I1205 11:48:48.864503 4763 trace.go:236] Trace[924032522]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 11:48:38.862) (total time: 10001ms):
Dec 05 11:48:48 crc kubenswrapper[4763]: Trace[924032522]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (11:48:48.864)
Dec 05 11:48:48 crc kubenswrapper[4763]: Trace[924032522]: [10.001658502s] [10.001658502s] END
Dec 05 11:48:48 crc kubenswrapper[4763]: E1205 11:48:48.864522 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Dec 05 11:48:48 crc kubenswrapper[4763]: W1205 11:48:48.961957 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Dec 05 11:48:48 crc kubenswrapper[4763]: I1205 11:48:48.962058 4763 trace.go:236] Trace[478574093]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 11:48:38.960) (total time: 10001ms):
Dec 05 11:48:48 crc kubenswrapper[4763]: Trace[478574093]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (11:48:48.961)
Dec 05 11:48:48 crc kubenswrapper[4763]: Trace[478574093]: [10.001601676s] [10.001601676s] END
Dec 05 11:48:48 crc kubenswrapper[4763]: E1205 11:48:48.962081 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Dec 05 11:48:48 crc kubenswrapper[4763]: W1205 11:48:48.967988 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Dec 05 11:48:48 crc kubenswrapper[4763]: I1205 11:48:48.968074 4763 trace.go:236] Trace[1995200734]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 11:48:38.967) (total time: 10000ms):
Dec 05 11:48:48 crc kubenswrapper[4763]: Trace[1995200734]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (11:48:48.967)
Dec 05 11:48:48 crc kubenswrapper[4763]: Trace[1995200734]: [10.000770838s] [10.000770838s] END
Dec 05 11:48:48 crc kubenswrapper[4763]: E1205 11:48:48.968098 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Dec 05 11:48:49 crc kubenswrapper[4763]: E1205 11:48:49.020320 4763 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc"
Dec 05 11:48:49 crc kubenswrapper[4763]: I1205 11:48:49.783992 4763 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Dec 05 11:48:49 crc kubenswrapper[4763]: I1205 11:48:49.784597 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Dec 05 11:48:49 crc kubenswrapper[4763]: I1205 11:48:49.792284 4763 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Dec 05 11:48:49 crc kubenswrapper[4763]: I1205 11:48:49.792372 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Dec 05 11:48:50 crc kubenswrapper[4763]: I1205 11:48:50.335248 4763 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Dec 05 11:48:50 crc kubenswrapper[4763]: [+]log ok
Dec 05 11:48:50 crc kubenswrapper[4763]: [+]etcd ok
Dec 05 11:48:50 crc kubenswrapper[4763]: [+]poststarthook/start-apiserver-admission-initializer ok
Dec 05 11:48:50 crc kubenswrapper[4763]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Dec 05 11:48:50 crc kubenswrapper[4763]: [+]poststarthook/openshift.io-api-request-count-filter ok
Dec 05 11:48:50 crc kubenswrapper[4763]: [+]poststarthook/openshift.io-startkubeinformers ok
Dec 05 11:48:50 crc kubenswrapper[4763]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok
Dec 05 11:48:50 crc kubenswrapper[4763]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok
Dec 05 11:48:50 crc kubenswrapper[4763]: [+]poststarthook/generic-apiserver-start-informers ok
Dec 05 11:48:50 crc kubenswrapper[4763]: [+]poststarthook/priority-and-fairness-config-consumer ok
Dec 05 11:48:50 crc kubenswrapper[4763]: [+]poststarthook/priority-and-fairness-filter ok
Dec 05 11:48:50 crc kubenswrapper[4763]: [+]poststarthook/storage-object-count-tracker-hook ok
Dec 05 11:48:50 crc kubenswrapper[4763]: [+]poststarthook/start-apiextensions-informers ok
Dec 05 11:48:50 crc kubenswrapper[4763]: [+]poststarthook/start-apiextensions-controllers ok
Dec 05 11:48:50 crc kubenswrapper[4763]: [+]poststarthook/crd-informer-synced ok
Dec 05 11:48:50 crc kubenswrapper[4763]: [+]poststarthook/start-system-namespaces-controller ok
Dec 05 11:48:50 crc kubenswrapper[4763]: [+]poststarthook/start-cluster-authentication-info-controller ok
Dec 05 11:48:50 crc kubenswrapper[4763]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok
Dec 05 11:48:50 crc kubenswrapper[4763]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
Dec 05 11:48:50 crc kubenswrapper[4763]: [+]poststarthook/start-legacy-token-tracking-controller ok
Dec 05 11:48:50 crc kubenswrapper[4763]: [+]poststarthook/start-service-ip-repair-controllers ok
Dec 05 11:48:50 crc kubenswrapper[4763]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld
Dec 05 11:48:50 crc kubenswrapper[4763]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
Dec 05 11:48:50 crc kubenswrapper[4763]: [+]poststarthook/priority-and-fairness-config-producer ok
kubenswrapper[4763]: [+]poststarthook/priority-and-fairness-config-producer ok Dec 05 11:48:50 crc kubenswrapper[4763]: [+]poststarthook/bootstrap-controller ok Dec 05 11:48:50 crc kubenswrapper[4763]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Dec 05 11:48:50 crc kubenswrapper[4763]: [+]poststarthook/start-kube-aggregator-informers ok Dec 05 11:48:50 crc kubenswrapper[4763]: [+]poststarthook/apiservice-status-local-available-controller ok Dec 05 11:48:50 crc kubenswrapper[4763]: [+]poststarthook/apiservice-status-remote-available-controller ok Dec 05 11:48:50 crc kubenswrapper[4763]: [+]poststarthook/apiservice-registration-controller ok Dec 05 11:48:50 crc kubenswrapper[4763]: [+]poststarthook/apiservice-wait-for-first-sync ok Dec 05 11:48:50 crc kubenswrapper[4763]: [+]poststarthook/apiservice-discovery-controller ok Dec 05 11:48:50 crc kubenswrapper[4763]: [+]poststarthook/kube-apiserver-autoregistration ok Dec 05 11:48:50 crc kubenswrapper[4763]: [+]autoregister-completion ok Dec 05 11:48:50 crc kubenswrapper[4763]: [+]poststarthook/apiservice-openapi-controller ok Dec 05 11:48:50 crc kubenswrapper[4763]: [+]poststarthook/apiservice-openapiv3-controller ok Dec 05 11:48:50 crc kubenswrapper[4763]: livez check failed Dec 05 11:48:50 crc kubenswrapper[4763]: I1205 11:48:50.335327 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 11:48:50 crc kubenswrapper[4763]: I1205 11:48:50.952911 4763 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 11:48:50 crc kubenswrapper[4763]: I1205 11:48:50.952986 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 05 11:48:52 crc kubenswrapper[4763]: I1205 11:48:52.221293 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 11:48:52 crc kubenswrapper[4763]: I1205 11:48:52.222664 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:52 crc kubenswrapper[4763]: I1205 11:48:52.222714 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:52 crc kubenswrapper[4763]: I1205 11:48:52.222731 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:52 crc kubenswrapper[4763]: I1205 11:48:52.222824 4763 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 11:48:52 crc kubenswrapper[4763]: I1205 11:48:52.659401 4763 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 05 11:48:52 crc kubenswrapper[4763]: I1205 11:48:52.978124 4763 reflector.go:368] Caches populated for *v1.RuntimeClass from 
Dec 05 11:48:53 crc kubenswrapper[4763]: I1205 11:48:53.675871 4763 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Dec 05 11:48:54 crc kubenswrapper[4763]: I1205 11:48:54.783665 4763 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Dec 05 11:48:54 crc kubenswrapper[4763]: I1205 11:48:54.783789 4763 trace.go:236] Trace[503505308]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 11:48:42.924) (total time: 11859ms):
Dec 05 11:48:54 crc kubenswrapper[4763]: Trace[503505308]: ---"Objects listed" error:<nil> 11858ms (11:48:54.783)
Dec 05 11:48:54 crc kubenswrapper[4763]: Trace[503505308]: [11.85900108s] [11.85900108s] END
Dec 05 11:48:54 crc kubenswrapper[4763]: I1205 11:48:54.783805 4763 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Dec 05 11:48:54 crc kubenswrapper[4763]: I1205 11:48:54.844277 4763 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": EOF" start-of-body=
Dec 05 11:48:54 crc kubenswrapper[4763]: I1205 11:48:54.844302 4763 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": EOF" start-of-body=
Dec 05 11:48:54 crc kubenswrapper[4763]: I1205 11:48:54.844343 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": EOF"
Dec 05 11:48:54 crc kubenswrapper[4763]: I1205 11:48:54.844359 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": EOF"
Dec 05 11:48:54 crc kubenswrapper[4763]: I1205 11:48:54.891226 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Dec 05 11:48:54 crc kubenswrapper[4763]: I1205 11:48:54.893295 4763 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd" exitCode=255
Dec 05 11:48:54 crc kubenswrapper[4763]: I1205 11:48:54.893341 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd"}
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.063243 4763 scope.go:117] "RemoveContainer" containerID="3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.334159 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.732303 4763 apiserver.go:52] "Watching apiserver"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.734992 4763 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.735371 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xbr2p","openshift-machine-config-operator/machine-config-daemon-xpgln","openshift-multus/multus-additional-cni-plugins-92qpt","openshift-multus/multus-kwkp4","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-dns/node-resolver-gt7x4","openshift-kube-apiserver/kube-apiserver-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-operator/iptables-alerter-4ln5h"]
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.735684 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.735701 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 05 11:48:55 crc kubenswrapper[4763]: E1205 11:48:55.735745 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.735881 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 11:48:55 crc kubenswrapper[4763]: E1205 11:48:55.735973 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.736269 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.736313 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.736387 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.736438 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-xpgln"
Dec 05 11:48:55 crc kubenswrapper[4763]: E1205 11:48:55.736705 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.737021 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-gt7x4"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.737021 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-kwkp4"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.737117 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.737072 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-92qpt"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.740555 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.740991 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.741083 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.741189 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.741195 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.741221 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.741246 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.742449 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.742711 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.742816 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.743011 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.743182 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.743199 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.743321 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.743332 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.743467 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.743513 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.743686 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.743824 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.744375 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.745960 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.746136 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.746284 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.746427 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.746560 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.746711 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.746748 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.747010 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.747265 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.750122 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.750233 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.762642 4763 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.773281 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.784789 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kwkp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"737ae453-c22e-41ea-a10e-7e8f1f165467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q79tg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kwkp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.794004 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96338136-6831-49d0-9eb9-77d1205c6afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpgln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.802132 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gt7x4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2515d84f-5782-48a0-9d7e-9704baebe26c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxkb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gt7x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.813431 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-92qpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.823602 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.831217 4763 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.833549 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.845169 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.854110 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.863058 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.870349 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.884084 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b42a5472-7487-4146-87a1-b83999821399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbr2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.889435 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.889466 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.889484 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.889511 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.889527 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.889541 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.889557 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.889572 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.889587 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.889601 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.889618 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.889633 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.889646 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.889660 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.889674 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.889694 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.889709 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.889724 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.889738 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.889755 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.889791 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.889812 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.889833 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.889848 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.889861 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.889876 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.889892 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.889905 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.889920 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.889933 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.889948 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.889962 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.890211 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.890241 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.890364 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.890387 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.890404 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.890421 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.890439 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.890458 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.890526 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.890543 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.890562 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.890577 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.890598 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.890616 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.890692 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.890710 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.890728 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.890747 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.890768 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.890808 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.890828 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.890843 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.890860 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.890878 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.890984 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.891000 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.891018 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.891037 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.891052 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.891179 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.891354 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.891377 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.891397 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.891415 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.891515 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.891541 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.891663 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.891687 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.891706 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.891724 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.891739 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.891847 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.891869 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.891887 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.891905 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.891923 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.891941 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.891957 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.892016 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.892034 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.892051 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.892071 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.892088 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.892104 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.892153 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.892171 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.892189 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.892205 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.892223 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.892312 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.892330 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.892348 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.892366 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.892384 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.892400 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.892534 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.892554 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.892571 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.892589 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.892608 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.892695 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.892717 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.892735 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.892757 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.892797 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.892898 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.892923 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.892940 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.892958 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.892977 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.892995 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.893086 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.893105 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.893124 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.893140 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.893158 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.893242 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.893260 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.893279 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.893298 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.893316 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.893333 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.893402 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.893421 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.893452 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.893471 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.893491 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.893547 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.894704 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.899302 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.899388 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.899496 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.899528 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.899794 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.899818 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.899843 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.900051 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.900068 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.900093 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.900140 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.900164 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.900469 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.900472 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.900509 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.900600 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.900715 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.900767 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.900514 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserv
er-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T11:48:54Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 11:48:54.822889 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 11:48:54.823032 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 11:48:54.823055 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 11:48:54.823033 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 11:48:54.823371 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764935318\\\\\\\\\\\\\\\" (2025-12-05 11:48:38 +0000 UTC to 2026-01-04 11:48:39 +0000 UTC (now=2025-12-05 11:48:54.823293564 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823498 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\"\\\\nI1205 11:48:54.823746 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764935329\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764935329\\\\\\\\\\\\\\\" (2025-12-05 10:48:49 +0000 UTC to 2026-12-05 10:48:49 +0000 UTC (now=2025-12-05 11:48:54.823700006 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823836 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 11:48:54.823876 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 11:48:54.823905 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1205 11:48:54.826863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.900864 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.900906 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.900944 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.901114 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.899947 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.901523 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.901747 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.901755 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.901805 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.901912 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.902033 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.902042 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.902264 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.902384 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.902437 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.902452 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.902478 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.902629 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.902645 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.902637 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.902689 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.902705 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.902827 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.902832 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.902857 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.903016 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.903371 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.903415 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.903623 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.903868 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.903975 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.903976 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.904192 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.904206 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.904431 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.904551 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.904665 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.905097 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.905650 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.904344 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.904412 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.904598 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.904576 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.905003 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: E1205 11:48:55.905310 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:48:56.40525674 +0000 UTC m=+20.897971463 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.909384 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.909523 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.909705 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.909841 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.909950 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.910042 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.910141 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.910235 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.910335 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: 
\"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.910443 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.910562 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.910663 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.910825 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.910930 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.909390 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.911660 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.911612 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.911828 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.911854 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.911928 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.912210 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.909413 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.905607 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.905623 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.905897 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.906155 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.906184 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.906235 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.906388 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.906379 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.907173 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.907299 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.907354 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.907816 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.907927 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.908127 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.908196 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.908296 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.908310 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.908467 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.908590 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). 
InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.908645 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.909090 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.909433 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.909973 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.910004 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.910034 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.910103 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.910141 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.910115 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.910151 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.910362 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.910369 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.910376 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.910655 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.910995 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.911020 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.911132 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.911216 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.911447 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.905522 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.911594 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.912530 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.912722 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.912821 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.912868 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.913024 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.913223 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.913349 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.913386 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.913394 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.913521 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.913557 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.913273 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.913762 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.913789 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.913891 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.913977 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). 
InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.914112 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.914191 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.914314 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.914576 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.914834 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.914970 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.915029 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.915065 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). 
InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.915084 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.915204 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.915594 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.915630 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.915449 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.915210 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.915412 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.915915 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.915943 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.915965 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.915988 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.915756 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.916010 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.916052 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.916081 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.916084 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.916108 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.916133 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.916141 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.916157 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.916182 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.916198 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.916206 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.916418 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.916464 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.916481 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.916501 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.916710 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.916857 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.916907 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.917132 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.917162 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.917179 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.917214 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.917231 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.917249 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.917260 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.917267 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.917289 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.917315 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.917341 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.917388 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.917481 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.917580 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.917586 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.917602 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 
11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.917685 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.917697 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.917724 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.917811 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.918363 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.917918 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.918506 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.918531 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.918558 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.918585 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.918609 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.918632 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.918654 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.918677 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.918701 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: 
\"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.918724 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.918748 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.918898 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/96338136-6831-49d0-9eb9-77d1205c6afb-rootfs\") pod \"machine-config-daemon-xpgln\" (UID: \"96338136-6831-49d0-9eb9-77d1205c6afb\") " pod="openshift-machine-config-operator/machine-config-daemon-xpgln" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.918843 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.918929 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eb92473b-8e13-46cd-9c26-9ef67d1d6e5f-system-cni-dir\") pod \"multus-additional-cni-plugins-92qpt\" (UID: \"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\") " pod="openshift-multus/multus-additional-cni-plugins-92qpt" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.918954 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/eb92473b-8e13-46cd-9c26-9ef67d1d6e5f-cnibin\") pod \"multus-additional-cni-plugins-92qpt\" (UID: \"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\") " pod="openshift-multus/multus-additional-cni-plugins-92qpt" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.918977 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpjxx\" (UniqueName: \"kubernetes.io/projected/eb92473b-8e13-46cd-9c26-9ef67d1d6e5f-kube-api-access-jpjxx\") pod \"multus-additional-cni-plugins-92qpt\" (UID: \"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\") " pod="openshift-multus/multus-additional-cni-plugins-92qpt" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919006 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919029 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-run-openvswitch\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919049 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-run-ovn-kubernetes\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919073 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-cnibin\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919098 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-host-var-lib-cni-bin\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919120 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/eb92473b-8e13-46cd-9c26-9ef67d1d6e5f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-92qpt\" (UID: \"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\") " pod="openshift-multus/multus-additional-cni-plugins-92qpt" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919141 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-cni-netd\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919167 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919192 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/96338136-6831-49d0-9eb9-77d1205c6afb-mcd-auth-proxy-config\") pod \"machine-config-daemon-xpgln\" (UID: \"96338136-6831-49d0-9eb9-77d1205c6afb\") " pod="openshift-machine-config-operator/machine-config-daemon-xpgln" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919201 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919214 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-run-netns\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919236 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-var-lib-openvswitch\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919243 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919259 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-etc-openvswitch\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919317 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919342 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-host-run-multus-certs\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919363 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q79tg\" (UniqueName: \"kubernetes.io/projected/737ae453-c22e-41ea-a10e-7e8f1f165467-kube-api-access-q79tg\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919386 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-kubelet\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919407 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/b42a5472-7487-4146-87a1-b83999821399-env-overrides\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919428 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nq2w\" (UniqueName: \"kubernetes.io/projected/b42a5472-7487-4146-87a1-b83999821399-kube-api-access-6nq2w\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919451 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919477 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919494 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-host-var-lib-kubelet\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919509 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/eb92473b-8e13-46cd-9c26-9ef67d1d6e5f-os-release\") pod \"multus-additional-cni-plugins-92qpt\" (UID: \"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\") " pod="openshift-multus/multus-additional-cni-plugins-92qpt" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919531 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/eb92473b-8e13-46cd-9c26-9ef67d1d6e5f-cni-binary-copy\") pod \"multus-additional-cni-plugins-92qpt\" (UID: \"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\") " pod="openshift-multus/multus-additional-cni-plugins-92qpt" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919569 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919572 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). 
InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919588 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/96338136-6831-49d0-9eb9-77d1205c6afb-proxy-tls\") pod \"machine-config-daemon-xpgln\" (UID: \"96338136-6831-49d0-9eb9-77d1205c6afb\") " pod="openshift-machine-config-operator/machine-config-daemon-xpgln" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919605 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b42a5472-7487-4146-87a1-b83999821399-ovnkube-script-lib\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919622 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-systemd-units\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919643 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-slash\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919644 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919663 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b42a5472-7487-4146-87a1-b83999821399-ovn-node-metrics-cert\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919681 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-node-log\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919699 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-multus-cni-dir\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919716 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-os-release\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919789 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-host-run-netns\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919814 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919838 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919838 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.918133 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.919916 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-cni-bin\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.920082 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.920208 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.920218 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.920142 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.920190 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b42a5472-7487-4146-87a1-b83999821399-ovnkube-config\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.920291 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.920322 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2515d84f-5782-48a0-9d7e-9704baebe26c-hosts-file\") pod \"node-resolver-gt7x4\" (UID: \"2515d84f-5782-48a0-9d7e-9704baebe26c\") " pod="openshift-dns/node-resolver-gt7x4" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.920346 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-run-systemd\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.920402 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-run-ovn\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.920428 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-multus-socket-dir-parent\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.920451 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.920478 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-hostroot\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.920707 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.920866 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.920888 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.920899 4763 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.921000 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.921261 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: E1205 11:48:55.921383 4763 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 11:48:55 crc kubenswrapper[4763]: E1205 11:48:55.921582 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 11:48:56.421564169 +0000 UTC m=+20.914278982 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.921638 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.921944 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.921947 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.921469 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.922197 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.920925 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgs2t\" (UniqueName: \"kubernetes.io/projected/96338136-6831-49d0-9eb9-77d1205c6afb-kube-api-access-xgs2t\") pod \"machine-config-daemon-xpgln\" (UID: \"96338136-6831-49d0-9eb9-77d1205c6afb\") " pod="openshift-machine-config-operator/machine-config-daemon-xpgln" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.922343 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxkb4\" (UniqueName: \"kubernetes.io/projected/2515d84f-5782-48a0-9d7e-9704baebe26c-kube-api-access-lxkb4\") pod \"node-resolver-gt7x4\" (UID: \"2515d84f-5782-48a0-9d7e-9704baebe26c\") " pod="openshift-dns/node-resolver-gt7x4" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.922375 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.922439 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-host-run-k8s-cni-cncf-io\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.922460 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-host-var-lib-cni-multus\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.922482 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/737ae453-c22e-41ea-a10e-7e8f1f165467-multus-daemon-config\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.922483 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.922494 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.922506 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.922533 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-system-cni-dir\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.922557 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.922578 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-log-socket\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:55 crc kubenswrapper[4763]: E1205 11:48:55.922594 4763 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.922598 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-multus-conf-dir\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.922621 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-etc-kubernetes\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:55 crc kubenswrapper[4763]: E1205 11:48:55.922650 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 11:48:56.422632045 +0000 UTC m=+20.915346858 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.922675 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/eb92473b-8e13-46cd-9c26-9ef67d1d6e5f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-92qpt\" (UID: \"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\") " pod="openshift-multus/multus-additional-cni-plugins-92qpt" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.922714 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.922765 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.922984 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/737ae453-c22e-41ea-a10e-7e8f1f165467-cni-binary-copy\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923103 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923122 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923135 4763 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923147 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923158 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923170 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923184 4763 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923194 4763 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923206 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923217 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923228 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923240 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923250 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923260 4763 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923271 4763 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923281 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923291 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923302 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923315 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923327 4763 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923338 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923350 4763 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923362 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923373 4763 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923386 4763 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923397 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923410 4763 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923424 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923438 4763 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923451 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923462 4763 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923473 4763 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923520 4763 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923534 4763 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923547 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923558 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923568 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923579 4763 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923583 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923591 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.918147 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923656 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923670 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923680 4763 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923690 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923699 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923710 4763 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923719 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923730 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923741 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923750 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923760 4763 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923768 4763 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923800 4763 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923811 4763 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923824 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923833 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923843 4763 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923852 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923862 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923872 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923882 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923891 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923901 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923910 4763 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923919 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923928 4763 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923937 4763 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923945 4763 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923954 4763 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923963 4763 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923972 4763 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923981 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923990 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.923998 4763 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924007 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924015 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924024 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924033 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924042 4763 
reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924050 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924059 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924067 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924076 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924084 4763 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924093 4763 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924103 4763 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924111 4763 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924119 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924128 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924136 4763 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924145 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924153 4763 
reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924162 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924171 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924181 4763 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924190 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924198 4763 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924207 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924276 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924286 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924294 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924302 4763 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924311 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924321 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924329 4763 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924338 4763 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924346 4763 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924355 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924364 4763 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924372 4763 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924380 4763 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924389 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924397 4763 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924405 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924415 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924425 4763 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924435 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924443 4763 reconciler_common.go:293] 
"Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924452 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924461 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924470 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924478 4763 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924486 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924494 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924509 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924540 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924550 4763 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924558 4763 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924569 4763 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924578 4763 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924587 4763 
reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924595 4763 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924605 4763 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924614 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924622 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924634 4763 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924645 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924657 4763 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924668 4763 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924679 4763 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924689 4763 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924697 4763 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924706 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:55 crc kubenswrapper[4763]: 
I1205 11:48:55.924714 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924723 4763 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924730 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924740 4763 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924748 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924771 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924781 4763 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924793 4763 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924805 4763 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924815 4763 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924824 4763 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924832 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924841 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924852 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924859 4763 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924868 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924877 4763 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924885 4763 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924894 4763 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924903 4763 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924912 4763 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924921 4763 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924930 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.924938 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.926946 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T11:48:54Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 11:48:54.822889 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 11:48:54.823032 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 11:48:54.823055 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 11:48:54.823033 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 11:48:54.823371 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764935318\\\\\\\\\\\\\\\" (2025-12-05 11:48:38 +0000 UTC to 2026-01-04 11:48:39 +0000 UTC (now=2025-12-05 11:48:54.823293564 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823498 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\"\\\\nI1205 11:48:54.823746 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764935329\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764935329\\\\\\\\\\\\\\\" (2025-12-05 10:48:49 +0000 UTC to 2026-12-05 10:48:49 +0000 UTC (now=2025-12-05 11:48:54.823700006 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823836 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 11:48:54.823876 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 11:48:54.823905 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1205 11:48:54.826863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.929736 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4"}
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.930457 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.930859 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.931883 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.935330 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.939014 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.939693 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.939808 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: E1205 11:48:55.940261 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 05 11:48:55 crc kubenswrapper[4763]: E1205 11:48:55.940287 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 05 11:48:55 crc kubenswrapper[4763]: E1205 11:48:55.940302 4763 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.940297 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 05 11:48:55 crc kubenswrapper[4763]: E1205 11:48:55.940355 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 05 11:48:55 crc kubenswrapper[4763]: E1205 11:48:55.940384 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 05 11:48:55 crc kubenswrapper[4763]: E1205 11:48:55.940362 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 11:48:56.440344952 +0000 UTC m=+20.933059745 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 11:48:55 crc kubenswrapper[4763]: E1205 11:48:55.940418 4763 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 11:48:55 crc kubenswrapper[4763]: E1205 11:48:55.940498 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 11:48:56.440461462 +0000 UTC m=+20.933176185 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.940877 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.942578 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.943306 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: E1205 11:48:55.944054 4763 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.944832 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.945034 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.945354 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.946715 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.946855 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.947565 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.948122 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.954503 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.954604 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.954734 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.954859 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.954974 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.955128 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.955249 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.955695 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.955747 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.957052 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.957418 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.958073 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.958268 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.965199 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.968155 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.972227 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.975446 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.978920 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b42a5472-7487-4146-87a1-b83999821399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbr2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.982281 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.988017 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kwkp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"737ae453-c22e-41ea-a10e-7e8f1f165467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q79tg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kwkp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 05 11:48:55 crc kubenswrapper[4763]: I1205 11:48:55.997328 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.004373 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96338136-6831-49d0-9eb9-77d1205c6afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpgln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.011628 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gt7x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2515d84f-5782-48a0-9d7e-9704baebe26c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxkb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gt7x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.020408 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-92qpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.025701 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/eb92473b-8e13-46cd-9c26-9ef67d1d6e5f-os-release\") pod 
\"multus-additional-cni-plugins-92qpt\" (UID: \"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\") " pod="openshift-multus/multus-additional-cni-plugins-92qpt" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.025734 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/eb92473b-8e13-46cd-9c26-9ef67d1d6e5f-cni-binary-copy\") pod \"multus-additional-cni-plugins-92qpt\" (UID: \"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\") " pod="openshift-multus/multus-additional-cni-plugins-92qpt" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.025759 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-kubelet\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.025792 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b42a5472-7487-4146-87a1-b83999821399-env-overrides\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.025812 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nq2w\" (UniqueName: \"kubernetes.io/projected/b42a5472-7487-4146-87a1-b83999821399-kube-api-access-6nq2w\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.025831 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.025862 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-host-var-lib-kubelet\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.025882 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/96338136-6831-49d0-9eb9-77d1205c6afb-proxy-tls\") pod \"machine-config-daemon-xpgln\" (UID: \"96338136-6831-49d0-9eb9-77d1205c6afb\") " pod="openshift-machine-config-operator/machine-config-daemon-xpgln" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.025900 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b42a5472-7487-4146-87a1-b83999821399-ovnkube-script-lib\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.025919 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-systemd-units\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.025936 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-slash\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.025957 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b42a5472-7487-4146-87a1-b83999821399-ovn-node-metrics-cert\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.025975 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-node-log\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.025994 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-multus-cni-dir\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026012 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-os-release\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026031 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-host-run-netns\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026055 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-cni-bin\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026076 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b42a5472-7487-4146-87a1-b83999821399-ovnkube-config\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026097 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026119 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-systemd-units\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026131 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2515d84f-5782-48a0-9d7e-9704baebe26c-hosts-file\") pod \"node-resolver-gt7x4\" (UID: \"2515d84f-5782-48a0-9d7e-9704baebe26c\") " pod="openshift-dns/node-resolver-gt7x4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026155 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-run-systemd\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026092 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-host-var-lib-kubelet\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026177 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-run-ovn\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026190 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-cni-bin\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026198 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-multus-socket-dir-parent\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026218 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-hostroot\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026242 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgs2t\" (UniqueName: \"kubernetes.io/projected/96338136-6831-49d0-9eb9-77d1205c6afb-kube-api-access-xgs2t\") pod \"machine-config-daemon-xpgln\" (UID: \"96338136-6831-49d0-9eb9-77d1205c6afb\") " pod="openshift-machine-config-operator/machine-config-daemon-xpgln" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026259 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lxkb4\" (UniqueName: \"kubernetes.io/projected/2515d84f-5782-48a0-9d7e-9704baebe26c-kube-api-access-lxkb4\") pod \"node-resolver-gt7x4\" (UID: \"2515d84f-5782-48a0-9d7e-9704baebe26c\") " pod="openshift-dns/node-resolver-gt7x4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026262 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/eb92473b-8e13-46cd-9c26-9ef67d1d6e5f-os-release\") pod \"multus-additional-cni-plugins-92qpt\" (UID: \"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\") " pod="openshift-multus/multus-additional-cni-plugins-92qpt" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026280 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-host-run-k8s-cni-cncf-io\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026341 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-host-var-lib-cni-multus\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026349 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-node-log\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026358 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/737ae453-c22e-41ea-a10e-7e8f1f165467-multus-daemon-config\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026399 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-system-cni-dir\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026419 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/eb92473b-8e13-46cd-9c26-9ef67d1d6e5f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-92qpt\" (UID: \"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\") " pod="openshift-multus/multus-additional-cni-plugins-92qpt" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026423 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-multus-cni-dir\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026446 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-log-socket\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026468 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-multus-conf-dir\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026491 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-etc-kubernetes\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026505 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-multus-socket-dir-parent\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026518 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026542 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/737ae453-c22e-41ea-a10e-7e8f1f165467-cni-binary-copy\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026559 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/96338136-6831-49d0-9eb9-77d1205c6afb-rootfs\") pod \"machine-config-daemon-xpgln\" (UID: \"96338136-6831-49d0-9eb9-77d1205c6afb\") " pod="openshift-machine-config-operator/machine-config-daemon-xpgln" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026580 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eb92473b-8e13-46cd-9c26-9ef67d1d6e5f-system-cni-dir\") pod \"multus-additional-cni-plugins-92qpt\" (UID: \"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\") " pod="openshift-multus/multus-additional-cni-plugins-92qpt" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026603 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/eb92473b-8e13-46cd-9c26-9ef67d1d6e5f-cnibin\") pod \"multus-additional-cni-plugins-92qpt\" (UID: \"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\") " pod="openshift-multus/multus-additional-cni-plugins-92qpt" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026628 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpjxx\" (UniqueName: 
\"kubernetes.io/projected/eb92473b-8e13-46cd-9c26-9ef67d1d6e5f-kube-api-access-jpjxx\") pod \"multus-additional-cni-plugins-92qpt\" (UID: \"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\") " pod="openshift-multus/multus-additional-cni-plugins-92qpt" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026649 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026652 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-run-openvswitch\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026673 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-run-openvswitch\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026684 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-run-ovn-kubernetes\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026702 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-cnibin\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026712 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-system-cni-dir\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026717 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-host-var-lib-cni-bin\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026733 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/eb92473b-8e13-46cd-9c26-9ef67d1d6e5f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-92qpt\" (UID: \"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\") " pod="openshift-multus/multus-additional-cni-plugins-92qpt" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026751 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-cni-netd\") pod 
\"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026797 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-host-run-multus-certs\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026814 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q79tg\" (UniqueName: \"kubernetes.io/projected/737ae453-c22e-41ea-a10e-7e8f1f165467-kube-api-access-q79tg\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026829 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/96338136-6831-49d0-9eb9-77d1205c6afb-mcd-auth-proxy-config\") pod \"machine-config-daemon-xpgln\" (UID: \"96338136-6831-49d0-9eb9-77d1205c6afb\") " pod="openshift-machine-config-operator/machine-config-daemon-xpgln" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026844 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-run-netns\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026858 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-var-lib-openvswitch\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026863 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/737ae453-c22e-41ea-a10e-7e8f1f165467-multus-daemon-config\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026871 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-etc-openvswitch\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026888 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-etc-openvswitch\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026092 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-slash\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026932 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-hostroot\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026957 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-cnibin\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.027086 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-host-var-lib-cni-bin\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.027089 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-host-run-k8s-cni-cncf-io\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.027280 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/eb92473b-8e13-46cd-9c26-9ef67d1d6e5f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-92qpt\" (UID: \"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\") " pod="openshift-multus/multus-additional-cni-plugins-92qpt" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.027317 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-log-socket\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.027315 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-os-release\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.027349 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-multus-conf-dir\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.027350 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-host-run-netns\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.027381 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-etc-kubernetes\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.027405 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.027510 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-run-systemd\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.026629 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2515d84f-5782-48a0-9d7e-9704baebe26c-hosts-file\") pod \"node-resolver-gt7x4\" (UID: \"2515d84f-5782-48a0-9d7e-9704baebe26c\") " pod="openshift-dns/node-resolver-gt7x4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.027675 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/eb92473b-8e13-46cd-9c26-9ef67d1d6e5f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-92qpt\" (UID: \"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\") " pod="openshift-multus/multus-additional-cni-plugins-92qpt" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.027723 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-host-run-multus-certs\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.027741 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eb92473b-8e13-46cd-9c26-9ef67d1d6e5f-system-cni-dir\") pod \"multus-additional-cni-plugins-92qpt\" (UID: \"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\") " pod="openshift-multus/multus-additional-cni-plugins-92qpt" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.027743 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/96338136-6831-49d0-9eb9-77d1205c6afb-rootfs\") pod \"machine-config-daemon-xpgln\" (UID: \"96338136-6831-49d0-9eb9-77d1205c6afb\") " pod="openshift-machine-config-operator/machine-config-daemon-xpgln" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.027754 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-run-netns\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.027787 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-cni-netd\") pod 
\"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.027786 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-run-ovn\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.027804 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/eb92473b-8e13-46cd-9c26-9ef67d1d6e5f-cnibin\") pod \"multus-additional-cni-plugins-92qpt\" (UID: \"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\") " pod="openshift-multus/multus-additional-cni-plugins-92qpt" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.027885 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/737ae453-c22e-41ea-a10e-7e8f1f165467-cni-binary-copy\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.027924 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-var-lib-openvswitch\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.028005 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/eb92473b-8e13-46cd-9c26-9ef67d1d6e5f-cni-binary-copy\") pod \"multus-additional-cni-plugins-92qpt\" (UID: \"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\") " pod="openshift-multus/multus-additional-cni-plugins-92qpt" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.028054 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-kubelet\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.028054 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.028071 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/737ae453-c22e-41ea-a10e-7e8f1f165467-host-var-lib-cni-multus\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.028093 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-run-ovn-kubernetes\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.028220 4763 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.028236 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.028256 4763 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.028268 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.028280 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.028290 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.028300 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.028312 4763 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.028324 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.028334 4763 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.028345 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.028356 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.028367 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:56 crc 
kubenswrapper[4763]: I1205 11:48:56.028379 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.028392 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.028405 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.028416 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.028520 4763 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.028533 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.028544 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.028555 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.028566 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.028577 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.028592 4763 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.028603 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.028615 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.028627 
4763 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.028640 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.028841 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.029981 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b42a5472-7487-4146-87a1-b83999821399-env-overrides\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.030654 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/96338136-6831-49d0-9eb9-77d1205c6afb-mcd-auth-proxy-config\") pod \"machine-config-daemon-xpgln\" (UID: \"96338136-6831-49d0-9eb9-77d1205c6afb\") " pod="openshift-machine-config-operator/machine-config-daemon-xpgln" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.030739 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b42a5472-7487-4146-87a1-b83999821399-ovnkube-config\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.031195 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/96338136-6831-49d0-9eb9-77d1205c6afb-proxy-tls\") pod \"machine-config-daemon-xpgln\" (UID: \"96338136-6831-49d0-9eb9-77d1205c6afb\") " pod="openshift-machine-config-operator/machine-config-daemon-xpgln" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.031889 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b42a5472-7487-4146-87a1-b83999821399-ovnkube-script-lib\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.035671 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b42a5472-7487-4146-87a1-b83999821399-ovn-node-metrics-cert\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.049027 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.049393 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgs2t\" (UniqueName: \"kubernetes.io/projected/96338136-6831-49d0-9eb9-77d1205c6afb-kube-api-access-xgs2t\") pod \"machine-config-daemon-xpgln\" (UID: \"96338136-6831-49d0-9eb9-77d1205c6afb\") " pod="openshift-machine-config-operator/machine-config-daemon-xpgln" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.049461 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.049972 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nq2w\" (UniqueName: \"kubernetes.io/projected/b42a5472-7487-4146-87a1-b83999821399-kube-api-access-6nq2w\") pod \"ovnkube-node-xbr2p\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.051985 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxkb4\" (UniqueName: \"kubernetes.io/projected/2515d84f-5782-48a0-9d7e-9704baebe26c-kube-api-access-lxkb4\") pod \"node-resolver-gt7x4\" (UID: \"2515d84f-5782-48a0-9d7e-9704baebe26c\") " pod="openshift-dns/node-resolver-gt7x4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.052394 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q79tg\" (UniqueName: \"kubernetes.io/projected/737ae453-c22e-41ea-a10e-7e8f1f165467-kube-api-access-q79tg\") pod \"multus-kwkp4\" (UID: \"737ae453-c22e-41ea-a10e-7e8f1f165467\") " pod="openshift-multus/multus-kwkp4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.058590 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.058922 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96338136-6831-49d0-9eb9-77d1205c6afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpgln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.060860 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jpjxx\" (UniqueName: \"kubernetes.io/projected/eb92473b-8e13-46cd-9c26-9ef67d1d6e5f-kube-api-access-jpjxx\") pod \"multus-additional-cni-plugins-92qpt\" (UID: \"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\") " pod="openshift-multus/multus-additional-cni-plugins-92qpt" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.064435 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.065645 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gt7x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2515d84f-5782-48a0-9d7e-9704baebe26c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxkb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gt7x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.070754 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-gt7x4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.075872 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready
\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"ima
geID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-92qpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.078388 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-kwkp4" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.084914 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.094019 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-92qpt" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.095144 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.101998 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.109396 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.117012 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 11:48:56 crc kubenswrapper[4763]: W1205 11:48:56.117336 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod737ae453_c22e_41ea_a10e_7e8f1f165467.slice/crio-0c59bed3d3b3135663c0697cbaaaf45ee8b3fb04d9dedbd15d72084b5e616a11 WatchSource:0}: Error finding container 0c59bed3d3b3135663c0697cbaaaf45ee8b3fb04d9dedbd15d72084b5e616a11: Status 404 returned error can't find the container with id 0c59bed3d3b3135663c0697cbaaaf45ee8b3fb04d9dedbd15d72084b5e616a11 Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.126944 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.138121 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.150960 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 11:48:56 crc kubenswrapper[4763]: W1205 11:48:56.162357 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb92473b_8e13_46cd_9c26_9ef67d1d6e5f.slice/crio-e62850894dbd9194b41f959e031d7196b09bcb44499723f230bbfd1016de3820 WatchSource:0}: Error finding container e62850894dbd9194b41f959e031d7196b09bcb44499723f230bbfd1016de3820: Status 404 returned error can't find the container with id e62850894dbd9194b41f959e031d7196b09bcb44499723f230bbfd1016de3820 Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.178254 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b42a5472-7487-4146-87a1-b83999821399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbr2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.236505 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T11:48:54Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 11:48:54.822889 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 11:48:54.823032 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 11:48:54.823055 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 11:48:54.823033 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 11:48:54.823371 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764935318\\\\\\\\\\\\\\\" (2025-12-05 11:48:38 +0000 UTC to 2026-01-04 11:48:39 +0000 UTC (now=2025-12-05 11:48:54.823293564 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823498 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\"\\\\nI1205 11:48:54.823746 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764935329\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764935329\\\\\\\\\\\\\\\" (2025-12-05 10:48:49 +0000 UTC to 2026-12-05 10:48:49 +0000 UTC (now=2025-12-05 11:48:54.823700006 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823836 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 11:48:54.823876 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 11:48:54.823905 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1205 11:48:54.826863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.254389 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kwkp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"737ae453-c22e-41ea-a10e-7e8f1f165467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q79tg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kwkp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.432606 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.432742 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:48:56 crc kubenswrapper[4763]: E1205 11:48:56.432826 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:48:57.432792445 +0000 UTC m=+21.925507168 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:48:56 crc kubenswrapper[4763]: E1205 11:48:56.432887 4763 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.432923 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:48:56 crc kubenswrapper[4763]: E1205 11:48:56.432940 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 11:48:57.432929245 +0000 UTC m=+21.925643968 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 11:48:56 crc kubenswrapper[4763]: E1205 11:48:56.433050 4763 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 11:48:56 crc kubenswrapper[4763]: E1205 11:48:56.433095 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 11:48:57.433084636 +0000 UTC m=+21.925799429 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.534123 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.534184 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:48:56 crc kubenswrapper[4763]: E1205 11:48:56.534327 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 11:48:56 crc kubenswrapper[4763]: E1205 11:48:56.534372 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 11:48:56 crc kubenswrapper[4763]: E1205 11:48:56.534387 4763 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 11:48:56 crc kubenswrapper[4763]: E1205 11:48:56.534327 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 11:48:56 crc kubenswrapper[4763]: E1205 11:48:56.534447 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 11:48:57.53442825 +0000 UTC m=+22.027143003 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 11:48:56 crc kubenswrapper[4763]: E1205 11:48:56.534458 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 11:48:56 crc kubenswrapper[4763]: E1205 11:48:56.534469 4763 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 11:48:56 crc kubenswrapper[4763]: E1205 11:48:56.534497 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 11:48:57.534487581 +0000 UTC m=+22.027202304 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.934624 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gt7x4" event={"ID":"2515d84f-5782-48a0-9d7e-9704baebe26c","Type":"ContainerStarted","Data":"d60a4a422ae6f3c2c05e5dc721674edbb0efda81d198ede925d3dd1cf99e09b5"} Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.935033 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gt7x4" event={"ID":"2515d84f-5782-48a0-9d7e-9704baebe26c","Type":"ContainerStarted","Data":"ec0c0208740a04e02400168922980329b8d0c423ce3af4f23692ca471d419a78"} Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.936032 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"bafa237bfeb83d5895f8ad73cf881e1fd946ec837402ea271d4aa1b75469da0f"} Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.936076 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"555c5528c8ac878492e81aeffe17b9188c7fb6df2359e76cd4608669d00fff37"} Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.938011 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"cd4640d76c7ecaa13523f23448f53bc51afe8279734a6bc254b403cf436fe0d9"} Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.938036 4763 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d7619d39a9f2a46d92fc3ec5ab36f60ec2af3a789b10decb4cbe1cdaef58d28c"} Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.938049 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"5e57496380a3cc5b7e23f8ed11c6c8d5b5be130c337005233a8b157df4088cc3"} Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.939468 4763 generic.go:334] "Generic (PLEG): container finished" podID="b42a5472-7487-4146-87a1-b83999821399" containerID="660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134" exitCode=0 Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.939530 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" event={"ID":"b42a5472-7487-4146-87a1-b83999821399","Type":"ContainerDied","Data":"660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134"} Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.939604 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" event={"ID":"b42a5472-7487-4146-87a1-b83999821399","Type":"ContainerStarted","Data":"6bb60da1b1fd6adf6e9fec7d889a865b4e5ac81db0f9ddf11613c9813d4807a3"} Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.940856 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kwkp4" event={"ID":"737ae453-c22e-41ea-a10e-7e8f1f165467","Type":"ContainerStarted","Data":"a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe"} Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.940880 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kwkp4" event={"ID":"737ae453-c22e-41ea-a10e-7e8f1f165467","Type":"ContainerStarted","Data":"0c59bed3d3b3135663c0697cbaaaf45ee8b3fb04d9dedbd15d72084b5e616a11"} Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.942320 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerStarted","Data":"70b662bafa4fee3b2c7cd96bc49d6054b0fbe349e86ce1d4faf967edbaf8f93f"} Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.942348 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerStarted","Data":"eb8a3a71da68a51eabe1aaa24fffd1b84c35cb01945642622f4100125ee26223"} Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.942360 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerStarted","Data":"a2c322c53f56fee1c06116adb06bd93b97ba6016b639a29181ca8883b44e6ee9"} Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.944646 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"218c4dcd9c6a6f8a44d3ccbe1c062cf408b558b30819cd7f459a2a17fc8cc7b6"} Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.946470 4763 generic.go:334] "Generic (PLEG): container finished" 
podID="eb92473b-8e13-46cd-9c26-9ef67d1d6e5f" containerID="17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb" exitCode=0 Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.946517 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" event={"ID":"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f","Type":"ContainerDied","Data":"17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb"} Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.946562 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" event={"ID":"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f","Type":"ContainerStarted","Data":"e62850894dbd9194b41f959e031d7196b09bcb44499723f230bbfd1016de3820"} Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.956412 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:56Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.969025 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:56Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.984587 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:56Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:56 crc kubenswrapper[4763]: I1205 11:48:56.998136 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:56Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.010894 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.030346 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b42a5472-7487-4146-87a1-b83999821399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbr2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.048083 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T11:48:54Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 11:48:54.822889 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 11:48:54.823032 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 11:48:54.823055 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 11:48:54.823033 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 11:48:54.823371 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764935318\\\\\\\\\\\\\\\" (2025-12-05 11:48:38 +0000 UTC to 2026-01-04 11:48:39 +0000 UTC (now=2025-12-05 11:48:54.823293564 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823498 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\"\\\\nI1205 11:48:54.823746 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764935329\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764935329\\\\\\\\\\\\\\\" (2025-12-05 10:48:49 +0000 UTC to 2026-12-05 10:48:49 +0000 UTC (now=2025-12-05 11:48:54.823700006 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823836 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 11:48:54.823876 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 11:48:54.823905 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1205 11:48:54.826863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.064733 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kwkp4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"737ae453-c22e-41ea-a10e-7e8f1f165467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q79tg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kwkp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.077999 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.091018 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96338136-6831-49d0-9eb9-77d1205c6afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpgln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.103085 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gt7x4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2515d84f-5782-48a0-9d7e-9704baebe26c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60a4a422ae6f3c2c05e5dc721674edbb0efda81d198ede925d3dd1cf99e09b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxkb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gt7x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.121243 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-92qpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.134944 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96338136-6831-49d0-9eb9-77d1205c6afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b662bafa4fee3b2c7cd96bc49d6054b0fbe349e86ce1d4faf967edbaf8f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb8a3a71da68a51eabe1aaa24fffd1b84c35cb01945642622f4100125ee26223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpgln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.148788 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gt7x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2515d84f-5782-48a0-9d7e-9704baebe26c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60a4a422ae6f3c2c05e5dc721674edbb0efda81d198ede925d3dd1cf99e09b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxkb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gt7x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.170186 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c85
7df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-92qpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.180897 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd4640d76c7ecaa13523f23448f53bc51afe8279734a6bc254b403cf436fe0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7619d39a9f2a46d92fc3ec5ab36f60ec2af3a789b10decb4cbe1cdaef58d28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.194175 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.212848 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.227203 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafa237bfeb83d5895f8ad73cf881e1fd946ec837402ea271d4aa1b75469da0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.235957 4763 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 05 11:48:57 
crc kubenswrapper[4763]: I1205 11:48:57.236253 4763 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.238299 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.238339 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.238348 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.238361 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.238372 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:48:57Z","lastTransitionTime":"2025-12-05T11:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.238714 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: E1205 11:48:57.254312 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"424a6449-0727-4c68-964f-19010b8ff35b\\\",\\\"systemUUID\\\":\\\"0f54ec5d-9350-4cf9-9a1e-213de5460351\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.257913 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.257973 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.258011 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.258042 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.258066 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:48:57Z","lastTransitionTime":"2025-12-05T11:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.260165 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: E1205 11:48:57.281753 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"424a6449-0727-4c68-964f-19010b8ff35b\\\",\\\"systemUUID\\\":\\\"0f54ec5d-9350-4cf9-9a1e-213de5460351\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.285522 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.285555 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.285564 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.285578 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.285586 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:48:57Z","lastTransitionTime":"2025-12-05T11:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.295420 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b42a5472-7487-4146-87a1-b83999821399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbr2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z 
is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: E1205 11:48:57.313658 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"424a6449-0727-4c68-964f-19010b8ff35b\\\",\\\"systemUUID\\\":\\\"0f54ec5d-9350-4cf9-9a1e-213de5460351\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.320563 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-gxbp8"] Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.321381 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-gxbp8" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.328944 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.329233 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.329298 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.329363 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.329322 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.329541 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.329617 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.329693 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.329789 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:48:57Z","lastTransitionTime":"2025-12-05T11:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.349161 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T11:48:54Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 11:48:54.822889 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 11:48:54.823032 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 11:48:54.823055 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 11:48:54.823033 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 11:48:54.823371 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764935318\\\\\\\\\\\\\\\" (2025-12-05 11:48:38 +0000 UTC to 2026-01-04 11:48:39 +0000 UTC (now=2025-12-05 11:48:54.823293564 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823498 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\"\\\\nI1205 11:48:54.823746 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764935329\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764935329\\\\\\\\\\\\\\\" (2025-12-05 10:48:49 +0000 UTC to 2026-12-05 10:48:49 +0000 UTC (now=2025-12-05 11:48:54.823700006 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823836 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 11:48:54.823876 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 11:48:54.823905 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1205 11:48:54.826863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: E1205 11:48:57.359399 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"424a6449-0727-4c68-964f-19010b8ff35b\\\",\\\"systemUUID\\\":\\\"0f54ec5d-9350-4cf9-9a1e-213de5460351\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.366959 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.366990 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.367000 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.367014 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.367024 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:48:57Z","lastTransitionTime":"2025-12-05T11:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.374146 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kwkp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"737ae453-c22e-41ea-a10e-7e8f1f165467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q79tg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kwkp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.385316 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 05 11:48:57 crc kubenswrapper[4763]: E1205 11:48:57.386182 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"424a6449-0727-4c68-964f-19010b8ff35b\\\",\\\"systemUUID\\\":\\\"0f54ec5d-9350-4cf9-9a1e-213de5460351\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: E1205 11:48:57.386363 4763 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.388609 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.388645 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.388657 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.388674 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.388686 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:48:57Z","lastTransitionTime":"2025-12-05T11:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.394393 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.411198 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.417429 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.419640 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b42a5472-7487-4146-87a1-b83999821399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbr2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z 
is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.442955 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.443039 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ae86f1f4-06e2-47ef-80e3-f692e44cce3f-serviceca\") pod \"node-ca-gxbp8\" (UID: \"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\") " pod="openshift-image-registry/node-ca-gxbp8" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.443065 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:48:57 crc kubenswrapper[4763]: E1205 11:48:57.443114 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:48:59.44309865 +0000 UTC m=+23.935813373 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:48:57 crc kubenswrapper[4763]: E1205 11:48:57.443145 4763 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 11:48:57 crc kubenswrapper[4763]: E1205 11:48:57.443223 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 11:48:59.44320148 +0000 UTC m=+23.935916203 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.443254 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ae86f1f4-06e2-47ef-80e3-f692e44cce3f-host\") pod \"node-ca-gxbp8\" (UID: \"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\") " pod="openshift-image-registry/node-ca-gxbp8" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.443318 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.443346 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s79r\" (UniqueName: \"kubernetes.io/projected/ae86f1f4-06e2-47ef-80e3-f692e44cce3f-kube-api-access-9s79r\") pod \"node-ca-gxbp8\" (UID: \"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\") " pod="openshift-image-registry/node-ca-gxbp8" Dec 05 11:48:57 crc kubenswrapper[4763]: E1205 11:48:57.443444 4763 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 11:48:57 crc kubenswrapper[4763]: E1205 11:48:57.443472 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 11:48:59.443466282 +0000 UTC m=+23.936181005 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.457434 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f
36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T11:48:54Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 11:48:54.822889 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 11:48:54.823032 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 11:48:54.823055 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 11:48:54.823033 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 11:48:54.823371 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764935318\\\\\\\\\\\\\\\" (2025-12-05 11:48:38 +0000 UTC to 2026-01-04 11:48:39 +0000 UTC (now=2025-12-05 11:48:54.823293564 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823498 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\"\\\\nI1205 11:48:54.823746 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764935329\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764935329\\\\\\\\\\\\\\\" (2025-12-05 10:48:49 +0000 UTC to 2026-12-05 10:48:49 +0000 UTC (now=2025-12-05 11:48:54.823700006 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823836 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 11:48:54.823876 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 11:48:54.823905 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nF1205 11:48:54.826863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.478722 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafa237bfeb83d5895f8ad73cf881e1fd946ec837402ea271d4aa1b75469da0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.490569 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.490611 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.490621 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.490670 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.490681 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:48:57Z","lastTransitionTime":"2025-12-05T11:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.499437 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.517031 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kwkp4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"737ae453-c22e-41ea-a10e-7e8f1f165467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q79tg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kwkp4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.531368 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-92qpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.544307 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.544359 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ae86f1f4-06e2-47ef-80e3-f692e44cce3f-serviceca\") pod \"node-ca-gxbp8\" (UID: \"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\") " pod="openshift-image-registry/node-ca-gxbp8" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.544409 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ae86f1f4-06e2-47ef-80e3-f692e44cce3f-host\") pod \"node-ca-gxbp8\" (UID: \"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\") " pod="openshift-image-registry/node-ca-gxbp8" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.544439 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.544478 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s79r\" (UniqueName: 
\"kubernetes.io/projected/ae86f1f4-06e2-47ef-80e3-f692e44cce3f-kube-api-access-9s79r\") pod \"node-ca-gxbp8\" (UID: \"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\") " pod="openshift-image-registry/node-ca-gxbp8" Dec 05 11:48:57 crc kubenswrapper[4763]: E1205 11:48:57.544561 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 11:48:57 crc kubenswrapper[4763]: E1205 11:48:57.544598 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 11:48:57 crc kubenswrapper[4763]: E1205 11:48:57.544613 4763 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 11:48:57 crc kubenswrapper[4763]: E1205 11:48:57.544684 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 11:48:59.544655605 +0000 UTC m=+24.037370388 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.545357 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ae86f1f4-06e2-47ef-80e3-f692e44cce3f-host\") pod \"node-ca-gxbp8\" (UID: \"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\") " pod="openshift-image-registry/node-ca-gxbp8" Dec 05 11:48:57 crc kubenswrapper[4763]: E1205 11:48:57.545450 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 11:48:57 crc kubenswrapper[4763]: E1205 11:48:57.545471 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 11:48:57 crc kubenswrapper[4763]: E1205 11:48:57.545481 4763 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 11:48:57 crc kubenswrapper[4763]: E1205 11:48:57.545512 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 11:48:59.54550131 +0000 UTC m=+24.038216103 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.545881 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ae86f1f4-06e2-47ef-80e3-f692e44cce3f-serviceca\") pod \"node-ca-gxbp8\" (UID: \"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\") " pod="openshift-image-registry/node-ca-gxbp8" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.546081 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd4640d76c7ecaa13523f23448f53bc51afe8279734a6bc254b403cf436fe0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7619d39a9f2a46d92fc3ec5ab36f60ec2af3a789b10decb4cbe1cdaef58d28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.570085 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s79r\" (UniqueName: \"kubernetes.io/projected/ae86f1f4-06e2-47ef-80e3-f692e44cce3f-kube-api-access-9s79r\") pod \"node-ca-gxbp8\" (UID: \"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\") " pod="openshift-image-registry/node-ca-gxbp8" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.593275 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.593308 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.593324 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.593337 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.593346 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:48:57Z","lastTransitionTime":"2025-12-05T11:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.603417 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96338136-6831-49d0-9eb9-77d1205c6afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b662bafa4fee3b2c7cd96bc49d6054b0fbe349e86ce1d4faf967edbaf8f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb8a3a71da68a51eabe1aaa24fffd1b84c35cb01945642622f4100125ee26223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpgln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.640441 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-gxbp8" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.641240 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gt7x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2515d84f-5782-48a0-9d7e-9704baebe26c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60a4a422ae6f3c2c05e5dc721674edbb0efda81d198ede925d3dd1cf99e09b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxkb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gt7x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.689026 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.697098 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.697152 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.697161 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.697174 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.697186 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:48:57Z","lastTransitionTime":"2025-12-05T11:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.724751 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.760623 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gxbp8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s79r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gxbp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.783822 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.783902 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.783954 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:48:57 crc kubenswrapper[4763]: E1205 11:48:57.783963 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:48:57 crc kubenswrapper[4763]: E1205 11:48:57.784066 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:48:57 crc kubenswrapper[4763]: E1205 11:48:57.784132 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.789801 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.790626 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.791789 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.792408 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.793829 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.794619 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.795618 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.797055 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.798161 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.799232 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.799268 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.799277 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.799293 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:48:57 crc 
kubenswrapper[4763]: I1205 11:48:57.799314 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.799305 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:48:57Z","lastTransitionTime":"2025-12-05T11:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.799918 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.800959 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.801501 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.802098 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.803104 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.803622 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.804565 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.805008 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.805288 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kwkp4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"737ae453-c22e-41ea-a10e-7e8f1f165467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q79tg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kwkp4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.805900 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.806874 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.807318 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.808428 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.808860 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.809816 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.810240 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.810851 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.811868 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.812321 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.813251 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.813714 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.815086 4763 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" 
path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.815187 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.816826 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.817751 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.818196 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.819787 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.820426 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.821627 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.822261 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.823333 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.823820 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.824728 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.825676 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.826386 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.827254 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.827840 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.828846 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.829567 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.830638 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.831442 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.832060 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.833125 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.833692 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.834593 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.851363 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78167d26-8492-4035-99de-987c99ab12e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b874a765fa9a075a2662b615419d246c610bf2eea128d4bf53c16bda49f38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd875ae389a8e48a2f958f87ef711a4053a54da61f415ef9a380f8b3de9d7012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12b24f5f9e73a1f4871d8d666bf6aabef79ec707d3df055e3e33a1a158dce464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25460658bdce8262dc2a0269161e8377dcb37f4
40c09c8eea9f7b397c9d3a5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8aab7caa6c02035af6faae1817d239b12bc2afcb11c7e14c4aea1df60372dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.883309 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd4640d76c7ecaa13523f23448f53bc51afe8279734a6bc254b403cf436fe0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7619d39a9f2a46d92fc3ec5ab36f60ec2af3a789b10decb4cbe1cdaef58d28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.902203 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.902254 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.902264 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.902283 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.902292 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:48:57Z","lastTransitionTime":"2025-12-05T11:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.923276 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96338136-6831-49d0-9eb9-77d1205c6afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b662bafa4fee3b2c7cd96bc49d6054b0fbe349e86ce1d4faf967edbaf8f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb8a3a71da68a51eabe1aaa24fffd1b84c35cb01945642622f4100125ee26223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpgln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.952263 4763 generic.go:334] "Generic (PLEG): container finished" podID="eb92473b-8e13-46cd-9c26-9ef67d1d6e5f" containerID="a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552" exitCode=0 Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.952352 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" event={"ID":"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f","Type":"ContainerDied","Data":"a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552"} Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.954841 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.956863 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" event={"ID":"b42a5472-7487-4146-87a1-b83999821399","Type":"ContainerStarted","Data":"4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05"} Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.956892 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" event={"ID":"b42a5472-7487-4146-87a1-b83999821399","Type":"ContainerStarted","Data":"cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13"} Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.956902 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" event={"ID":"b42a5472-7487-4146-87a1-b83999821399","Type":"ContainerStarted","Data":"aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01"} Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.956910 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" event={"ID":"b42a5472-7487-4146-87a1-b83999821399","Type":"ContainerStarted","Data":"eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff"} Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.956919 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" event={"ID":"b42a5472-7487-4146-87a1-b83999821399","Type":"ContainerStarted","Data":"b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6"} Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.956930 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" event={"ID":"b42a5472-7487-4146-87a1-b83999821399","Type":"ContainerStarted","Data":"28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351"} Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.958194 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gxbp8" event={"ID":"ae86f1f4-06e2-47ef-80e3-f692e44cce3f","Type":"ContainerStarted","Data":"e177b5c0a61aeedb5c6bf5f6340f212506106592ad10807ed29d83a4a6949ce5"} Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.958237 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gxbp8" event={"ID":"ae86f1f4-06e2-47ef-80e3-f692e44cce3f","Type":"ContainerStarted","Data":"dd4e65a20164ae85b83ccd5c75e813e6d5d846167707ad58b7bb6885c12f85fc"} Dec 05 11:48:57 crc 
kubenswrapper[4763]: I1205 11:48:57.959215 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.962596 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gt7x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2515d84f-5782-48a0-9d7e-9704baebe26c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60a4a422ae6f3c2c05e5dc721674edbb0efda81d198ede925d3dd1cf99e09b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxkb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gt7x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:57Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:57 crc kubenswrapper[4763]: I1205 11:48:57.981099 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.005659 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.005697 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.005709 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 
11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.005727 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.005737 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:48:58Z","lastTransitionTime":"2025-12-05T11:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.024234 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-92qpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:58Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:58 crc 
kubenswrapper[4763]: I1205 11:48:58.061186 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:58Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.104842 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:58Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.108795 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.108838 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.108849 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.108868 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.108881 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:48:58Z","lastTransitionTime":"2025-12-05T11:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.142637 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gxbp8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s79r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gxbp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:58Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.186231 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T11:48:54Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 11:48:54.822889 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 11:48:54.823032 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 11:48:54.823055 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 11:48:54.823033 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 11:48:54.823371 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764935318\\\\\\\\\\\\\\\" (2025-12-05 11:48:38 +0000 UTC to 2026-01-04 11:48:39 +0000 UTC (now=2025-12-05 11:48:54.823293564 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823498 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\"\\\\nI1205 11:48:54.823746 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764935329\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764935329\\\\\\\\\\\\\\\" (2025-12-05 10:48:49 +0000 UTC to 2026-12-05 10:48:49 +0000 UTC (now=2025-12-05 11:48:54.823700006 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823836 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 11:48:54.823876 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 11:48:54.823905 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1205 11:48:54.826863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:58Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.211516 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.211553 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.211562 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.211578 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.211588 4763 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:48:58Z","lastTransitionTime":"2025-12-05T11:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.225987 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafa237bfeb83d5895f8ad73cf881e1fd946ec837402ea271d4aa1b75469da0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:58Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.263805 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:58Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.303191 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:58Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.313739 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.313796 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.313808 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.313829 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.313850 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:48:58Z","lastTransitionTime":"2025-12-05T11:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.346742 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b42a5472-7487-4146-87a1-b83999821399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79
b2093463bf0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbr2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:58Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.383265 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:58Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.416916 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.416960 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.416971 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.416988 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.416997 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:48:58Z","lastTransitionTime":"2025-12-05T11:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.429460 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b42a5472-7487-4146-87a1-b83999821399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79
b2093463bf0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbr2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:58Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.463060 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T11:48:54Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 11:48:54.822889 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 11:48:54.823032 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 11:48:54.823055 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 11:48:54.823033 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 11:48:54.823371 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764935318\\\\\\\\\\\\\\\" (2025-12-05 11:48:38 +0000 UTC to 2026-01-04 11:48:39 +0000 UTC (now=2025-12-05 11:48:54.823293564 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823498 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\"\\\\nI1205 11:48:54.823746 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764935329\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764935329\\\\\\\\\\\\\\\" (2025-12-05 10:48:49 +0000 UTC to 2026-12-05 10:48:49 +0000 UTC (now=2025-12-05 11:48:54.823700006 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823836 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 11:48:54.823876 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 11:48:54.823905 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1205 11:48:54.826863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:58Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.502749 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafa237bfeb83d5895f8ad73cf881e1fd946ec837402ea271d4aa1b75469da0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:58Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.519352 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.519387 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.519396 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.519410 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.519420 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:48:58Z","lastTransitionTime":"2025-12-05T11:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.543950 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:58Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.583130 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b31fb483-2a86-4e31-a611-ef92a9a4840f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452ef64735a1911b994bd2bfbf1702e3ff68648cd0b834c254dd24f0b17cd9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0370cc7bfd0ce95636924667b4e82dfffd0eee7f07b78d683b386bfee5dc47b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1951ae154cb3c61f4cf3cadec35e19591efe1f768993d1e51713195175547b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a3a0d8531f8696597e2389d8cd7b50130de93ac8be572b329ebe2eff8c6b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:58Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.621316 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.621355 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.621363 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.621380 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.621390 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:48:58Z","lastTransitionTime":"2025-12-05T11:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.623434 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kwkp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"737ae453-c22e-41ea-a10e-7e8f1f165467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q79tg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kwkp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:58Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.663372 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-
05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-92qpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:58Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.720696 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78167d26-8492-4035-99de-987c99ab12e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b874a765fa9a075a2662b615419d246c610bf2eea128d4bf53c16bda49f38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd875ae389a8e48a2f958f87ef711a4053a54da61f415ef9a380f8b3de9d7012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12b24f5f9e73a1f4871d8d666bf6aabef79ec707d3df055e3e33a1a158dce464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageI
D\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25460658bdce8262dc2a0269161e8377dcb37f440c09c8eea9f7b397c9d3a5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8aab7caa6c02035af6faae1817d239b12bc2afcb11c7e14c4aea1df60372dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:58Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.723680 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.723722 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.723731 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.723748 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.723776 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:48:58Z","lastTransitionTime":"2025-12-05T11:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.747114 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd4640d76c7ecaa13523f23448f53bc51afe8279734a6bc254b403cf436fe0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7619d39a9f2a46d92fc3ec5ab36f60ec2af3a789b10decb4cbe1cdaef58d28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:58Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.787434 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96338136-6831-49d0-9eb9-77d1205c6afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b662bafa4fee3b2c7cd96bc49d6054b0fbe349e86ce1d4faf967edbaf8f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb8a3a71da68a51eabe1aaa24fffd1b84c35cb01945642622f4100125ee26223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpgln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:58Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.827049 4763 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.827107 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.827123 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.827079 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gt7x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2515d84f-5782-48a0-9d7e-9704baebe26c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60a4a422ae6f3c2c05e5dc721674edbb0efda81d198ede925d3dd1cf99e09b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxkb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gt7x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:58Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.827148 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.827271 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:48:58Z","lastTransitionTime":"2025-12-05T11:48:58Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.929883 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.929921 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.929930 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.929947 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.929958 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:48:58Z","lastTransitionTime":"2025-12-05T11:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.963818 4763 generic.go:334] "Generic (PLEG): container finished" podID="eb92473b-8e13-46cd-9c26-9ef67d1d6e5f" containerID="5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c" exitCode=0 Dec 05 11:48:58 crc kubenswrapper[4763]: I1205 11:48:58.963892 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" event={"ID":"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f","Type":"ContainerDied","Data":"5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c"} Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.032394 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.032437 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.032447 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.032466 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.032479 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:48:59Z","lastTransitionTime":"2025-12-05T11:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.135846 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.135890 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.135902 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.135919 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.135932 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:48:59Z","lastTransitionTime":"2025-12-05T11:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.239652 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.239698 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.239710 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.239726 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.239739 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:48:59Z","lastTransitionTime":"2025-12-05T11:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.342549 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.342613 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.342632 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.342658 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.342676 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:48:59Z","lastTransitionTime":"2025-12-05T11:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.445284 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.445334 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.445346 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.445364 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.445377 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:48:59Z","lastTransitionTime":"2025-12-05T11:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.462546 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 11:48:59 crc kubenswrapper[4763]: E1205 11:48:59.462732 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:49:03.462700343 +0000 UTC m=+27.955415086 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.462831 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.462883 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:48:59 crc kubenswrapper[4763]: E1205 11:48:59.462980 4763 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 11:48:59 crc kubenswrapper[4763]: E1205 11:48:59.463043 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 11:49:03.463031245 +0000 UTC m=+27.955745988 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 11:48:59 crc kubenswrapper[4763]: E1205 11:48:59.463139 4763 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 11:48:59 crc kubenswrapper[4763]: E1205 11:48:59.463204 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 11:49:03.463187966 +0000 UTC m=+27.955902699 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.547504 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.547549 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.547559 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.547576 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.547588 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:48:59Z","lastTransitionTime":"2025-12-05T11:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.563813 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.563871 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:48:59 crc kubenswrapper[4763]: E1205 11:48:59.564017 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 11:48:59 crc kubenswrapper[4763]: E1205 11:48:59.564049 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 11:48:59 crc kubenswrapper[4763]: E1205 11:48:59.564061 4763 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 11:48:59 crc kubenswrapper[4763]: E1205 11:48:59.564105 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-12-05 11:49:03.564090267 +0000 UTC m=+28.056804990 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 11:48:59 crc kubenswrapper[4763]: E1205 11:48:59.564018 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 11:48:59 crc kubenswrapper[4763]: E1205 11:48:59.564161 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 11:48:59 crc kubenswrapper[4763]: E1205 11:48:59.564179 4763 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 11:48:59 crc kubenswrapper[4763]: E1205 11:48:59.564230 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 11:49:03.564214248 +0000 UTC m=+28.056928981 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.651221 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.651288 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.651305 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.651330 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.651347 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:48:59Z","lastTransitionTime":"2025-12-05T11:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.754118 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.754147 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.754156 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.754168 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.754177 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:48:59Z","lastTransitionTime":"2025-12-05T11:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.783803 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.783803 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:48:59 crc kubenswrapper[4763]: E1205 11:48:59.783981 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.784004 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:48:59 crc kubenswrapper[4763]: E1205 11:48:59.784052 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:48:59 crc kubenswrapper[4763]: E1205 11:48:59.784094 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.856253 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.856288 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.856299 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.856315 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.856327 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:48:59Z","lastTransitionTime":"2025-12-05T11:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.928992 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:59Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.940881 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:59Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.951527 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gxbp8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e177b5c0a61aeedb5c6bf5f6340f212506106592ad10807ed29d83a4a6949ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s79r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gxbp8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:59Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.958989 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.959014 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.959021 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.959035 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.959043 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:48:59Z","lastTransitionTime":"2025-12-05T11:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.966296 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafa237bfeb83d5895f8ad73cf881e1fd946ec837402ea271d4aa1b75469da0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:59Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.978316 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:59Z is after 2025-08-24T17:21:41Z" Dec 05 11:48:59 crc kubenswrapper[4763]: I1205 11:48:59.990449 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:48:59Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.010574 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b42a5472-7487-4146-87a1-b83999821399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbr2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:00Z 
is after 2025-08-24T17:21:41Z" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.024471 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T11:48:54Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 11:48:54.822889 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 11:48:54.823032 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 11:48:54.823055 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 11:48:54.823033 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 11:48:54.823371 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764935318\\\\\\\\\\\\\\\" (2025-12-05 11:48:38 +0000 UTC to 2026-01-04 11:48:39 +0000 UTC (now=2025-12-05 11:48:54.823293564 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823498 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\"\\\\nI1205 11:48:54.823746 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764935329\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764935329\\\\\\\\\\\\\\\" (2025-12-05 10:48:49 +0000 UTC to 2026-12-05 10:48:49 +0000 UTC (now=2025-12-05 11:48:54.823700006 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823836 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 11:48:54.823876 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 11:48:54.823905 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1205 11:48:54.826863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:00Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.036707 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b31fb483-2a86-4e31-a611-ef92a9a4840f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452ef64735a1911b994bd2bfbf1702e3ff68648cd0b834c254dd24f0b17cd9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0370cc7bfd0ce95636924667b4e82dfffd0eee7f07b78d683b386bfee5dc47b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1951ae154cb3c61f4cf3cadec35e19591efe1f768993d1e51713195175547b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a3a0d8531f8696597e2389d8cd7b50130de93ac8be572b329ebe2eff8c6b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:00Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.065184 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.065425 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.065513 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.065578 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.065632 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:00Z","lastTransitionTime":"2025-12-05T11:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.069631 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kwkp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"737ae453-c22e-41ea-a10e-7e8f1f165467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q79tg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kwkp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:00Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.094544 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd4640d76c7ecaa13523f23448f53bc51afe8279734a6bc254b403cf436fe0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7619d39a9f2a46d92fc3ec5ab36f60ec2af3a789b10decb4cbe1cdaef58d28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:00Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.113288 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96338136-6831-49d0-9eb9-77d1205c6afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b662bafa4fee3b2c7cd96bc49d6054b0fbe349e86ce1d4faf967edbaf8f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb8a3a71da68a51eabe1aaa24fffd1b84c35cb01945642622f4100125ee26223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xpgln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:00Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.123761 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gt7x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2515d84f-5782-48a0-9d7e-9704baebe26c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60a4a422ae6f3c2c05e5dc721674edbb0efda81d198ede925d3dd1cf99e09b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxkb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gt7x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:00Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.139070 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-92qpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:00Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.156389 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78167d26-8492-4035-99de-987c99ab12e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b874a765fa9a075a2662b615419d246c610bf2eea128d4bf53c16bda49f38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd875ae389a8e48a2f958f87ef711a4053a54da61f415ef9a380f8b3de9d7012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12b24f5f9e73a1f4871d8d666bf6aabef79ec707d3df055e3e33a1a158dce464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25460658bdce8262dc2a0269161e8377dcb37f4
40c09c8eea9f7b397c9d3a5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8aab7caa6c02035af6faae1817d239b12bc2afcb11c7e14c4aea1df60372dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:00Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.167861 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:00Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.168567 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.168600 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.168609 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.168624 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.168633 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:00Z","lastTransitionTime":"2025-12-05T11:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.178592 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gxbp8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e177b5c0a61aeedb5c6bf5f6340f212506106592ad10807ed29d83a4a6949ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s79r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gxbp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:00Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.190806 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:00Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.271988 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.272041 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.272060 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.272080 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.272100 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:00Z","lastTransitionTime":"2025-12-05T11:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.374558 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.374621 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.374641 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.374665 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.374683 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:00Z","lastTransitionTime":"2025-12-05T11:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.478698 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.478785 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.478808 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.478838 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.478852 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:00Z","lastTransitionTime":"2025-12-05T11:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.581637 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.581700 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.581718 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.581745 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.581790 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:00Z","lastTransitionTime":"2025-12-05T11:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.684230 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.684263 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.684271 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.684286 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.684296 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:00Z","lastTransitionTime":"2025-12-05T11:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.786535 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.786593 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.786612 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.786631 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.786648 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:00Z","lastTransitionTime":"2025-12-05T11:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.889152 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.889197 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.889206 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.889220 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.889230 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:00Z","lastTransitionTime":"2025-12-05T11:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.978563 4763 generic.go:334] "Generic (PLEG): container finished" podID="eb92473b-8e13-46cd-9c26-9ef67d1d6e5f" containerID="125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622" exitCode=0 Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.978647 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" event={"ID":"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f","Type":"ContainerDied","Data":"125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622"} Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.984658 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" event={"ID":"b42a5472-7487-4146-87a1-b83999821399","Type":"ContainerStarted","Data":"4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc"} Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.986618 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"a38a2b47fe97303fc60f6ed0de5927b73993234170c4fdb02b45f0762978e292"} Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.990878 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.990919 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.990932 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.990951 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.990962 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:00Z","lastTransitionTime":"2025-12-05T11:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:00 crc kubenswrapper[4763]: I1205 11:49:00.996987 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kwkp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"737ae453-c22e-41ea-a10e-7e8f1f165467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q79tg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kwkp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:00Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.017798 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b31fb483-2a86-4e31-a611-ef92a9a4840f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452ef64735a1911b994bd2bfbf1702e3ff68648cd0b834c254dd24f0b17cd9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0370cc7bfd0ce95636924667b4e82dfffd0eee7f07b78d683b386bfee5dc47b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1951ae154cb3c61f4cf3cadec35e19591efe1f768993d1e51713195175547b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-oper
ator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a3a0d8531f8696597e2389d8cd7b50130de93ac8be572b329ebe2eff8c6b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.030412 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96338136-6831-49d0-9eb9-77d1205c6afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b662bafa4fee3b2c7cd96bc49d6054b0fbe349e86ce1d4faf967edbaf8f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb8a3a71da68a51eabe1aaa24fffd1b84c35cb01945642622f4100125ee26223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpgln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.041788 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-gt7x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2515d84f-5782-48a0-9d7e-9704baebe26c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60a4a422ae6f3c2c05e5dc721674edbb0efda81d198ede925d3dd1cf99e09b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxkb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gt7x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.061233 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"w
aiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-92qpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.079784 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78167d26-8492-4035-99de-987c99ab12e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b874a765fa9a075a2662b615419d246c610bf2eea128d4bf53c16bda49f38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd875ae389a8e48a2f958f87ef711a4053a54da61f415ef9a380f8b3de9d7012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12b24f5f9e73a1f4871d8d666bf6aabef79ec707d3df055e3e33a1a158dce464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25460658bdce8262dc2a0269161e8377dcb37f4
40c09c8eea9f7b397c9d3a5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8aab7caa6c02035af6faae1817d239b12bc2afcb11c7e14c4aea1df60372dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.092126 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd4640d76c7ecaa13523f23448f53bc51afe8279734a6bc254b403cf436fe0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7619d39a9f2a46d92fc3ec5ab36f60ec2af3a789b10decb4cbe1cdaef58d28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.095209 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.095251 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.095289 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.095328 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.095340 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:01Z","lastTransitionTime":"2025-12-05T11:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.101916 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gxbp8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e177b5c0a61aeedb5c6bf5f6340f212506106592ad10807ed29d83a4a6949ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s79r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gxbp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.113758 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.123425 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.133830 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafa237bfeb83d5895f8ad73cf881e1fd946ec837402ea271d4aa1b75469da0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.145313 4763 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.155646 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.170650 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b42a5472-7487-4146-87a1-b83999821399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbr2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:01Z 
is after 2025-08-24T17:21:41Z" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.182552 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T11:48:54Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 11:48:54.822889 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 11:48:54.823032 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 11:48:54.823055 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 11:48:54.823033 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 11:48:54.823371 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764935318\\\\\\\\\\\\\\\" (2025-12-05 11:48:38 +0000 UTC to 2026-01-04 11:48:39 +0000 UTC (now=2025-12-05 11:48:54.823293564 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823498 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\"\\\\nI1205 11:48:54.823746 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764935329\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764935329\\\\\\\\\\\\\\\" (2025-12-05 10:48:49 +0000 UTC to 2026-12-05 10:48:49 +0000 UTC (now=2025-12-05 11:48:54.823700006 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823836 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 11:48:54.823876 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 11:48:54.823905 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1205 11:48:54.826863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.198508 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b42a5472-7487-4146-87a1-b83999821399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbr2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.198669 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.198695 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.198704 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.198720 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.198731 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:01Z","lastTransitionTime":"2025-12-05T11:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.216494 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T11:48:54Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 11:48:54.822889 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 11:48:54.823032 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 11:48:54.823055 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 11:48:54.823033 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 11:48:54.823371 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764935318\\\\\\\\\\\\\\\" (2025-12-05 11:48:38 +0000 UTC to 2026-01-04 11:48:39 +0000 UTC (now=2025-12-05 11:48:54.823293564 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823498 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\"\\\\nI1205 11:48:54.823746 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764935329\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764935329\\\\\\\\\\\\\\\" (2025-12-05 10:48:49 +0000 UTC to 2026-12-05 10:48:49 +0000 UTC (now=2025-12-05 11:48:54.823700006 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823836 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 11:48:54.823876 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 11:48:54.823905 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1205 11:48:54.826863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.230832 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafa237bfeb83d5895f8ad73cf881e1fd946ec837402ea271d4aa1b75469da0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.243652 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.254912 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38a2b47fe97303fc60f6ed0de5927b73993234170c4fdb02b45f0762978e292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.268948 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b31fb483-2a86-4e31-a611-ef92a9a4840f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452ef64735a1911b994bd2bfbf1702e3ff68648cd0b834c254dd24f0b17cd9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0370cc7bfd0ce95636924667b4e82dfffd0eee7f07b78d683b386bfee5dc47b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1951ae154cb3c61f4cf3cadec35e19591efe1f768993d1e51713195175547b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a3a0d8531f8696597e2389d8cd7b50130de93ac8be572b329ebe2eff8c6b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.282442 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kwkp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"737ae453-c22e-41ea-a10e-7e8f1f165467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q79tg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kwkp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.300532 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.300560 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.300569 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.300581 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.300591 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:01Z","lastTransitionTime":"2025-12-05T11:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.309616 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78167d26-8492-4035-99de-987c99ab12e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b874a765fa9a075a2662b615419d246c610bf2eea128d4bf53c16bda49f38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd875ae389a8e48a2f958f87ef711a4053a54da61f415ef9a380f8b3de9d7012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12b24f5f9e73a1f4871d8d666bf6aabef79ec707d3df055e3e33a1a158dce464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25460658bdce8262dc2a0269161e8377dcb37f440c09c8eea9f7b397c9d3a5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8aab7caa6c02035af6faae1817d239b12bc2afcb11c7e14c4aea1df60372dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-05T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.320978 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd4640d76c7ecaa13523f23448f53bc51afe8279734a6bc254b403cf436fe0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7619d39a9f2a46d92fc3ec5ab36f60ec2af3a789b10decb4cbe1cdaef58d28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.334579 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96338136-6831-49d0-9eb9-77d1205c6afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b662bafa4fee3b2c7cd96bc49d6054b0fbe349e86ce1d4faf967edbaf8f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb8a3a71da68a51eabe1aaa24fffd1b84c35cb01945642622f4100125ee26223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpgln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.345273 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-gt7x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2515d84f-5782-48a0-9d7e-9704baebe26c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60a4a422ae6f3c2c05e5dc721674edbb0efda81d198ede925d3dd1cf99e09b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxkb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gt7x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.362249 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"w
aiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-92qpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.374084 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.389027 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.398015 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gxbp8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e177b5c0a61aeedb5c6bf5f6340f212506106592ad10807ed29d83a4a6949ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s79r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gxbp8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.402624 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.402676 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.402692 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.402711 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.402726 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:01Z","lastTransitionTime":"2025-12-05T11:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.504848 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.504877 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.504887 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.504900 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.504909 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:01Z","lastTransitionTime":"2025-12-05T11:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.607363 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.607406 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.607417 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.607436 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.607448 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:01Z","lastTransitionTime":"2025-12-05T11:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.709959 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.709990 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.710002 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.710017 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.710030 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:01Z","lastTransitionTime":"2025-12-05T11:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.783007 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.783089 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.783007 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:49:01 crc kubenswrapper[4763]: E1205 11:49:01.783188 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:49:01 crc kubenswrapper[4763]: E1205 11:49:01.783274 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:49:01 crc kubenswrapper[4763]: E1205 11:49:01.783358 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.812567 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.812623 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.812632 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.812647 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.812657 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:01Z","lastTransitionTime":"2025-12-05T11:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.914855 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.914900 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.914913 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.914927 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.914940 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:01Z","lastTransitionTime":"2025-12-05T11:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.993741 4763 generic.go:334] "Generic (PLEG): container finished" podID="eb92473b-8e13-46cd-9c26-9ef67d1d6e5f" containerID="9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9" exitCode=0 Dec 05 11:49:01 crc kubenswrapper[4763]: I1205 11:49:01.993798 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" event={"ID":"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f","Type":"ContainerDied","Data":"9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9"} Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.006857 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:02Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.017212 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.017461 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.017593 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.017688 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.017804 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:02Z","lastTransitionTime":"2025-12-05T11:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.022997 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38a2b47fe97303fc60f6ed0de5927b73993234170c4fdb02b45f0762978e292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:02Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.044899 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b42a5472-7487-4146-87a1-b83999821399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbr2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:02Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.061997 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T11:48:54Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 11:48:54.822889 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 11:48:54.823032 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 11:48:54.823055 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 11:48:54.823033 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 11:48:54.823371 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764935318\\\\\\\\\\\\\\\" (2025-12-05 11:48:38 +0000 UTC to 2026-01-04 11:48:39 +0000 UTC (now=2025-12-05 11:48:54.823293564 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823498 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\"\\\\nI1205 11:48:54.823746 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764935329\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764935329\\\\\\\\\\\\\\\" (2025-12-05 10:48:49 +0000 UTC to 2026-12-05 10:48:49 +0000 UTC (now=2025-12-05 11:48:54.823700006 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823836 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 11:48:54.823876 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 11:48:54.823905 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nF1205 11:48:54.826863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:02Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.075257 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafa237bfeb83d5895f8ad73cf881e1fd946ec837402ea271d4aa1b75469da0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:02Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.087792 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b31fb483-2a86-4e31-a611-ef92a9a4840f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452ef64735a1911b994bd2bfbf1702e3ff68648cd0b834c254dd24f0b17cd9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0370cc7bfd0ce95636924667b4e82dfffd0eee7f07b78d683b386bfee5dc47b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1951ae154cb3c61f4cf3cadec35e19591efe1f768993d1e51713195175547b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a3a0d8531f8696597e2389d8cd7b50130de93ac8be572b329ebe2eff8c6b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:02Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.103446 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kwkp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"737ae453-c22e-41ea-a10e-7e8f1f165467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q79tg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kwkp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:02Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.115575 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gt7x4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2515d84f-5782-48a0-9d7e-9704baebe26c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60a4a422ae6f3c2c05e5dc721674edbb0efda81d198ede925d3dd1cf99e09b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxkb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gt7x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:02Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.120173 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.120203 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.120213 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.120230 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.120241 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:02Z","lastTransitionTime":"2025-12-05T11:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.130083 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.i
o/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"exitCode\\\":0,\\\"
finishedAt\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-92qpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:02Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.147975 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78167d26-8492-4035-99de-987c99ab12e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b874a765fa9a075a2662b615419d246c610bf2eea128d4bf53c16bda49f38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd875ae389a8e48a2f958f87ef711a4053a54da61f415ef9a380f8b3de9d7012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12b24f5f9e73a1f4871d8d666bf6aabef79ec707d3df055e3e33a1a158dce464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25460658bdce8262dc2a0269161e8377dcb37f4
40c09c8eea9f7b397c9d3a5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8aab7caa6c02035af6faae1817d239b12bc2afcb11c7e14c4aea1df60372dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:02Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.159751 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd4640d76c7ecaa13523f23448f53bc51afe8279734a6bc254b403cf436fe0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7619d39a9f2a46d92fc3ec5ab36f60ec2af3a789b10decb4cbe1cdaef58d28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:02Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.169700 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96338136-6831-49d0-9eb9-77d1205c6afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b662bafa4fee3b2c7cd96bc49d6054b0fbe349e86ce1d4faf967edbaf8f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb8a3a71da68a51eabe1aaa24fffd1b84c35cb01945642622f4100125ee26223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpgln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:02Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.179605 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:02Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.189382 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:02Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.197652 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gxbp8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e177b5c0a61aeedb5c6bf5f6340f212506106592ad10807ed29d83a4a6949ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s79r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gxbp8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:02Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.222566 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.222602 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.222610 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.222624 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.222633 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:02Z","lastTransitionTime":"2025-12-05T11:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.325605 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.325665 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.325679 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.325697 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.325709 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:02Z","lastTransitionTime":"2025-12-05T11:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.429168 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.429216 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.429228 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.429245 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.429257 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:02Z","lastTransitionTime":"2025-12-05T11:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.531584 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.531630 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.531643 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.531661 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.531674 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:02Z","lastTransitionTime":"2025-12-05T11:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.634363 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.634970 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.634996 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.635024 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.635133 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:02Z","lastTransitionTime":"2025-12-05T11:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.739185 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.739224 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.739237 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.739262 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.739276 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:02Z","lastTransitionTime":"2025-12-05T11:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.841787 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.841827 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.841838 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.841857 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.841868 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:02Z","lastTransitionTime":"2025-12-05T11:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.944480 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.944533 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.944548 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.944568 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:02 crc kubenswrapper[4763]: I1205 11:49:02.944580 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:02Z","lastTransitionTime":"2025-12-05T11:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.000537 4763 generic.go:334] "Generic (PLEG): container finished" podID="eb92473b-8e13-46cd-9c26-9ef67d1d6e5f" containerID="4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0" exitCode=0 Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.000632 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" event={"ID":"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f","Type":"ContainerDied","Data":"4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0"} Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.009566 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" event={"ID":"b42a5472-7487-4146-87a1-b83999821399","Type":"ContainerStarted","Data":"029931f3223b6e6bdb93594f23ab9d28df4dcbf5cfa4f28e53a8636d05db0053"} Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.009964 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.009979 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.039134 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.039342 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78167d26-8492-4035-99de-987c99ab12e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b874a765fa9a075a2662b615419d246c610bf2eea128d4bf53c16bda49f38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd875ae38
9a8e48a2f958f87ef711a4053a54da61f415ef9a380f8b3de9d7012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12b24f5f9e73a1f4871d8d666bf6aabef79ec707d3df055e3e33a1a158dce464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25460658bdce8262dc2a0269161e8377dcb37f440c09c8eea9f7b397c9d3a5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8aab7caa6c02035af6faae1817d239b12bc2afcb11c7e14c4aea1df60372dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:03Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.040033 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.048857 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 
05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.049193 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.049213 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.049237 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.049254 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:03Z","lastTransitionTime":"2025-12-05T11:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.055729 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd4640d76c7ecaa13523f23448f53bc51afe8279734a6bc254b403cf436fe0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7619d39a9f2a46d92fc3ec5ab36f60ec2af3a789b10decb4cbe1cdaef58d28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mou
ntPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:03Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.070334 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96338136-6831-49d0-9eb9-77d1205c6afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b662bafa4fee3b2c7cd96bc49d6054b0fbe349e86ce1d4faf967edbaf8f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb8a3a71da68a51eabe1aaa24fffd1b84c35cb01945642622f4100125ee26223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\"
:\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpgln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:03Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.084711 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gt7x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2515d84f-5782-48a0-9d7e-9704baebe26c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60a4a422ae6f3c2c05e5dc721674edbb0efda81d198ede925d3dd1cf99e09b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxkb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gt7x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:03Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:03 crc 
kubenswrapper[4763]: I1205 11:49:03.100578 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\
\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-92qpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:03Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.111529 4763 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:03Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.122071 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:03Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.131398 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gxbp8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e177b5c0a61aeedb5c6bf5f6340f212506106592ad10807ed29d83a4a6949ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-9s79r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gxbp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:03Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.143475 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state
\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T11:48:54Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 11:48:54.822889 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 11:48:54.823032 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 11:48:54.823055 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 11:48:54.823033 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 11:48:54.823371 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764935318\\\\\\\\\\\\\\\" (2025-12-05 11:48:38 +0000 UTC to 2026-01-04 11:48:39 +0000 UTC (now=2025-12-05 11:48:54.823293564 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823498 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\"\\\\nI1205 11:48:54.823746 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764935329\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764935329\\\\\\\\\\\\\\\" (2025-12-05 10:48:49 
+0000 UTC to 2026-12-05 10:48:49 +0000 UTC (now=2025-12-05 11:48:54.823700006 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823836 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 11:48:54.823876 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 11:48:54.823905 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1205 11:48:54.826863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:03Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.151167 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.151200 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:03 crc 
kubenswrapper[4763]: I1205 11:49:03.151210 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.151222 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.151231 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:03Z","lastTransitionTime":"2025-12-05T11:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.156188 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafa237bfeb83d5895f8ad73cf881e1fd946ec837402ea271d4aa1b75469da0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:03Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.167954 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:03Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.180939 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38a2b47fe97303fc60f6ed0de5927b73993234170c4fdb02b45f0762978e292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:03Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.198690 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b42a5472-7487-4146-87a1-b83999821399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbr2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:03Z 
is after 2025-08-24T17:21:41Z" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.210996 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b31fb483-2a86-4e31-a611-ef92a9a4840f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452ef64735a1911b994bd2bfbf1702e3ff68648cd0b834c254dd24f0b17cd9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0370cc7bfd0ce95636924667b4e82dfffd0eee7f07b78d683b386bfee5dc47b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1951ae154cb3c61f4cf3cadec35e19591efe1f768993d1e51713195175547b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a3a0d8531f8696597e2389d8cd7b50130de93ac8be572b329ebe2eff8c6b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:03Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.222344 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kwkp4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"737ae453-c22e-41ea-a10e-7e8f1f165467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q79tg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kwkp4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:03Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.233052 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:03Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.244343 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:03Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.253236 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.253276 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.253285 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.253303 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.253313 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:03Z","lastTransitionTime":"2025-12-05T11:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.254540 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gxbp8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e177b5c0a61aeedb5c6bf5f6340f212506106592ad10807ed29d83a4a6949ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s79r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gxbp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:03Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.268028 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T11:48:54Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 11:48:54.822889 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 11:48:54.823032 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 11:48:54.823055 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 11:48:54.823033 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 11:48:54.823371 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764935318\\\\\\\\\\\\\\\" (2025-12-05 11:48:38 +0000 UTC to 2026-01-04 11:48:39 +0000 UTC (now=2025-12-05 11:48:54.823293564 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823498 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\"\\\\nI1205 11:48:54.823746 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764935329\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764935329\\\\\\\\\\\\\\\" (2025-12-05 10:48:49 +0000 UTC to 2026-12-05 10:48:49 +0000 UTC (now=2025-12-05 11:48:54.823700006 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823836 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 11:48:54.823876 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 11:48:54.823905 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1205 11:48:54.826863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:03Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.279269 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafa237bfeb83d5895f8ad73cf881e1fd946ec837402ea271d4aa1b75469da0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:03Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.288582 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:03Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.297499 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38a2b47fe97303fc60f6ed0de5927b73993234170c4fdb02b45f0762978e292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:03Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.317103 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b42a5472-7487-4146-87a1-b83999821399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://029931f3223b6e6bdb93594f23ab9d28df4dcbf5cfa4f28e53a8636d05db0053\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbr2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:03Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.327981 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b31fb483-2a86-4e31-a611-ef92a9a4840f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452ef64735a1911b994bd2bfbf1702e3ff68648cd0b834c254dd24f0b17cd9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0370cc7bfd0ce95636924667b4e82dfffd0eee7f07b78d683b386bfee5dc47b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1951ae154cb3c61f4cf3cadec35e19591efe1f768993d1e51713195175547b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a3a0d8531f8696597e2389d8cd7b50130de93ac8be572b329ebe2eff8c6b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:03Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.338006 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kwkp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"737ae453-c22e-41ea-a10e-7e8f1f165467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q79tg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kwkp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:03Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.354514 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78167d26-8492-4035-99de-987c99ab12e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b874a765fa9a075a2662b615419d246c610bf2eea128d4bf53c16bda49f38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd875ae389a8e48a2f958f87ef711a4053a54da61f415ef9a380f8b3de9d7012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12b24f5f9e73a1f4871d8d666bf6aabef79ec707d3df055e3e33a1a158dce464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25460658bdce8262dc2a0269161e8377dcb37f4
40c09c8eea9f7b397c9d3a5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8aab7caa6c02035af6faae1817d239b12bc2afcb11c7e14c4aea1df60372dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:03Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.355890 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.355935 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.355944 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.355959 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.355970 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:03Z","lastTransitionTime":"2025-12-05T11:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.368067 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd4640d76c7ecaa13523f23448f53bc51afe8279734a6bc254b403cf436fe0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7619d39a9f2a46d92fc3ec5ab36f60ec2af3a789b10decb4cbe1cdaef58d28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:03Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.378012 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96338136-6831-49d0-9eb9-77d1205c6afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b662bafa4fee3b2c7cd96bc49d6054b0fbe349e86ce1d4faf967edbaf8f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb8a3a71da68a51eabe1aaa24fffd1b84c35cb01945642622f4100125ee26223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpgln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:03Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.388945 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-gt7x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2515d84f-5782-48a0-9d7e-9704baebe26c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60a4a422ae6f3c2c05e5dc721674edbb0efda81d198ede925d3dd1cf99e09b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxkb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gt7x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:03Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.401657 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9
5f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-92qpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:03Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.458395 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.458451 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.458471 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.458496 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.458516 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:03Z","lastTransitionTime":"2025-12-05T11:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.504681 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.504847 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.504900 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:49:03 crc kubenswrapper[4763]: E1205 11:49:03.504940 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:49:11.504917737 +0000 UTC m=+35.997632470 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:49:03 crc kubenswrapper[4763]: E1205 11:49:03.504970 4763 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 11:49:03 crc kubenswrapper[4763]: E1205 11:49:03.504972 4763 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 11:49:03 crc kubenswrapper[4763]: E1205 11:49:03.505010 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 11:49:11.505001058 +0000 UTC m=+35.997715781 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 11:49:03 crc kubenswrapper[4763]: E1205 11:49:03.505040 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 11:49:11.505021198 +0000 UTC m=+35.997735941 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.561405 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.561462 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.561478 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.561496 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.561505 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:03Z","lastTransitionTime":"2025-12-05T11:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.606262 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.606382 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:49:03 crc kubenswrapper[4763]: E1205 11:49:03.606621 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 11:49:03 crc kubenswrapper[4763]: E1205 11:49:03.606659 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 11:49:03 crc kubenswrapper[4763]: E1205 11:49:03.606686 4763 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 11:49:03 crc kubenswrapper[4763]: E1205 11:49:03.607016 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 11:49:11.606986666 +0000 UTC m=+36.099701429 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 11:49:03 crc kubenswrapper[4763]: E1205 11:49:03.607494 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 11:49:03 crc kubenswrapper[4763]: E1205 11:49:03.607667 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 11:49:03 crc kubenswrapper[4763]: E1205 11:49:03.607715 4763 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 11:49:03 crc kubenswrapper[4763]: E1205 11:49:03.607831 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 11:49:11.60775785 +0000 UTC m=+36.100472573 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.664483 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.664544 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.664563 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.664584 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.664600 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:03Z","lastTransitionTime":"2025-12-05T11:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.767070 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.767114 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.767126 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.767144 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.767162 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:03Z","lastTransitionTime":"2025-12-05T11:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.783592 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.783729 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:49:03 crc kubenswrapper[4763]: E1205 11:49:03.783861 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.783910 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:49:03 crc kubenswrapper[4763]: E1205 11:49:03.784047 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:49:03 crc kubenswrapper[4763]: E1205 11:49:03.784128 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.869941 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.869977 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.869985 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.870000 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.870009 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:03Z","lastTransitionTime":"2025-12-05T11:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.972494 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.972531 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.972540 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.972555 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:03 crc kubenswrapper[4763]: I1205 11:49:03.972565 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:03Z","lastTransitionTime":"2025-12-05T11:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.017837 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" event={"ID":"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f","Type":"ContainerStarted","Data":"c630804aeb54582259498705c520ab6d2c191844861069ba7090226028a5716f"} Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.017875 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.038942 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78167d26-8492-4035-99de-987c99ab12e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b874a765fa9a075a2662b615419d246c610bf2eea128d4bf53c16bda49f38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd875ae389a8e48a2f958f87ef711a4053a54da61f415ef9a380f8b3de9d7012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12b24f5f9e73a1f4871d8d666bf6aabef79ec707d3df055e3e33a1a158dce464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25460658bdce8262dc2a0269161e8377dcb37f440c09c8eea9f7b397c9d3a5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8aab7caa6c02035af6faae1817d239b12bc2afcb11c7e14c4aea1df60372dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:04Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.055311 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd4640d76c7ecaa13523f23448f53bc51afe8279734a6bc254b403cf436fe0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7619d39a9f2a46d92fc3ec5ab36f60ec2af3a789b10decb4cbe1cdaef58d28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:04Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.068606 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96338136-6831-49d0-9eb9-77d1205c6afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b662bafa4fee3b2c7cd96bc49d6054b0fbe349e86ce1d4faf967edbaf8f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb8a3a71da68a51eabe1aaa24fffd1b84c35cb01945642622f4100125ee26223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpgln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:04Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.074882 4763 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.074922 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.074930 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.074945 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.074954 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:04Z","lastTransitionTime":"2025-12-05T11:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.080582 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gt7x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2515d84f-5782-48a0-9d7e-9704baebe26c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60a4a422ae6f3c2c05e5dc721674edbb0efda81d198ede925d3dd1cf99e09b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxkb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gt7x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:04Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.097692 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c630804aeb54582259498705c520ab6d2c191844861069ba7090226028a5716f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-92qpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:04Z is after 
2025-08-24T17:21:41Z" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.110481 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:04Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.125253 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:04Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.134363 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gxbp8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e177b5c0a61aeedb5c6bf5f6340f212506106592ad10807ed29d83a4a6949ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-9s79r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gxbp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:04Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.148422 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state
\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T11:48:54Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 11:48:54.822889 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 11:48:54.823032 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 11:48:54.823055 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 11:48:54.823033 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 11:48:54.823371 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764935318\\\\\\\\\\\\\\\" (2025-12-05 11:48:38 +0000 UTC to 2026-01-04 11:48:39 +0000 UTC (now=2025-12-05 11:48:54.823293564 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823498 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\"\\\\nI1205 11:48:54.823746 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764935329\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764935329\\\\\\\\\\\\\\\" (2025-12-05 10:48:49 
+0000 UTC to 2026-12-05 10:48:49 +0000 UTC (now=2025-12-05 11:48:54.823700006 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823836 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 11:48:54.823876 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 11:48:54.823905 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1205 11:48:54.826863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:04Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.159468 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafa237bfeb83d5895f8ad73cf881e1fd946ec837402ea271d4aa1b75469da0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:04Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.173821 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:04Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.177599 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.177631 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.177640 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.177655 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.177664 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:04Z","lastTransitionTime":"2025-12-05T11:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.186916 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38a2b47fe97303fc60f6ed0de5927b73993234170c4fdb02b45f0762978e292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:04Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.205854 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b42a5472-7487-4146-87a1-b83999821399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://029931f3223b6e6bdb93594f23ab9d28df4dcbf5cfa4f28e53a8636d05db0053\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPat
h\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbr2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:04Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.220492 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b31fb483-2a86-4e31-a611-ef92a9a4840f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452ef64735a1911b994bd2bfbf1702e3ff68648cd0b834c254dd24f0b17cd9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0370cc7bfd0ce95636924667b4e82dfffd0eee7f07b78d683b386bfee5dc47b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1951ae154cb3c61f4cf3cadec35e19591efe1f768993d1e51713195175547b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a3a0d8531f8696597e2389d8cd7b50130de93ac8be572b329ebe2eff8c6b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:04Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.232873 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kwkp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"737ae453-c22e-41ea-a10e-7e8f1f165467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q79tg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kwkp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:04Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.280121 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.280166 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.280179 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.280203 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.280215 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:04Z","lastTransitionTime":"2025-12-05T11:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.382377 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.382409 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.382432 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.382447 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.382457 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:04Z","lastTransitionTime":"2025-12-05T11:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.484586 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.484628 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.484640 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.484656 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.484669 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:04Z","lastTransitionTime":"2025-12-05T11:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.586815 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.586861 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.586876 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.586897 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.586912 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:04Z","lastTransitionTime":"2025-12-05T11:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.689664 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.689741 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.689768 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.689786 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.689797 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:04Z","lastTransitionTime":"2025-12-05T11:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.792537 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.792566 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.792576 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.792590 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.792600 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:04Z","lastTransitionTime":"2025-12-05T11:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.895375 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.895400 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.895410 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.895422 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.895431 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:04Z","lastTransitionTime":"2025-12-05T11:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.997374 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.997414 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.997423 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.997437 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:04 crc kubenswrapper[4763]: I1205 11:49:04.997449 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:04Z","lastTransitionTime":"2025-12-05T11:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.021334 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.100385 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.100426 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.100440 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.100456 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.100468 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:05Z","lastTransitionTime":"2025-12-05T11:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.202886 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.202926 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.202939 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.202957 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.202970 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:05Z","lastTransitionTime":"2025-12-05T11:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.305840 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.305884 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.305898 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.305916 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.305928 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:05Z","lastTransitionTime":"2025-12-05T11:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.408205 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.408255 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.408270 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.408293 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.408309 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:05Z","lastTransitionTime":"2025-12-05T11:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.528121 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.528635 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.528835 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.529067 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.529274 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:05Z","lastTransitionTime":"2025-12-05T11:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.632602 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.632682 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.632705 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.632740 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.632805 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:05Z","lastTransitionTime":"2025-12-05T11:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.736142 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.736184 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.736192 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.736206 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.736216 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:05Z","lastTransitionTime":"2025-12-05T11:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.783184 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.783254 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:49:05 crc kubenswrapper[4763]: E1205 11:49:05.783372 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.783538 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:49:05 crc kubenswrapper[4763]: E1205 11:49:05.783616 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:49:05 crc kubenswrapper[4763]: E1205 11:49:05.783849 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.803259 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b31fb483-2a86-4e31-a611-ef92a9a4840f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452ef64735a1911b994bd2bfbf1702e3ff68648cd0b834c254dd24f0b17cd9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0370cc7bfd0ce95636924667b4e82dfffd0eee7f07b78d683b386bfee5dc47b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1951ae154cb3c61f4cf3cadec35e19591efe1f768993d1e51713195175547b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/st
atic-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a3a0d8531f8696597e2389d8cd7b50130de93ac8be572b329ebe2eff8c6b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:05Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.821040 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kwkp4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"737ae453-c22e-41ea-a10e-7e8f1f165467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q79tg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kwkp4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:05Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.843718 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.843848 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.843878 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.843911 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.843933 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:05Z","lastTransitionTime":"2025-12-05T11:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.856305 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78167d26-8492-4035-99de-987c99ab12e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b874a765fa9a075a2662b615419d246c610bf2eea128d4bf53c16bda49f38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd875ae389a8e48a2f958f87ef711a4053a54da61f415ef
9a380f8b3de9d7012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12b24f5f9e73a1f4871d8d666bf6aabef79ec707d3df055e3e33a1a158dce464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25460658bdce8262dc2a0269161e8377dcb37f440c09c8eea9f7b397c9d3a5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8aab7caa6c02035af6faae1817d239b12bc2afcb11c7e14c4aea1df60372dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:05Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.874789 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd4640d76c7ecaa13523f23448f53bc51afe8279734a6bc254b403cf436fe0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7619d39a9f2a46d92fc3ec5ab36f60ec2af3a789b10decb4cbe1cdaef58d28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:05Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.886182 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96338136-6831-49d0-9eb9-77d1205c6afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b662bafa4fee3b2c7cd96bc49d6054b0fbe349e86ce1d4faf967edbaf8f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb8a3a71da68a51eabe1aaa24fffd1b84c35cb01945642622f4100125ee26223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpgln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:05Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.896027 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-gt7x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2515d84f-5782-48a0-9d7e-9704baebe26c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60a4a422ae6f3c2c05e5dc721674edbb0efda81d198ede925d3dd1cf99e09b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxkb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gt7x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:05Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.911758 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c630804aeb54582259498705c520ab6d2c191844861069ba7090226028a5716f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-92qpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:05Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.927721 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:05Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.938668 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:05Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.945848 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.945885 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.945914 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.945932 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.945942 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:05Z","lastTransitionTime":"2025-12-05T11:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.947609 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gxbp8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e177b5c0a61aeedb5c6bf5f6340f212506106592ad10807ed29d83a4a6949ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s79r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gxbp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:05Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.960119 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T11:48:54Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 11:48:54.822889 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 11:48:54.823032 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 11:48:54.823055 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 11:48:54.823033 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 11:48:54.823371 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764935318\\\\\\\\\\\\\\\" (2025-12-05 11:48:38 +0000 UTC to 2026-01-04 11:48:39 +0000 UTC (now=2025-12-05 11:48:54.823293564 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823498 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\"\\\\nI1205 11:48:54.823746 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764935329\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764935329\\\\\\\\\\\\\\\" (2025-12-05 10:48:49 +0000 UTC to 2026-12-05 10:48:49 +0000 UTC (now=2025-12-05 11:48:54.823700006 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823836 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 11:48:54.823876 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 11:48:54.823905 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1205 11:48:54.826863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:05Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.972202 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafa237bfeb83d5895f8ad73cf881e1fd946ec837402ea271d4aa1b75469da0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:05Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:05 crc kubenswrapper[4763]: I1205 11:49:05.986510 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:05Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:05.999992 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38a2b47fe97303fc60f6ed0de5927b73993234170c4fdb02b45f0762978e292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:05Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.017848 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b42a5472-7487-4146-87a1-b83999821399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://029931f3223b6e6bdb93594f23ab9d28df4dcbf5cfa4f28e53a8636d05db0053\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbr2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:06Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.024775 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbr2p_b42a5472-7487-4146-87a1-b83999821399/ovnkube-controller/0.log" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.027176 4763 generic.go:334] "Generic (PLEG): container finished" podID="b42a5472-7487-4146-87a1-b83999821399" containerID="029931f3223b6e6bdb93594f23ab9d28df4dcbf5cfa4f28e53a8636d05db0053" exitCode=1 Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.027239 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" event={"ID":"b42a5472-7487-4146-87a1-b83999821399","Type":"ContainerDied","Data":"029931f3223b6e6bdb93594f23ab9d28df4dcbf5cfa4f28e53a8636d05db0053"} Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.027830 4763 scope.go:117] "RemoveContainer" containerID="029931f3223b6e6bdb93594f23ab9d28df4dcbf5cfa4f28e53a8636d05db0053" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.042857 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-o
perator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T11:48:54Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 11:48:54.822889 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 11:48:54.823032 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 11:48:54.823055 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 11:48:54.823033 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 11:48:54.823371 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764935318\\\\\\\\\\\\\\\" (2025-12-05 11:48:38 +0000 UTC to 2026-01-04 11:48:39 +0000 UTC (now=2025-12-05 11:48:54.823293564 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823498 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\"\\\\nI1205 11:48:54.823746 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764935329\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764935329\\\\\\\\\\\\\\\" (2025-12-05 10:48:49 +0000 UTC to 2026-12-05 10:48:49 +0000 UTC (now=2025-12-05 11:48:54.823700006 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823836 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 11:48:54.823876 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 11:48:54.823905 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nF1205 11:48:54.826863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:06Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.048230 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.048264 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.048273 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.048289 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 
11:49:06.048299 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:06Z","lastTransitionTime":"2025-12-05T11:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.057326 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafa237bfeb83d5895f8ad73cf881e1fd946ec837402ea271d4aa1b75469da0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:06Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.069299 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:06Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.080492 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38a2b47fe97303fc60f6ed0de5927b73993234170c4fdb02b45f0762978e292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:06Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.099869 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b42a5472-7487-4146-87a1-b83999821399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://029931f3223b6e6bdb93594f23ab9d28df4dcbf5
cfa4f28e53a8636d05db0053\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://029931f3223b6e6bdb93594f23ab9d28df4dcbf5cfa4f28e53a8636d05db0053\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T11:49:05Z\\\",\\\"message\\\":\\\"all/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 11:49:05.268109 6052 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 11:49:05.268124 6052 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 11:49:05.268135 6052 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 11:49:05.268140 6052 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 11:49:05.268157 6052 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 11:49:05.268167 6052 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 11:49:05.268176 6052 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 11:49:05.268199 6052 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 11:49:05.268208 6052 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 11:49:05.268186 6052 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 11:49:05.268217 6052 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 11:49:05.268219 6052 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 11:49:05.268233 6052 factory.go:656] Stopping watch factory\\\\nI1205 11:49:05.268247 6052 ovnkube.go:599] Stopped ovnkube\\\\nI1205 11:49:05.268256 6052 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbr2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:06Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.113868 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b31fb483-2a86-4e31-a611-ef92a9a4840f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452ef64735a1911b994bd2bfbf1702e3ff68648cd0b834c254dd24f0b17cd9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0370cc7bfd0ce95636924667b4e82dfffd0eee7f07b78d683b386bfee5dc47b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba
8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1951ae154cb3c61f4cf3cadec35e19591efe1f768993d1e51713195175547b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a3a0d8531f8696597e2389d8cd7b50130de93ac8be572b329ebe2eff8c6b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:06Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.126072 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kwkp4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"737ae453-c22e-41ea-a10e-7e8f1f165467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q79tg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kwkp4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:06Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.146836 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78167d26-8492-4035-99de-987c99ab12e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b874a765fa9a075a2662b615419d246c610bf2eea128d4bf53c16bda49f38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd875ae389a8e48a2f958f87ef711a4053a54da61f415ef9a380f8b3de9d7012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12b24f5f9e73a1f4871d8d666bf6aabef79ec707d3df055e3e33a1a158dce464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25460658bdce8262dc2a0269161e8377dcb37f440c09c8eea9f7b397c9d3a5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8aab7caa6c02035af6faae1817d239b12bc2afcb11c7e14c4aea1df60372dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:06Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.150911 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.150940 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.150950 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.150968 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.150977 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:06Z","lastTransitionTime":"2025-12-05T11:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.159722 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd4640d76c7ecaa13523f23448f53bc51afe8279734a6bc254b403cf436fe0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7619d39a9f2a46d92fc3ec5ab36f60ec2af3a789b10decb4cbe1cdaef58d28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:06Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.169742 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96338136-6831-49d0-9eb9-77d1205c6afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b662bafa4fee3b2c7cd96bc49d6054b0fbe349e86ce1d4faf967edbaf8f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb8a3a71da68a51eabe1aaa24fffd1b84c35cb01945642622f4100125ee26223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpgln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:06Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.180177 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-gt7x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2515d84f-5782-48a0-9d7e-9704baebe26c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60a4a422ae6f3c2c05e5dc721674edbb0efda81d198ede925d3dd1cf99e09b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxkb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gt7x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:06Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.192626 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c630804aeb54582259498705c520ab6d2c191844861069ba7090226028a5716f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-92qpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:06Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.203781 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:06Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.216601 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:06Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.225425 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gxbp8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e177b5c0a61aeedb5c6bf5f6340f212506106592ad10807ed29d83a4a6949ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-9s79r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gxbp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:06Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.253088 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.253132 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.253145 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.253162 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.253176 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:06Z","lastTransitionTime":"2025-12-05T11:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.355132 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.355161 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.355169 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.355182 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.355190 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:06Z","lastTransitionTime":"2025-12-05T11:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.457028 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.457065 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.457077 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.457094 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.457106 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:06Z","lastTransitionTime":"2025-12-05T11:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.559082 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.559131 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.559143 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.559162 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.559176 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:06Z","lastTransitionTime":"2025-12-05T11:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.661356 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.661393 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.661403 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.661418 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.661428 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:06Z","lastTransitionTime":"2025-12-05T11:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.763650 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.763700 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.763717 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.763740 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.763755 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:06Z","lastTransitionTime":"2025-12-05T11:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.865563 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.865596 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.865606 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.865619 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.865632 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:06Z","lastTransitionTime":"2025-12-05T11:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.968190 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.968222 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.968229 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.968243 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:06 crc kubenswrapper[4763]: I1205 11:49:06.968251 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:06Z","lastTransitionTime":"2025-12-05T11:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.032816 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbr2p_b42a5472-7487-4146-87a1-b83999821399/ovnkube-controller/0.log" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.037335 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" event={"ID":"b42a5472-7487-4146-87a1-b83999821399","Type":"ContainerStarted","Data":"c5753659f8a4501c4eb8aee2d090eae61f2d0f6d9d03baec4f2359d2fecbce39"} Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.037679 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.062136 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"run
ning\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T11:48:54Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 11:48:54.822889 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 11:48:54.823032 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 11:48:54.823055 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 11:48:54.823033 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 11:48:54.823371 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764935318\\\\\\\\\\\\\\\" (2025-12-05 11:48:38 +0000 UTC to 2026-01-04 11:48:39 +0000 UTC (now=2025-12-05 11:48:54.823293564 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823498 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\"\\\\nI1205 11:48:54.823746 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764935329\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764935329\\\\\\\\\\\\\\\" (2025-12-05 10:48:49 +0000 UTC to 
2026-12-05 10:48:49 +0000 UTC (now=2025-12-05 11:48:54.823700006 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823836 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 11:48:54.823876 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 11:48:54.823905 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1205 11:48:54.826863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:07Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.070829 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.070871 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:07 crc 
kubenswrapper[4763]: I1205 11:49:07.070880 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.070893 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.070902 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:07Z","lastTransitionTime":"2025-12-05T11:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.072838 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafa237bfeb83d5895f8ad73cf881e1fd946ec837402ea271d4aa1b75469da0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:07Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.086375 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:07Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.101932 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38a2b47fe97303fc60f6ed0de5927b73993234170c4fdb02b45f0762978e292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:07Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.133240 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b42a5472-7487-4146-87a1-b83999821399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5753659f8a4501c4eb8aee2d090eae61f2d0f6d
9d03baec4f2359d2fecbce39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://029931f3223b6e6bdb93594f23ab9d28df4dcbf5cfa4f28e53a8636d05db0053\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T11:49:05Z\\\",\\\"message\\\":\\\"all/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 11:49:05.268109 6052 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 11:49:05.268124 6052 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 11:49:05.268135 6052 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 11:49:05.268140 6052 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 11:49:05.268157 6052 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 11:49:05.268167 6052 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 11:49:05.268176 6052 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 11:49:05.268199 6052 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 11:49:05.268208 6052 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 11:49:05.268186 6052 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 11:49:05.268217 6052 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 11:49:05.268219 6052 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 11:49:05.268233 6052 factory.go:656] Stopping watch factory\\\\nI1205 11:49:05.268247 6052 ovnkube.go:599] Stopped ovnkube\\\\nI1205 11:49:05.268256 6052 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"cont
ainerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbr2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:07Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.152865 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b31fb483-2a86-4e31-a611-ef92a9a4840f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452ef64735a1911b994bd2bfbf1702e3ff68648cd0b834c254dd24f0b17cd9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0370cc7bfd0ce95636924667b4e82dfffd0eee7f07b78d683b386bfee5dc47b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1951ae154cb3c61f4cf3cadec35e19591efe1f768993d1e51713195175547b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a3a0d8531f8696597e2389d8cd7b50130de93ac8be572b329ebe2eff8c6b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:07Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.169913 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kwkp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"737ae453-c22e-41ea-a10e-7e8f1f165467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q79tg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kwkp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:07Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.174408 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.174443 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.174459 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.174477 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.174491 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:07Z","lastTransitionTime":"2025-12-05T11:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.194589 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78167d26-8492-4035-99de-987c99ab12e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b874a765fa9a075a2662b615419d246c610bf2eea128d4bf53c16bda49f38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd875ae389a8e48a2f958f87ef711a4053a54da61f415ef9a380f8b3de9d7012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12b24f5f9e73a1f4871d8d666bf6aabef79ec707d3df055e3e33a1a158dce464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25460658bdce8262dc2a0269161e8377dcb37f440c09c8eea9f7b397c9d3a5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8aab7caa6c02035af6faae1817d239b12bc2afcb11c7e14c4aea1df60372dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-05T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:07Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.206995 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd4640d76c7ecaa13523f23448f53bc51afe8279734a6bc254b403cf436fe0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7619d39a9f2a46d92fc3ec5ab36f60ec2af3a789b10decb4cbe1cdaef58d28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:07Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.218534 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96338136-6831-49d0-9eb9-77d1205c6afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b662bafa4fee3b2c7cd96bc49d6054b0fbe349e86ce1d4faf967edbaf8f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb8a3a71da68a51eabe1aaa24fffd1b84c35cb01945642622f4100125ee26223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpgln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:07Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.228840 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-gt7x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2515d84f-5782-48a0-9d7e-9704baebe26c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60a4a422ae6f3c2c05e5dc721674edbb0efda81d198ede925d3dd1cf99e09b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxkb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gt7x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:07Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.244898 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c630804aeb54582259498705c520ab6d2c191844861069ba7090226028a5716f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-92qpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:07Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.259479 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:07Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.271842 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:07Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.279195 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.279236 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.279247 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.279264 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.279277 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:07Z","lastTransitionTime":"2025-12-05T11:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.284059 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gxbp8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e177b5c0a61aeedb5c6bf5f6340f212506106592ad10807ed29d83a4a6949ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s79r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gxbp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:07Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.382245 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.382313 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.382336 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.382363 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.382384 4763 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:07Z","lastTransitionTime":"2025-12-05T11:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.485203 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.485280 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.485302 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.485332 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.485350 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:07Z","lastTransitionTime":"2025-12-05T11:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.527273 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.527349 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.527368 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.527393 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.527412 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:07Z","lastTransitionTime":"2025-12-05T11:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:07 crc kubenswrapper[4763]: E1205 11:49:07.548850 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"424a6449-0727-4c68-964f-19010b8ff35b\\\",\\\"systemUUID\\\":\\\"0f54ec5d-9350-4cf9-9a1e-213de5460351\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:07Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.554624 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.554686 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.554696 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.554718 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.554786 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:07Z","lastTransitionTime":"2025-12-05T11:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:07 crc kubenswrapper[4763]: E1205 11:49:07.574170 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"424a6449-0727-4c68-964f-19010b8ff35b\\\",\\\"systemUUID\\\":\\\"0f54ec5d-9350-4cf9-9a1e-213de5460351\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:07Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.578323 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.578363 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
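Every one of these patch attempts dies the same way: the node.network-node-identity.openshift.io admission webhook at https://127.0.0.1:9743 is serving a certificate that expired on 2025-08-24T17:21:41Z while the node clock reads 2025-12-05, so the kubelet's TLS handshake is rejected before the status patch can even be admitted. A minimal Go probe along the lines below, run on the node itself, would confirm what the handshake sees; the endpoint address is taken from the log line, but the program and its InsecureSkipVerify setting are illustrative assumptions, not anything the kubelet or OpenShift ships:

// certprobe.go: hypothetical diagnostic sketch, not part of OpenShift.
// It dials the webhook endpoint named in the log and prints the validity
// window of whatever certificate the server presents, skipping verification
// so the handshake succeeds even though that certificate is expired.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
		InsecureSkipVerify: true, // inspect the certificate, do not trust it
	})
	if err != nil {
		log.Fatalf("handshake failed: %v", err)
	}
	defer conn.Close()

	now := time.Now()
	for _, cert := range conn.ConnectionState().PeerCertificates {
		fmt.Printf("subject=%s notBefore=%s notAfter=%s expired=%t\n",
			cert.Subject,
			cert.NotBefore.Format(time.RFC3339),
			cert.NotAfter.Format(time.RFC3339),
			now.After(cert.NotAfter))
	}
}

Dialing with verification disabled lets the handshake complete so the expired certificate can be inspected; with verification on, the dial would fail with the same x509 error the kubelet keeps logging.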
event="NodeHasNoDiskPressure" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.578375 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.578396 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.578411 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:07Z","lastTransitionTime":"2025-12-05T11:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:07 crc kubenswrapper[4763]: E1205 11:49:07.592263 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"424a6449-0727-4c68-964f-19010b8ff35b\\\",\\\"systemUUID\\\":\\\"0f54ec5d-9350-4cf9-9a1e-213de5460351\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:07Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.596281 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.596313 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
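Each retry rebuilds the same strategic-merge patch, which is why the payload repeats verbatim: the $setElementOrder/conditions directive pins the ordering of the condition list, and the body carries allocatable and capacity resources, the four node conditions, the node's cached image inventory (the bulk of the payload), and nodeInfo. A toy sketch of the conditions portion of that shape; the struct below mirrors the JSON field names visible in the log and is not the kubelet's actual type:

// patchshape.go: illustrative only; reproduces the skeleton of the
// conditions portion of the node-status strategic-merge patch in the log.
package main

import (
	"encoding/json"
	"fmt"
)

type condition struct {
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Message            string `json:"message"`
	Reason             string `json:"reason"`
	Status             string `json:"status"`
	Type               string `json:"type"`
}

func main() {
	patch := map[string]any{
		"status": map[string]any{
			// strategic-merge-patch directive: keep this list ordering
			"$setElementOrder/conditions": []map[string]string{
				{"type": "MemoryPressure"}, {"type": "DiskPressure"},
				{"type": "PIDPressure"}, {"type": "Ready"},
			},
			"conditions": []condition{{
				LastHeartbeatTime:  "2025-12-05T11:49:07Z",
				LastTransitionTime: "2025-12-05T11:49:07Z",
				Message:            "kubelet has sufficient memory available",
				Reason:             "KubeletHasSufficientMemory",
				Status:             "False",
				Type:               "MemoryPressure",
			}},
		},
	}
	out, _ := json.MarshalIndent(patch, "", "  ") // error ignored in this toy
	fmt.Println(string(out))
}

The API server never gets to apply any of this: the admission webhook call fails first, so the node object keeps its stale status.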
event="NodeHasNoDiskPressure" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.596323 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.596341 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.596356 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:07Z","lastTransitionTime":"2025-12-05T11:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:07 crc kubenswrapper[4763]: E1205 11:49:07.611030 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"424a6449-0727-4c68-964f-19010b8ff35b\\\",\\\"systemUUID\\\":\\\"0f54ec5d-9350-4cf9-9a1e-213de5460351\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:07Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.615019 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.615057 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.615070 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.615090 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.615104 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:07Z","lastTransitionTime":"2025-12-05T11:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:07 crc kubenswrapper[4763]: E1205 11:49:07.636720 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"424a6449-0727-4c68-964f-19010b8ff35b\\\",\\\"systemUUID\\\":\\\"0f54ec5d-9350-4cf9-9a1e-213de5460351\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:07Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:07 crc kubenswrapper[4763]: E1205 11:49:07.636920 4763 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.638627 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
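The transition above is the kubelet's bounded retry running out: a burst of "will retry" errors and then "update node status exceeds retry count". The shape of that loop, sketched in simplified form below; the constant name follows the kubelet's nodeStatusUpdateRetry (5 at the time of writing), while everything else is illustrative, not the code in kubelet_node_status.go:

// retrysketch.go: illustrative only; mirrors the bounded-retry shape of
// the kubelet's node-status update, not the actual kubelet implementation.
package main

import (
	"errors"
	"fmt"
)

const nodeStatusUpdateRetry = 5 // attempts per sync before giving up

var errWebhook = errors.New(`Internal error occurred: failed calling webhook "node.network-node-identity.openshift.io"`)

// tryUpdateNodeStatus stands in for the PATCH against the API server; here
// it always fails, the way an expired admission-webhook certificate makes it fail.
func tryUpdateNodeStatus(attempt int) error { return errWebhook }

func main() {
	for i := 0; i < nodeStatusUpdateRetry; i++ {
		if err := tryUpdateNodeStatus(i); err != nil {
			fmt.Printf("Error updating node status, will retry: %v\n", err)
			continue
		}
		return
	}
	// This is the line the journal shows once all attempts are burned.
	fmt.Println("Unable to update node status: update node status exceeds retry count")
}

Because the sync loop runs again on the next node-status interval, the same burst of errors keeps recurring in the journal until the certificate problem is resolved.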
event="NodeHasSufficientMemory" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.638690 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.638709 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.638735 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.638752 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:07Z","lastTransitionTime":"2025-12-05T11:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.741882 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.741944 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.741958 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.741975 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.741988 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:07Z","lastTransitionTime":"2025-12-05T11:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.784063 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.784128 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:49:07 crc kubenswrapper[4763]: E1205 11:49:07.784272 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.784288 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:49:07 crc kubenswrapper[4763]: E1205 11:49:07.784479 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:49:07 crc kubenswrapper[4763]: E1205 11:49:07.784687 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.844155 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.844233 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.844255 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.844282 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.844301 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:07Z","lastTransitionTime":"2025-12-05T11:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.947296 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.947346 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.947362 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.947389 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:07 crc kubenswrapper[4763]: I1205 11:49:07.947405 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:07Z","lastTransitionTime":"2025-12-05T11:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.049450 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.049500 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.049512 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.049528 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.049540 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:08Z","lastTransitionTime":"2025-12-05T11:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.152138 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.152190 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.152202 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.152219 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.152235 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:08Z","lastTransitionTime":"2025-12-05T11:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.254476 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.254506 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.254514 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.254550 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.254561 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:08Z","lastTransitionTime":"2025-12-05T11:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.357603 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.357664 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.357674 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.357716 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.357788 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:08Z","lastTransitionTime":"2025-12-05T11:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.461247 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.461290 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.461305 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.461322 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.461333 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:08Z","lastTransitionTime":"2025-12-05T11:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.564631 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.564678 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.564691 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.564713 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.564726 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:08Z","lastTransitionTime":"2025-12-05T11:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.669052 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.669125 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.669184 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.669227 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.669238 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:08Z","lastTransitionTime":"2025-12-05T11:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.772272 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.772532 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.772618 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.772715 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.772885 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:08Z","lastTransitionTime":"2025-12-05T11:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.875847 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.875900 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.875917 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.875940 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.875957 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:08Z","lastTransitionTime":"2025-12-05T11:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.980036 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.980127 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.980153 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.980184 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:08 crc kubenswrapper[4763]: I1205 11:49:08.980208 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:08Z","lastTransitionTime":"2025-12-05T11:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.047403 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbr2p_b42a5472-7487-4146-87a1-b83999821399/ovnkube-controller/1.log"
Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.048131 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbr2p_b42a5472-7487-4146-87a1-b83999821399/ovnkube-controller/0.log"
Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.051708 4763 generic.go:334] "Generic (PLEG): container finished" podID="b42a5472-7487-4146-87a1-b83999821399" containerID="c5753659f8a4501c4eb8aee2d090eae61f2d0f6d9d03baec4f2359d2fecbce39" exitCode=1
Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.051800 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" event={"ID":"b42a5472-7487-4146-87a1-b83999821399","Type":"ContainerDied","Data":"c5753659f8a4501c4eb8aee2d090eae61f2d0f6d9d03baec4f2359d2fecbce39"}
Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.051885 4763 scope.go:117] "RemoveContainer" containerID="029931f3223b6e6bdb93594f23ab9d28df4dcbf5cfa4f28e53a8636d05db0053"
Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.053067 4763 scope.go:117] "RemoveContainer" containerID="c5753659f8a4501c4eb8aee2d090eae61f2d0f6d9d03baec4f2359d2fecbce39"
Dec 05 11:49:09 crc kubenswrapper[4763]: E1205 11:49:09.053370 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-xbr2p_openshift-ovn-kubernetes(b42a5472-7487-4146-87a1-b83999821399)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" podUID="b42a5472-7487-4146-87a1-b83999821399"
Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.083169 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.083218 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.083231 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
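At this point ovnkube-controller has crashed twice in quick succession (the PLEG reports instance c5753659... exiting with code 1 while the previous instance 029931f3... is being removed), so the kubelet parks it in CrashLoopBackOff starting at 10s. In upstream kubelets the restart delay doubles on each crash up to a five-minute cap; a small sketch of that progression (the 10s start is from the log, the doubling and cap are standard kubelet behavior):

    // backoff.go: the kubelet's crash-restart delay progression.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        delay := 10 * time.Second   // "back-off 10s" from the log above
        maxDelay := 5 * time.Minute // the kubelet's maximum container backoff
        for crash := 1; crash <= 8; crash++ {
            fmt.Printf("crash %d: wait %v before restarting\n", crash, delay)
            delay *= 2
            if delay > maxDelay {
                delay = maxDelay // held at the cap rather than growing further
            }
        }
    }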
node="crc" event="NodeHasSufficientPID" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.083250 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.083263 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:09Z","lastTransitionTime":"2025-12-05T11:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.084879 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b42a5472-7487-4146-87a1-b83999821399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5753659f8a4501c4eb8aee2d090eae61f2d0f6d
9d03baec4f2359d2fecbce39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://029931f3223b6e6bdb93594f23ab9d28df4dcbf5cfa4f28e53a8636d05db0053\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T11:49:05Z\\\",\\\"message\\\":\\\"all/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 11:49:05.268109 6052 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 11:49:05.268124 6052 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 11:49:05.268135 6052 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 11:49:05.268140 6052 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 11:49:05.268157 6052 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 11:49:05.268167 6052 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 11:49:05.268176 6052 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 11:49:05.268199 6052 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 11:49:05.268208 6052 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 11:49:05.268186 6052 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 11:49:05.268217 6052 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 11:49:05.268219 6052 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 11:49:05.268233 6052 factory.go:656] Stopping watch factory\\\\nI1205 11:49:05.268247 6052 ovnkube.go:599] Stopped ovnkube\\\\nI1205 11:49:05.268256 6052 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5753659f8a4501c4eb8aee2d090eae61f2d0f6d9d03baec4f2359d2fecbce39\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T11:49:08Z\\\",\\\"message\\\":\\\" 6186 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1205 11:49:06.861136 6186 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 11:49:06.861199 6186 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 11:49:06.861223 6186 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 11:49:06.861239 6186 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 11:49:06.861255 6186 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 11:49:06.861286 6186 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 11:49:06.861292 6186 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 11:49:06.861309 6186 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 11:49:06.861327 6186 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 11:49:06.861339 6186 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 11:49:06.861363 6186 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 11:49:06.861385 6186 factory.go:656] Stopping watch factory\\\\nI1205 11:49:06.861409 6186 handler.go:208] Removed *v1.Node event 
handler 7\\\\nI1205 11:49:06.861432 6186 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 11:49:06.861458 6186 ovnkube.go:599] Stopped ovnkube\\\\nI1205 11:49:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"c
ri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbr2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:09Z is after 2025-08-24T17:21:41Z"
Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.106138 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T11:48:54Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 11:48:54.822889 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 11:48:54.823032 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 11:48:54.823055 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 11:48:54.823033 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 11:48:54.823371 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764935318\\\\\\\\\\\\\\\" (2025-12-05 11:48:38 +0000 UTC to 2026-01-04 11:48:39 +0000 UTC (now=2025-12-05 11:48:54.823293564 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823498 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\"\\\\nI1205 11:48:54.823746 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764935329\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764935329\\\\\\\\\\\\\\\" (2025-12-05 10:48:49 +0000 UTC to 2026-12-05 10:48:49 +0000 UTC (now=2025-12-05 11:48:54.823700006 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823836 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 11:48:54.823876 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 11:48:54.823905 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1205 11:48:54.826863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:09Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.123132 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafa237bfeb83d5895f8ad73cf881e1fd946ec837402ea271d4aa1b75469da0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:09Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.126125 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptcsb"] Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.126960 4763 util.go:30] "No sandbox for pod can be found. 
Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.130574 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.130997 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.139464 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:09Z is after 2025-08-24T17:21:41Z"
Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.154060 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38a2b47fe97303fc60f6ed0de5927b73993234170c4fdb02b45f0762978e292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:09Z is after 2025-08-24T17:21:41Z"
Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.158135 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g4jd\" (UniqueName: \"kubernetes.io/projected/d71e854a-fd6a-4efa-9cf4-a5dc75a1901f-kube-api-access-9g4jd\") pod \"ovnkube-control-plane-749d76644c-ptcsb\" (UID: \"d71e854a-fd6a-4efa-9cf4-a5dc75a1901f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptcsb"
Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.158312 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d71e854a-fd6a-4efa-9cf4-a5dc75a1901f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ptcsb\" (UID: \"d71e854a-fd6a-4efa-9cf4-a5dc75a1901f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptcsb"
Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.158507 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d71e854a-fd6a-4efa-9cf4-a5dc75a1901f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ptcsb\" (UID: \"d71e854a-fd6a-4efa-9cf4-a5dc75a1901f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptcsb"
Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.158624 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d71e854a-fd6a-4efa-9cf4-a5dc75a1901f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ptcsb\" (UID: \"d71e854a-fd6a-4efa-9cf4-a5dc75a1901f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptcsb"
Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.169446 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b31fb483-2a86-4e31-a611-ef92a9a4840f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452ef64735a1911b994bd2bfbf1702e3ff68648cd0b834c254dd24f0b17cd9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0370cc7bfd0ce95636924667b4e82dfffd0eee7f07b78d683b386bfee5dc47b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1951ae154cb3c61f4cf3cadec35e19591efe1f768993d1e51713195175547b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a3a0d8531f8696597e2389d8cd7b50130de93ac8be572b329ebe2eff8c6b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:09Z is after 2025-08-24T17:21:41Z"
Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.186108 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.186180 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.186194 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.186213 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.186226 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:09Z","lastTransitionTime":"2025-12-05T11:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.188646 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kwkp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"737ae453-c22e-41ea-a10e-7e8f1f165467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q79tg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kwkp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:09Z is after 2025-08-24T17:21:41Z"
Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.214322 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78167d26-8492-4035-99de-987c99ab12e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b874a765fa9a075a2662b615419d246c610bf2eea128d4bf53c16bda49f38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd875ae389a8e48a2f958f87ef711a4053a54da61f415ef9a380f8b3de9d7012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12b24f5f9e73a1f4871d8d666bf6aabef79ec707d3df055e3e33a1a158dce464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25460658bdce8262dc2a0269161e8377dcb37f440c09c8eea9f7b397c9d3a5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8aab7caa6c02035af6faae1817d239b12bc2afcb11c7e14c4aea1df60372dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:09Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.238349 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd4640d76c7ecaa13523f23448f53bc51afe8279734a6bc254b403cf436fe0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7619d39a9f2a46d92fc3ec5ab36f60ec2af3a789b10decb4cbe1cdaef58d28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:09Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.254226 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96338136-6831-49d0-9eb9-77d1205c6afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b662bafa4fee3b2c7cd96bc49d6054b0fbe349e86ce1d4faf967edbaf8f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb8a3a71da68a51eabe1aaa24fffd1b84c35cb01945642622f4100125ee26223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpgln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:09Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.259929 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9g4jd\" (UniqueName: \"kubernetes.io/projected/d71e854a-fd6a-4efa-9cf4-a5dc75a1901f-kube-api-access-9g4jd\") pod \"ovnkube-control-plane-749d76644c-ptcsb\" (UID: \"d71e854a-fd6a-4efa-9cf4-a5dc75a1901f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptcsb" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.260053 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d71e854a-fd6a-4efa-9cf4-a5dc75a1901f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ptcsb\" (UID: \"d71e854a-fd6a-4efa-9cf4-a5dc75a1901f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptcsb" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.260134 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d71e854a-fd6a-4efa-9cf4-a5dc75a1901f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ptcsb\" (UID: \"d71e854a-fd6a-4efa-9cf4-a5dc75a1901f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptcsb" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.260173 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d71e854a-fd6a-4efa-9cf4-a5dc75a1901f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ptcsb\" (UID: \"d71e854a-fd6a-4efa-9cf4-a5dc75a1901f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptcsb" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.261640 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d71e854a-fd6a-4efa-9cf4-a5dc75a1901f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ptcsb\" (UID: \"d71e854a-fd6a-4efa-9cf4-a5dc75a1901f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptcsb" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.261792 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d71e854a-fd6a-4efa-9cf4-a5dc75a1901f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ptcsb\" (UID: \"d71e854a-fd6a-4efa-9cf4-a5dc75a1901f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptcsb" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.268294 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d71e854a-fd6a-4efa-9cf4-a5dc75a1901f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ptcsb\" (UID: \"d71e854a-fd6a-4efa-9cf4-a5dc75a1901f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptcsb" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.280452 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gt7x4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2515d84f-5782-48a0-9d7e-9704baebe26c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60a4a422ae6f3c2c05e5dc721674edbb0efda81d198ede925d3dd1cf99e09b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxkb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gt7x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:09Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.290011 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.290045 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.290057 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.290075 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.290087 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:09Z","lastTransitionTime":"2025-12-05T11:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.293121 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g4jd\" (UniqueName: \"kubernetes.io/projected/d71e854a-fd6a-4efa-9cf4-a5dc75a1901f-kube-api-access-9g4jd\") pod \"ovnkube-control-plane-749d76644c-ptcsb\" (UID: \"d71e854a-fd6a-4efa-9cf4-a5dc75a1901f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptcsb" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.305743 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c630804aeb54582259498705c520ab6d2c191844861069ba7090226028a5716f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\
\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-92qpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:09Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.320643 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:09Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.337051 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:09Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.351501 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gxbp8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e177b5c0a61aeedb5c6bf5f6340f212506106592ad10807ed29d83a4a6949ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s79r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gxbp8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:09Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.367532 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gt7x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2515d84f-5782-48a0-9d7e-9704baebe26c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60a4a422ae6f3c2c05e5dc721674edbb0efda81d198ede925d3dd1cf99e09b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxkb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gt7x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:09Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.385205 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c630804aeb54582259498705c520ab6d2c191844861069ba7090226028a5716f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-92qpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:09Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.392057 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.392097 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:09 crc 
kubenswrapper[4763]: I1205 11:49:09.392109 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.392127 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.392140 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:09Z","lastTransitionTime":"2025-12-05T11:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.408192 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78167d26-8492-4035-99de-987c99ab12e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b874a765fa9a075a2662b615419d246c610bf2eea128d4bf53c16bda49f38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd875ae389a8e48a2f958f87ef711a4053a54da61f415ef9a380f8b3de9d7012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12b24f5f9e73a1f4871d8d666bf6aabef79ec707d3df055e3e33a1a158dce464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25460658bdce8262dc2a0269161e8377dcb37f440c09c8eea9f7b397c9d3a5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8aab7caa6c02035af6faae1817d239b12bc2afcb11c7e14c4aea1df60372dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:09Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.421581 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd4640d76c7ecaa13523f23448f53bc51afe8279734a6bc254b403cf436fe0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7619d39a9f2a46d92fc3ec5ab36f60ec2af3a789b10decb4cbe1cdaef58d28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:09Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.435168 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96338136-6831-49d0-9eb9-77d1205c6afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b662bafa4fee3b2c7cd96bc49d6054b0fbe349e86ce1d4faf967edbaf8f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb8a3a71da68a51eabe1aaa24fffd1b84c35cb01945642622f4100125ee26223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpgln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:09Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.449546 4763 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptcsb" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.456376 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:09Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:09 crc kubenswrapper[4763]: W1205 11:49:09.464590 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd71e854a_fd6a_4efa_9cf4_a5dc75a1901f.slice/crio-320067f64e36777799f243a1d8e3a27ee97c06b813c5ec9eabec6664c20dc42e WatchSource:0}: Error finding container 320067f64e36777799f243a1d8e3a27ee97c06b813c5ec9eabec6664c20dc42e: Status 404 returned error can't find the container with id 320067f64e36777799f243a1d8e3a27ee97c06b813c5ec9eabec6664c20dc42e Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.473664 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:09Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.489432 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gxbp8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e177b5c0a61aeedb5c6bf5f6340f212506106592ad10807ed29d83a4a6949ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s79r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gxbp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:09Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.494825 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.494864 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.494873 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.494893 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.494903 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:09Z","lastTransitionTime":"2025-12-05T11:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.507159 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:09Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.520855 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38a2b47fe97303fc60f6ed0de5927b73993234170c4fdb02b45f0762978e292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:09Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.539573 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b42a5472-7487-4146-87a1-b83999821399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5753659f8a4501c4eb8aee2d090eae61f2d0f6d
9d03baec4f2359d2fecbce39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://029931f3223b6e6bdb93594f23ab9d28df4dcbf5cfa4f28e53a8636d05db0053\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T11:49:05Z\\\",\\\"message\\\":\\\"all/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 11:49:05.268109 6052 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 11:49:05.268124 6052 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 11:49:05.268135 6052 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 11:49:05.268140 6052 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 11:49:05.268157 6052 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 11:49:05.268167 6052 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 11:49:05.268176 6052 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 11:49:05.268199 6052 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 11:49:05.268208 6052 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 11:49:05.268186 6052 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 11:49:05.268217 6052 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 11:49:05.268219 6052 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 11:49:05.268233 6052 factory.go:656] Stopping watch factory\\\\nI1205 11:49:05.268247 6052 ovnkube.go:599] Stopped ovnkube\\\\nI1205 11:49:05.268256 6052 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5753659f8a4501c4eb8aee2d090eae61f2d0f6d9d03baec4f2359d2fecbce39\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T11:49:08Z\\\",\\\"message\\\":\\\" 6186 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1205 11:49:06.861136 6186 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 11:49:06.861199 6186 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 11:49:06.861223 6186 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 11:49:06.861239 6186 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 11:49:06.861255 6186 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 11:49:06.861286 6186 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 11:49:06.861292 6186 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 11:49:06.861309 6186 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 11:49:06.861327 6186 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 11:49:06.861339 6186 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 11:49:06.861363 6186 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 11:49:06.861385 6186 factory.go:656] Stopping watch factory\\\\nI1205 11:49:06.861409 6186 handler.go:208] Removed *v1.Node event 
handler 7\\\\nI1205 11:49:06.861432 6186 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 11:49:06.861458 6186 ovnkube.go:599] Stopped ovnkube\\\\nI1205 11:49:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"c
ri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbr2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:09Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.554712 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T11:48:54Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 11:48:54.822889 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 11:48:54.823032 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 11:48:54.823055 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 11:48:54.823033 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 11:48:54.823371 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764935318\\\\\\\\\\\\\\\" (2025-12-05 11:48:38 +0000 UTC to 2026-01-04 11:48:39 +0000 UTC (now=2025-12-05 11:48:54.823293564 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823498 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\"\\\\nI1205 11:48:54.823746 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764935329\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764935329\\\\\\\\\\\\\\\" (2025-12-05 10:48:49 +0000 UTC to 2026-12-05 10:48:49 +0000 UTC (now=2025-12-05 11:48:54.823700006 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823836 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 11:48:54.823876 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 11:48:54.823905 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1205 11:48:54.826863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:09Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.568579 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafa237bfeb83d5895f8ad73cf881e1fd946ec837402ea271d4aa1b75469da0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:09Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.580875 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptcsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d71e854a-fd6a-4efa-9cf4-a5dc75a1901f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g4jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g4jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:49:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ptcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:09Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.592824 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b31fb483-2a86-4e31-a611-ef92a9a4840f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452ef64735a1911b994bd2bfbf1702e3ff68648cd0b834c254dd24f0b17cd9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0370cc7bfd0ce95636924667b4e82dfffd0eee7f07b78d683b386bfee5dc47b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1951ae154cb3c61f4cf3cadec35e19591efe1f768993d1e51713195175547b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a3a0d8531f8696597e2389d8cd7b50130de93ac8be572b329ebe2eff8c6b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:09Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.596857 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.596899 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.596911 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.596931 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.596945 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:09Z","lastTransitionTime":"2025-12-05T11:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.605302 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kwkp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"737ae453-c22e-41ea-a10e-7e8f1f165467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q79tg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kwkp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:09Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.699442 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.699487 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.699498 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.699515 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.699528 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:09Z","lastTransitionTime":"2025-12-05T11:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.783628 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.783687 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:49:09 crc kubenswrapper[4763]: E1205 11:49:09.783789 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.783687 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:49:09 crc kubenswrapper[4763]: E1205 11:49:09.783885 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:49:09 crc kubenswrapper[4763]: E1205 11:49:09.784058 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.801949 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.801995 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.802008 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.802029 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.802047 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:09Z","lastTransitionTime":"2025-12-05T11:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.904300 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.904343 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.904356 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.904373 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:09 crc kubenswrapper[4763]: I1205 11:49:09.904385 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:09Z","lastTransitionTime":"2025-12-05T11:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.006906 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.006936 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.006948 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.006967 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.006980 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:10Z","lastTransitionTime":"2025-12-05T11:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.055799 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptcsb" event={"ID":"d71e854a-fd6a-4efa-9cf4-a5dc75a1901f","Type":"ContainerStarted","Data":"d865b70b752e983282ac08c190f254185cbf7bba8d13c0238261fd7b880ac651"} Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.055840 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptcsb" event={"ID":"d71e854a-fd6a-4efa-9cf4-a5dc75a1901f","Type":"ContainerStarted","Data":"40808a5c1a367c907ecf165483a1cc5fc59027c61342399abb612be6fed84fdb"} Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.055849 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptcsb" event={"ID":"d71e854a-fd6a-4efa-9cf4-a5dc75a1901f","Type":"ContainerStarted","Data":"320067f64e36777799f243a1d8e3a27ee97c06b813c5ec9eabec6664c20dc42e"} Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.058640 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbr2p_b42a5472-7487-4146-87a1-b83999821399/ovnkube-controller/1.log" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.075140 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:10Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.087777 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:10Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.099312 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gxbp8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e177b5c0a61aeedb5c6bf5f6340f212506106592ad10807ed29d83a4a6949ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s79r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gxbp8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:10Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.110278 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.110328 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.110338 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.110359 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.110374 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:10Z","lastTransitionTime":"2025-12-05T11:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.113297 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T11:48:54Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 11:48:54.822889 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 11:48:54.823032 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 11:48:54.823055 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 11:48:54.823033 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 11:48:54.823371 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764935318\\\\\\\\\\\\\\\" (2025-12-05 11:48:38 +0000 UTC to 2026-01-04 11:48:39 +0000 UTC (now=2025-12-05 11:48:54.823293564 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823498 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\"\\\\nI1205 11:48:54.823746 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764935329\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764935329\\\\\\\\\\\\\\\" (2025-12-05 10:48:49 +0000 UTC to 2026-12-05 10:48:49 +0000 UTC (now=2025-12-05 11:48:54.823700006 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823836 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 11:48:54.823876 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 11:48:54.823905 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1205 11:48:54.826863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:10Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.126043 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafa237bfeb83d5895f8ad73cf881e1fd946ec837402ea271d4aa1b75469da0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:10Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.138472 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:10Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.151228 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38a2b47fe97303fc60f6ed0de5927b73993234170c4fdb02b45f0762978e292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:10Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.169740 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b42a5472-7487-4146-87a1-b83999821399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5753659f8a4501c4eb8aee2d090eae61f2d0f6d9d03baec4f2359d2fecbce39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://029931f3223b6e6bdb93594f23ab9d28df4dcbf5cfa4f28e53a8636d05db0053\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T11:49:05Z\\\",\\\"message\\\":\\\"all/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 11:49:05.268109 6052 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 11:49:05.268124 6052 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 11:49:05.268135 6052 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 11:49:05.268140 6052 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 11:49:05.268157 6052 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 11:49:05.268167 6052 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 11:49:05.268176 6052 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 11:49:05.268199 6052 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 11:49:05.268208 6052 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 11:49:05.268186 6052 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 11:49:05.268217 6052 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 11:49:05.268219 6052 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 11:49:05.268233 6052 factory.go:656] Stopping watch factory\\\\nI1205 11:49:05.268247 6052 ovnkube.go:599] Stopped ovnkube\\\\nI1205 11:49:05.268256 6052 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5753659f8a4501c4eb8aee2d090eae61f2d0f6d9d03baec4f2359d2fecbce39\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T11:49:08Z\\\",\\\"message\\\":\\\" 6186 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1205 11:49:06.861136 6186 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 11:49:06.861199 
6186 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 11:49:06.861223 6186 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 11:49:06.861239 6186 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 11:49:06.861255 6186 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 11:49:06.861286 6186 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 11:49:06.861292 6186 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 11:49:06.861309 6186 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 11:49:06.861327 6186 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 11:49:06.861339 6186 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 11:49:06.861363 6186 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 11:49:06.861385 6186 factory.go:656] Stopping watch factory\\\\nI1205 11:49:06.861409 6186 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 11:49:06.861432 6186 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 11:49:06.861458 6186 ovnkube.go:599] Stopped ovnkube\\\\nI1205 11:49:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbr2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:10Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.184175 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b31fb483-2a86-4e31-a611-ef92a9a4840f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452ef64735a1911b994bd2bfbf1702e3ff68648cd0b834c254dd24f0b17cd9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0370cc7bfd0ce95636924667b4e82dfffd0eee7f07b78d683b386bfee5dc47b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1951ae154cb3c61f4cf3cadec35e19591efe1f768993d1e51713195175547b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a3a0d8531f8696597e2389d8cd7b50130de93ac8be572b329ebe2eff8c6b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:10Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.194566 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kwkp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"737ae453-c22e-41ea-a10e-7e8f1f165467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q79tg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kwkp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:10Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.205120 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptcsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d71e854a-fd6a-4efa-9cf4-a5dc75a1901f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40808a5c1a367c907ecf165483a1cc5fc59027c61342399abb612be6fed84fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g4jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d865b70b752e983282ac08c190f254185cbf7bba8d13c0238261fd7b880ac651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g4jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:49:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ptcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:10Z is after 2025-08-24T17:21:41Z" Dec 05 
11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.213057 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.213087 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.213097 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.213112 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.213123 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:10Z","lastTransitionTime":"2025-12-05T11:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.232560 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78167d26-8492-4035-99de-987c99ab12e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b874a765fa9a075a2662b615419d246c610bf2eea128d4bf53c16bda49f38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd875ae389a8e48a2f958f87ef711a4053a54da61f415ef9a380f8b3de9d7012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9009
2272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12b24f5f9e73a1f4871d8d666bf6aabef79ec707d3df055e3e33a1a158dce464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25460658bdce8262dc2a0269161e8377dcb37f440c09c8eea9f7b397c9d3a5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8aab7caa6c02035af6faae1817d239b12bc2afcb11c7e14c4aea1df60372dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:10Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.235660 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-x45qv"] Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.236362 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:49:10 crc kubenswrapper[4763]: E1205 11:49:10.236443 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.245305 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd4640d76c7ecaa13523f23448f53bc51afe8279734a6bc254b403cf436fe0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7619d39a9f2a46d92fc3ec5ab36f60ec2af3a789b10decb4cbe1cdaef58d28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:10Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.257433 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96338136-6831-49d0-9eb9-77d1205c6afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b662bafa4fee3b2c7cd96bc49d6054b0fbe349e86ce1d4faf967edbaf8f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb8a3a71da68a51eabe1aaa24fffd1b84c35cb01945642622f4100125ee26223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" 
for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpgln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:10Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.269250 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gt7x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2515d84f-5782-48a0-9d7e-9704baebe26c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60a4a422ae6f3c2c05e5dc721674edbb0efda81d198ede925d3dd1cf99e09b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxkb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gt7x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:10Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.270541 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hszt\" (UniqueName: \"kubernetes.io/projected/a135c32b-38e4-43f6-bbb1-d1b8e42156ab-kube-api-access-9hszt\") pod \"network-metrics-daemon-x45qv\" (UID: \"a135c32b-38e4-43f6-bbb1-d1b8e42156ab\") " pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.270634 4763 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a135c32b-38e4-43f6-bbb1-d1b8e42156ab-metrics-certs\") pod \"network-metrics-daemon-x45qv\" (UID: \"a135c32b-38e4-43f6-bbb1-d1b8e42156ab\") " pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.290023 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c630804aeb54582259498705c520ab6d2c191844861069ba7090226028a5716f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://125bf0986a0c01049
6180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-92qpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:10Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.301515 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gt7x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2515d84f-5782-48a0-9d7e-9704baebe26c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60a4a422ae6f3c2c05e5dc721674edbb0efda81d198ede925d3dd1cf99e09b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxkb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gt7x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:10Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.315151 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.315177 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.315185 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.315199 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:10 crc 
kubenswrapper[4763]: I1205 11:49:10.315210 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:10Z","lastTransitionTime":"2025-12-05T11:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.317137 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c630804aeb54582259498705c520ab6d2c191844861069ba7090226028a5716f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\
\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-92qpt\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:10Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.338911 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78167d26-8492-4035-99de-987c99ab12e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b874a765fa9a075a2662b615419d246c610bf2eea128d4bf53c16bda49f38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd875ae389a8e48a2f958f87ef711a4053a54da61f415ef9a380f8b3de9d7012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12b24f5f9e73a1f4871d8d666bf6aabef79ec707d3df055e3e33a1a158dce464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702
f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25460658bdce8262dc2a0269161e8377dcb37f440c09c8eea9f7b397c9d3a5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8aab7caa6c02035af6faae1817d239b12bc2afcb11c7e14c4aea1df60372dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:10Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.353305 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd4640d76c7ecaa13523f23448f53bc51afe8279734a6bc254b403cf436fe0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7619d39a9f2a46d92fc3ec5ab36f60ec2af3a789b10decb4cbe1cdaef58d28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:10Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.371850 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a135c32b-38e4-43f6-bbb1-d1b8e42156ab-metrics-certs\") pod \"network-metrics-daemon-x45qv\" (UID: \"a135c32b-38e4-43f6-bbb1-d1b8e42156ab\") " pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.371903 4763 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hszt\" (UniqueName: \"kubernetes.io/projected/a135c32b-38e4-43f6-bbb1-d1b8e42156ab-kube-api-access-9hszt\") pod \"network-metrics-daemon-x45qv\" (UID: \"a135c32b-38e4-43f6-bbb1-d1b8e42156ab\") " pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:49:10 crc kubenswrapper[4763]: E1205 11:49:10.372361 4763 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 11:49:10 crc kubenswrapper[4763]: E1205 11:49:10.372417 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a135c32b-38e4-43f6-bbb1-d1b8e42156ab-metrics-certs podName:a135c32b-38e4-43f6-bbb1-d1b8e42156ab nodeName:}" failed. No retries permitted until 2025-12-05 11:49:10.872401997 +0000 UTC m=+35.365116730 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a135c32b-38e4-43f6-bbb1-d1b8e42156ab-metrics-certs") pod "network-metrics-daemon-x45qv" (UID: "a135c32b-38e4-43f6-bbb1-d1b8e42156ab") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.382391 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96338136-6831-49d0-9eb9-77d1205c6afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b662bafa4fee3b2c7cd96bc49d6054b0fbe349e86ce1d4faf967edbaf8f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb8a3a71da68a51eabe1aaa24fffd1b84c35cb01945642622f4100125ee26223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e
911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpgln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:10Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.399706 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hszt\" (UniqueName: \"kubernetes.io/projected/a135c32b-38e4-43f6-bbb1-d1b8e42156ab-kube-api-access-9hszt\") pod \"network-metrics-daemon-x45qv\" (UID: \"a135c32b-38e4-43f6-bbb1-d1b8e42156ab\") " pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.418001 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.418032 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.418040 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.418054 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.418062 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:10Z","lastTransitionTime":"2025-12-05T11:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.419135 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-x45qv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a135c32b-38e4-43f6-bbb1-d1b8e42156ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hszt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hszt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:49:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-x45qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:10Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.437504 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:10Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.456603 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:10Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.467573 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gxbp8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e177b5c0a61aeedb5c6bf5f6340f212506106592ad10807ed29d83a4a6949ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-9s79r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gxbp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:10Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.481504 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:10Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.494166 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38a2b47fe97303fc60f6ed0de5927b73993234170c4fdb02b45f0762978e292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:10Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.514954 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b42a5472-7487-4146-87a1-b83999821399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5753659f8a4501c4eb8aee2d090eae61f2d0f6d9d03baec4f2359d2fecbce39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://029931f3223b6e6bdb93594f23ab9d28df4dcbf5cfa4f28e53a8636d05db0053\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T11:49:05Z\\\",\\\"message\\\":\\\"all/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 11:49:05.268109 6052 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 11:49:05.268124 6052 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 11:49:05.268135 6052 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 11:49:05.268140 6052 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 11:49:05.268157 6052 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 11:49:05.268167 6052 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 11:49:05.268176 6052 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 11:49:05.268199 6052 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 11:49:05.268208 6052 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 11:49:05.268186 6052 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 11:49:05.268217 6052 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 11:49:05.268219 6052 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 11:49:05.268233 6052 factory.go:656] Stopping watch factory\\\\nI1205 11:49:05.268247 6052 ovnkube.go:599] Stopped ovnkube\\\\nI1205 11:49:05.268256 6052 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5753659f8a4501c4eb8aee2d090eae61f2d0f6d9d03baec4f2359d2fecbce39\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T11:49:08Z\\\",\\\"message\\\":\\\" 6186 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1205 11:49:06.861136 6186 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 11:49:06.861199 
6186 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 11:49:06.861223 6186 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 11:49:06.861239 6186 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 11:49:06.861255 6186 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 11:49:06.861286 6186 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 11:49:06.861292 6186 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 11:49:06.861309 6186 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 11:49:06.861327 6186 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 11:49:06.861339 6186 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 11:49:06.861363 6186 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 11:49:06.861385 6186 factory.go:656] Stopping watch factory\\\\nI1205 11:49:06.861409 6186 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 11:49:06.861432 6186 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 11:49:06.861458 6186 ovnkube.go:599] Stopped ovnkube\\\\nI1205 11:49:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbr2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:10Z is after 2025-08-24T17:21:41Z"
Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.520381 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.520409 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.520417 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.520430 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.520439 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:10Z","lastTransitionTime":"2025-12-05T11:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.529118 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T11:48:54Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 11:48:54.822889 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 11:48:54.823032 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 11:48:54.823055 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 11:48:54.823033 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 11:48:54.823371 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764935318\\\\\\\\\\\\\\\" (2025-12-05 11:48:38 +0000 UTC to 2026-01-04 11:48:39 +0000 UTC (now=2025-12-05 11:48:54.823293564 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823498 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\"\\\\nI1205 11:48:54.823746 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764935329\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764935329\\\\\\\\\\\\\\\" (2025-12-05 10:48:49 +0000 UTC to 2026-12-05 10:48:49 +0000 UTC (now=2025-12-05 11:48:54.823700006 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823836 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 11:48:54.823876 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 11:48:54.823905 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1205 11:48:54.826863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:10Z is after 2025-08-24T17:21:41Z"
Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.540521 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafa237bfeb83d5895f8ad73cf881e1fd946ec837402ea271d4aa1b75469da0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:10Z is after 2025-08-24T17:21:41Z"
Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.550883 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptcsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d71e854a-fd6a-4efa-9cf4-a5dc75a1901f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40808a5c1a367c907ecf165483a1cc5fc59027c61342399abb612be6fed84fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g4jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d865b70b752e983282ac08c190f254185cbf7bba8d13c0238261fd7b880ac651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g4jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:49:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ptcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:10Z is after 2025-08-24T17:21:41Z"
Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.564542 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b31fb483-2a86-4e31-a611-ef92a9a4840f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452ef64735a1911b994bd2bfbf1702e3ff68648cd0b834c254dd24f0b17cd9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0370cc7bfd0ce95636924667b4e82dfffd0eee7f07b78d683b386bfee5dc47b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1951ae154cb3c61f4cf3cadec35e19591efe1f768993d1e51713195175547b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a3a0d8531f8696597e2389d8cd7b50130de93ac8be572b329ebe2eff8c6b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:10Z is after 2025-08-24T17:21:41Z"
Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.582203 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kwkp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"737ae453-c22e-41ea-a10e-7e8f1f165467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q79tg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kwkp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:10Z is after 2025-08-24T17:21:41Z"
Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.622343 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.622384 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.622396 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.622411 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.622420 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:10Z","lastTransitionTime":"2025-12-05T11:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.724564 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.724611 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.724628 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.724644 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.724655 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:10Z","lastTransitionTime":"2025-12-05T11:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.826827 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.826885 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.826901 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.826919 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.826932 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:10Z","lastTransitionTime":"2025-12-05T11:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.876317 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a135c32b-38e4-43f6-bbb1-d1b8e42156ab-metrics-certs\") pod \"network-metrics-daemon-x45qv\" (UID: \"a135c32b-38e4-43f6-bbb1-d1b8e42156ab\") " pod="openshift-multus/network-metrics-daemon-x45qv"
Dec 05 11:49:10 crc kubenswrapper[4763]: E1205 11:49:10.876515 4763 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 05 11:49:10 crc kubenswrapper[4763]: E1205 11:49:10.876597 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a135c32b-38e4-43f6-bbb1-d1b8e42156ab-metrics-certs podName:a135c32b-38e4-43f6-bbb1-d1b8e42156ab nodeName:}" failed. No retries permitted until 2025-12-05 11:49:11.876578444 +0000 UTC m=+36.369293177 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a135c32b-38e4-43f6-bbb1-d1b8e42156ab-metrics-certs") pod "network-metrics-daemon-x45qv" (UID: "a135c32b-38e4-43f6-bbb1-d1b8e42156ab") : object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.931047 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.931635 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.931647 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.931673 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:10 crc kubenswrapper[4763]: I1205 11:49:10.931687 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:10Z","lastTransitionTime":"2025-12-05T11:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.034814 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.034850 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.034859 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.034873 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.034884 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:11Z","lastTransitionTime":"2025-12-05T11:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.137084 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.137132 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.137148 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.137166 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.137180 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:11Z","lastTransitionTime":"2025-12-05T11:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.239823 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.239869 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.239880 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.239899 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.239911 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:11Z","lastTransitionTime":"2025-12-05T11:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.341802 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.341868 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.341885 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.341910 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.341925 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:11Z","lastTransitionTime":"2025-12-05T11:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.444648 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.444741 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.444814 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.444868 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.444895 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:11Z","lastTransitionTime":"2025-12-05T11:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.548066 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.548140 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.548158 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.548185 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.548205 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:11Z","lastTransitionTime":"2025-12-05T11:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.584055 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 11:49:11 crc kubenswrapper[4763]: E1205 11:49:11.584343 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:49:27.584268944 +0000 UTC m=+52.076983747 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.584414 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.584519 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 11:49:11 crc kubenswrapper[4763]: E1205 11:49:11.584655 4763 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 05 11:49:11 crc kubenswrapper[4763]: E1205 11:49:11.584707 4763 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Dec 05 11:49:11 crc kubenswrapper[4763]: E1205 11:49:11.584750 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 11:49:27.584723617 +0000 UTC m=+52.077438380 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 05 11:49:11 crc kubenswrapper[4763]: E1205 11:49:11.584815 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 11:49:27.584795817 +0000 UTC m=+52.077510540 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.650792 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.650832 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.650842 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.650858 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.650868 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:11Z","lastTransitionTime":"2025-12-05T11:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.685846 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.685942 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 11:49:11 crc kubenswrapper[4763]: E1205 11:49:11.686085 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 05 11:49:11 crc kubenswrapper[4763]: E1205 11:49:11.686102 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 05 11:49:11 crc kubenswrapper[4763]: E1205 11:49:11.686114 4763 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 11:49:11 crc kubenswrapper[4763]: E1205 11:49:11.686114 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 05 11:49:11 crc kubenswrapper[4763]: E1205 11:49:11.686167 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 11:49:27.686151132 +0000 UTC m=+52.178865855 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 11:49:11 crc kubenswrapper[4763]: E1205 11:49:11.686167 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 05 11:49:11 crc kubenswrapper[4763]: E1205 11:49:11.686201 4763 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 11:49:11 crc kubenswrapper[4763]: E1205 11:49:11.686293 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 11:49:27.686262422 +0000 UTC m=+52.178977225 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.752680 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.752734 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.752752 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.752810 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.752833 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:11Z","lastTransitionTime":"2025-12-05T11:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.783675 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.783820 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.783887 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.783813 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 11:49:11 crc kubenswrapper[4763]: E1205 11:49:11.783987 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab"
Dec 05 11:49:11 crc kubenswrapper[4763]: E1205 11:49:11.784135 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 11:49:11 crc kubenswrapper[4763]: E1205 11:49:11.784226 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 11:49:11 crc kubenswrapper[4763]: E1205 11:49:11.784278 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.856158 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.856206 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.856222 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.856246 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.856265 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:11Z","lastTransitionTime":"2025-12-05T11:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.888246 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a135c32b-38e4-43f6-bbb1-d1b8e42156ab-metrics-certs\") pod \"network-metrics-daemon-x45qv\" (UID: \"a135c32b-38e4-43f6-bbb1-d1b8e42156ab\") " pod="openshift-multus/network-metrics-daemon-x45qv"
Dec 05 11:49:11 crc kubenswrapper[4763]: E1205 11:49:11.888409 4763 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 05 11:49:11 crc kubenswrapper[4763]: E1205 11:49:11.888474 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a135c32b-38e4-43f6-bbb1-d1b8e42156ab-metrics-certs podName:a135c32b-38e4-43f6-bbb1-d1b8e42156ab nodeName:}" failed. No retries permitted until 2025-12-05 11:49:13.888455208 +0000 UTC m=+38.381169941 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a135c32b-38e4-43f6-bbb1-d1b8e42156ab-metrics-certs") pod "network-metrics-daemon-x45qv" (UID: "a135c32b-38e4-43f6-bbb1-d1b8e42156ab") : object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.958835 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.958918 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.958931 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.958951 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:11 crc kubenswrapper[4763]: I1205 11:49:11.958963 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:11Z","lastTransitionTime":"2025-12-05T11:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.002038 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.024469 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b31fb483-2a86-4e31-a611-ef92a9a4840f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452ef64735a1911b994bd2bfbf1702e3ff68648cd0b834c254dd24f0b17cd9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0370cc7bfd0ce95636924667b4e82dfffd0eee7f07b78d683b386bfee5dc47b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1951ae154cb3c61f4cf3cadec35e19591efe1f768993d1e51713195175547b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a3a0d8531f8696597e2389d8cd7b50130de93ac8be572b329ebe2eff8c6b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:12Z is after 2025-08-24T17:21:41Z"
Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.046191 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kwkp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"737ae453-c22e-41ea-a10e-7e8f1f165467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q79tg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kwkp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:12Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.057993 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptcsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d71e854a-fd6a-4efa-9cf4-a5dc75a1901f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40808a5c1a367c907ecf165483a1cc5fc59027c61342399abb612be6fed84fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g4jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d865b70b752e983282ac08c190f254185cbf7bba8d13c0238261fd7b880ac651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g4jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:49:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ptcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:12Z is after 2025-08-24T17:21:41Z" Dec 05 
11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.061669 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.061716 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.061733 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.061753 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.061803 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:12Z","lastTransitionTime":"2025-12-05T11:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.084336 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78167d26-8492-4035-99de-987c99ab12e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b874a765fa9a075a2662b615419d246c610bf2eea128d4bf53c16bda49f38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd875ae389a8e48a2f958f87ef711a4053a54da61f415ef9a380f8b3de9d7012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9009
2272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12b24f5f9e73a1f4871d8d666bf6aabef79ec707d3df055e3e33a1a158dce464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25460658bdce8262dc2a0269161e8377dcb37f440c09c8eea9f7b397c9d3a5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8aab7caa6c02035af6faae1817d239b12bc2afcb11c7e14c4aea1df60372dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:12Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.098737 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd4640d76c7ecaa13523f23448f53bc51afe8279734a6bc254b403cf436fe0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7619d39a9f2a46d92fc3ec5ab36f60ec2af3a789b10decb4cbe1cdaef58d28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:12Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.109228 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96338136-6831-49d0-9eb9-77d1205c6afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b662bafa4fee3b2c7cd96bc49d6054b0fbe349e86ce1d4faf967edbaf8f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb8a3a71da68a51eabe1aaa24fffd1b84c35cb01945642622f4100125ee26223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpgln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:12Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.119591 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-gt7x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2515d84f-5782-48a0-9d7e-9704baebe26c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60a4a422ae6f3c2c05e5dc721674edbb0efda81d198ede925d3dd1cf99e09b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxkb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gt7x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:12Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.131751 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c630804aeb54582259498705c520ab6d2c191844861069ba7090226028a5716f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-92qpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:12Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.144529 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:12Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.156845 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:12Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.163828 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.163864 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.163875 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.163893 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.163909 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:12Z","lastTransitionTime":"2025-12-05T11:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.169075 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gxbp8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e177b5c0a61aeedb5c6bf5f6340f212506106592ad10807ed29d83a4a6949ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s79r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gxbp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:12Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.180147 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-x45qv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a135c32b-38e4-43f6-bbb1-d1b8e42156ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hszt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hszt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:49:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-x45qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:12Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.195838 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T11:48:54Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 11:48:54.822889 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 11:48:54.823032 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 11:48:54.823055 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 11:48:54.823033 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 11:48:54.823371 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764935318\\\\\\\\\\\\\\\" (2025-12-05 11:48:38 +0000 UTC to 2026-01-04 11:48:39 +0000 UTC (now=2025-12-05 11:48:54.823293564 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823498 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\"\\\\nI1205 11:48:54.823746 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764935329\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764935329\\\\\\\\\\\\\\\" (2025-12-05 10:48:49 +0000 UTC to 2026-12-05 10:48:49 +0000 UTC (now=2025-12-05 11:48:54.823700006 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823836 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 11:48:54.823876 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 11:48:54.823905 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1205 11:48:54.826863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:12Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.210296 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafa237bfeb83d5895f8ad73cf881e1fd946ec837402ea271d4aa1b75469da0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:12Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.223100 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:12Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.235272 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38a2b47fe97303fc60f6ed0de5927b73993234170c4fdb02b45f0762978e292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:12Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.254043 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b42a5472-7487-4146-87a1-b83999821399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5753659f8a4501c4eb8aee2d090eae61f2d0f6d9d03baec4f2359d2fecbce39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://029931f3223b6e6bdb93594f23ab9d28df4dcbf5cfa4f28e53a8636d05db0053\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T11:49:05Z\\\",\\\"message\\\":\\\"all/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 11:49:05.268109 6052 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 11:49:05.268124 6052 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 11:49:05.268135 6052 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 11:49:05.268140 6052 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 11:49:05.268157 6052 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 11:49:05.268167 6052 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 11:49:05.268176 6052 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 11:49:05.268199 6052 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 11:49:05.268208 6052 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 11:49:05.268186 6052 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 11:49:05.268217 6052 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 11:49:05.268219 6052 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 11:49:05.268233 6052 factory.go:656] Stopping watch factory\\\\nI1205 11:49:05.268247 6052 ovnkube.go:599] Stopped ovnkube\\\\nI1205 11:49:05.268256 6052 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5753659f8a4501c4eb8aee2d090eae61f2d0f6d9d03baec4f2359d2fecbce39\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T11:49:08Z\\\",\\\"message\\\":\\\" 6186 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1205 11:49:06.861136 6186 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 11:49:06.861199 
6186 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 11:49:06.861223 6186 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 11:49:06.861239 6186 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 11:49:06.861255 6186 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 11:49:06.861286 6186 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 11:49:06.861292 6186 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 11:49:06.861309 6186 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 11:49:06.861327 6186 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 11:49:06.861339 6186 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 11:49:06.861363 6186 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 11:49:06.861385 6186 factory.go:656] Stopping watch factory\\\\nI1205 11:49:06.861409 6186 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 11:49:06.861432 6186 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 11:49:06.861458 6186 ovnkube.go:599] Stopped ovnkube\\\\nI1205 11:49:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbr2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:12Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.267480 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.267520 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.267531 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.267547 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.267560 4763 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:12Z","lastTransitionTime":"2025-12-05T11:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.370637 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.370696 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.370713 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.370739 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.370800 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:12Z","lastTransitionTime":"2025-12-05T11:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.472843 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.472880 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.472889 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.472904 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.472915 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:12Z","lastTransitionTime":"2025-12-05T11:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.575922 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.575969 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.575978 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.575993 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.576005 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:12Z","lastTransitionTime":"2025-12-05T11:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.679234 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.679280 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.679290 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.679309 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.679321 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:12Z","lastTransitionTime":"2025-12-05T11:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.781915 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.781997 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.782021 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.782097 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.782123 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:12Z","lastTransitionTime":"2025-12-05T11:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.885663 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.885741 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.885801 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.885831 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.885850 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:12Z","lastTransitionTime":"2025-12-05T11:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.988927 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.988983 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.988997 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.989021 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:12 crc kubenswrapper[4763]: I1205 11:49:12.989034 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:12Z","lastTransitionTime":"2025-12-05T11:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.091936 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.091992 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.092009 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.092033 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.092050 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:13Z","lastTransitionTime":"2025-12-05T11:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.195658 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.195722 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.195734 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.195757 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.195806 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:13Z","lastTransitionTime":"2025-12-05T11:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.298471 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.298530 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.298549 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.298574 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.298592 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:13Z","lastTransitionTime":"2025-12-05T11:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.401116 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.401155 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.401171 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.401191 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.401204 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:13Z","lastTransitionTime":"2025-12-05T11:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.505698 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.505785 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.505799 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.505820 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.505836 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:13Z","lastTransitionTime":"2025-12-05T11:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.609203 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.609256 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.609283 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.609306 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.609324 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:13Z","lastTransitionTime":"2025-12-05T11:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.713027 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.713102 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.713125 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.713305 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.713327 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:13Z","lastTransitionTime":"2025-12-05T11:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.783329 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.783329 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:49:13 crc kubenswrapper[4763]: E1205 11:49:13.783555 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.783597 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:49:13 crc kubenswrapper[4763]: E1205 11:49:13.783716 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.783356 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:49:13 crc kubenswrapper[4763]: E1205 11:49:13.783886 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:49:13 crc kubenswrapper[4763]: E1205 11:49:13.784106 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.816386 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.816459 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.816479 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.816509 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.816528 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:13Z","lastTransitionTime":"2025-12-05T11:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.915339 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a135c32b-38e4-43f6-bbb1-d1b8e42156ab-metrics-certs\") pod \"network-metrics-daemon-x45qv\" (UID: \"a135c32b-38e4-43f6-bbb1-d1b8e42156ab\") " pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:49:13 crc kubenswrapper[4763]: E1205 11:49:13.915559 4763 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 11:49:13 crc kubenswrapper[4763]: E1205 11:49:13.915631 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a135c32b-38e4-43f6-bbb1-d1b8e42156ab-metrics-certs podName:a135c32b-38e4-43f6-bbb1-d1b8e42156ab nodeName:}" failed. No retries permitted until 2025-12-05 11:49:17.915608147 +0000 UTC m=+42.408322900 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a135c32b-38e4-43f6-bbb1-d1b8e42156ab-metrics-certs") pod "network-metrics-daemon-x45qv" (UID: "a135c32b-38e4-43f6-bbb1-d1b8e42156ab") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.919646 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.920369 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.920412 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.920456 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:13 crc kubenswrapper[4763]: I1205 11:49:13.920466 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:13Z","lastTransitionTime":"2025-12-05T11:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.024018 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.024073 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.024082 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.024099 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.024108 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:14Z","lastTransitionTime":"2025-12-05T11:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.126681 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.126749 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.126817 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.126847 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.126869 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:14Z","lastTransitionTime":"2025-12-05T11:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.229629 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.229698 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.229708 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.229731 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.229746 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:14Z","lastTransitionTime":"2025-12-05T11:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.332923 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.332999 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.333019 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.333045 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.333059 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:14Z","lastTransitionTime":"2025-12-05T11:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.436253 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.436354 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.436375 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.436902 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.437124 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:14Z","lastTransitionTime":"2025-12-05T11:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.540624 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.540690 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.540700 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.540717 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.540727 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:14Z","lastTransitionTime":"2025-12-05T11:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.565180 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.566834 4763 scope.go:117] "RemoveContainer" containerID="c5753659f8a4501c4eb8aee2d090eae61f2d0f6d9d03baec4f2359d2fecbce39" Dec 05 11:49:14 crc kubenswrapper[4763]: E1205 11:49:14.567133 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-xbr2p_openshift-ovn-kubernetes(b42a5472-7487-4146-87a1-b83999821399)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" podUID="b42a5472-7487-4146-87a1-b83999821399" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.594836 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78167d26-8492-4035-99de-987c99ab12e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b874a765fa9a075a2662b615419d246c610bf2eea128d4bf53c16bda49f38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd875ae389a8e48a2f958f87ef711a4053a54da61f415ef9a380f8b3de9d7012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":
\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12b24f5f9e73a1f4871d8d666bf6aabef79ec707d3df055e3e33a1a158dce464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25460658bdce8262dc2a0269161e8377dcb37f440c09c8eea9f7b397c9d3a5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8aab7caa6c02035af6faae1817d239b12bc2afcb11c7e14c4aea1df60372dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt
\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:14Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.612147 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd4640d76c7ecaa13523f23448f53bc51afe8279734a6bc254b403cf436fe0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7619d39a9f2a46d92fc3ec5ab36f60ec2af3a789b10decb4cbe1cdaef58d28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:14Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.626368 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96338136-6831-49d0-9eb9-77d1205c6afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b662bafa4fee3b2c7cd96bc49d6054b0fbe349e86ce1d4faf967edbaf8f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb8a3a71da68a51eabe1aaa24fffd1b84c35cb01945642622f4100125ee26223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpgln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:14Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.643694 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-gt7x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2515d84f-5782-48a0-9d7e-9704baebe26c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60a4a422ae6f3c2c05e5dc721674edbb0efda81d198ede925d3dd1cf99e09b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxkb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gt7x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:14Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.643729 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.643790 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.643801 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.643819 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.644014 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:14Z","lastTransitionTime":"2025-12-05T11:49:14Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.662348 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c630804aeb54582259498705c520ab6d2c191844861069ba7090226028a5716f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\
"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-92qpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:14Z is after 
2025-08-24T17:21:41Z" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.679432 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:14Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.699745 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:14Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.715250 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gxbp8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e177b5c0a61aeedb5c6bf5f6340f212506106592ad10807ed29d83a4a6949ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-9s79r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gxbp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:14Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.728520 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-x45qv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a135c32b-38e4-43f6-bbb1-d1b8e42156ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hszt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hszt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:49:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-x45qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:14Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.742672 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T11:48:54Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 11:48:54.822889 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 11:48:54.823032 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 11:48:54.823055 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 11:48:54.823033 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 11:48:54.823371 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764935318\\\\\\\\\\\\\\\" (2025-12-05 11:48:38 +0000 UTC to 2026-01-04 11:48:39 +0000 UTC (now=2025-12-05 11:48:54.823293564 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823498 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\"\\\\nI1205 11:48:54.823746 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764935329\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764935329\\\\\\\\\\\\\\\" (2025-12-05 10:48:49 +0000 UTC to 2026-12-05 10:48:49 +0000 UTC (now=2025-12-05 11:48:54.823700006 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823836 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 11:48:54.823876 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 11:48:54.823905 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1205 11:48:54.826863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:14Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.750650 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.750688 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.750704 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.750726 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.750743 4763 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:14Z","lastTransitionTime":"2025-12-05T11:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.759197 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafa237bfeb83d5895f8ad73cf881e1fd946ec837402ea271d4aa1b75469da0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:14Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.771987 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:14Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.784536 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38a2b47fe97303fc60f6ed0de5927b73993234170c4fdb02b45f0762978e292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:14Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.801666 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b42a5472-7487-4146-87a1-b83999821399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5753659f8a4501c4eb8aee2d090eae61f2d0f6d
9d03baec4f2359d2fecbce39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5753659f8a4501c4eb8aee2d090eae61f2d0f6d9d03baec4f2359d2fecbce39\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T11:49:08Z\\\",\\\"message\\\":\\\" 6186 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1205 11:49:06.861136 6186 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 11:49:06.861199 6186 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 11:49:06.861223 6186 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 11:49:06.861239 6186 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 11:49:06.861255 6186 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 11:49:06.861286 6186 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 11:49:06.861292 6186 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 11:49:06.861309 6186 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 11:49:06.861327 6186 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 11:49:06.861339 6186 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 11:49:06.861363 6186 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 11:49:06.861385 6186 factory.go:656] Stopping watch factory\\\\nI1205 11:49:06.861409 6186 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 11:49:06.861432 6186 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 11:49:06.861458 6186 ovnkube.go:599] Stopped ovnkube\\\\nI1205 11:49:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xbr2p_openshift-ovn-kubernetes(b42a5472-7487-4146-87a1-b83999821399)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbr2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:14Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.813752 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b31fb483-2a86-4e31-a611-ef92a9a4840f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452ef64735a1911b994bd2bfbf1702e3ff68648cd0b834c254dd24f0b17cd9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0370cc7bfd0ce95636924667b4e82dfffd0eee7f07b78d683b386bfee5dc47b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1951ae154cb3c61f4cf3cadec35e19591efe1f768993d1e51713195175547b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a3a0d8531f8696597e2389d8cd7b50130de93ac8be572b329ebe2eff8c6b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:14Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.827381 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kwkp4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"737ae453-c22e-41ea-a10e-7e8f1f165467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q79tg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kwkp4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:14Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.841696 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptcsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d71e854a-fd6a-4efa-9cf4-a5dc75a1901f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40808a5c1a367c907ecf165483a1cc5fc59027c61342399abb612be6fed84fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g4jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d865b70b752e983282ac08c190f254185cbf7bba8d13c0238261fd7b880ac651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g4jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:49:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ptcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:14Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.853480 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.853517 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.853527 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.853544 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.853559 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:14Z","lastTransitionTime":"2025-12-05T11:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.956728 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.956832 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.956859 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.956886 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:14 crc kubenswrapper[4763]: I1205 11:49:14.956910 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:14Z","lastTransitionTime":"2025-12-05T11:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.059170 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.059244 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.059268 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.059297 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.059318 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:15Z","lastTransitionTime":"2025-12-05T11:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.162268 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.162308 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.162318 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.162345 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.162355 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:15Z","lastTransitionTime":"2025-12-05T11:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.265378 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.265521 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.265556 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.265688 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.265826 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:15Z","lastTransitionTime":"2025-12-05T11:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.369290 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.369350 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.369367 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.369392 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.369413 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:15Z","lastTransitionTime":"2025-12-05T11:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.473291 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.473365 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.473386 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.473413 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.473433 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:15Z","lastTransitionTime":"2025-12-05T11:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.576327 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.576409 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.576429 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.576453 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.576470 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:15Z","lastTransitionTime":"2025-12-05T11:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.678699 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.678782 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.678795 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.678818 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.678832 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:15Z","lastTransitionTime":"2025-12-05T11:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.780872 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.780946 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.780960 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.780987 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.781002 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:15Z","lastTransitionTime":"2025-12-05T11:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.783252 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.783297 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.783303 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.783452 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:49:15 crc kubenswrapper[4763]: E1205 11:49:15.783441 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:49:15 crc kubenswrapper[4763]: E1205 11:49:15.783535 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:49:15 crc kubenswrapper[4763]: E1205 11:49:15.783587 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:49:15 crc kubenswrapper[4763]: E1205 11:49:15.783695 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.804894 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafa237bfeb83d5895f8ad73cf881e1fd946ec837402ea271d4aa1b75469da0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:15Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.818649 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:15Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.836618 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38a2b47fe97303fc60f6ed0de5927b73993234170c4fdb02b45f0762978e292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:15Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.866821 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b42a5472-7487-4146-87a1-b83999821399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5753659f8a4501c4eb8aee2d090eae61f2d0f6d9d03baec4f2359d2fecbce39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5753659f8a4501c4eb8aee2d090eae61f2d0f6d9d03baec4f2359d2fecbce39\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T11:49:08Z\\\",\\\"message\\\":\\\" 6186 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1205 11:49:06.861136 6186 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 11:49:06.861199 6186 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 11:49:06.861223 6186 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 11:49:06.861239 6186 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 11:49:06.861255 6186 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 11:49:06.861286 6186 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 11:49:06.861292 6186 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 11:49:06.861309 6186 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 11:49:06.861327 6186 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 11:49:06.861339 6186 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 11:49:06.861363 6186 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 11:49:06.861385 6186 factory.go:656] Stopping watch factory\\\\nI1205 11:49:06.861409 6186 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 11:49:06.861432 6186 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 11:49:06.861458 6186 ovnkube.go:599] Stopped ovnkube\\\\nI1205 11:49:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xbr2p_openshift-ovn-kubernetes(b42a5472-7487-4146-87a1-b83999821399)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbr2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:15Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.883639 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.883702 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.883719 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.883744 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.883796 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:15Z","lastTransitionTime":"2025-12-05T11:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.888850 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T11:48:54Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 11:48:54.822889 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 11:48:54.823032 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 11:48:54.823055 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 11:48:54.823033 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 11:48:54.823371 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764935318\\\\\\\\\\\\\\\" (2025-12-05 11:48:38 +0000 UTC to 2026-01-04 11:48:39 +0000 UTC (now=2025-12-05 11:48:54.823293564 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823498 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\"\\\\nI1205 11:48:54.823746 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764935329\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764935329\\\\\\\\\\\\\\\" (2025-12-05 10:48:49 +0000 UTC to 2026-12-05 10:48:49 +0000 UTC (now=2025-12-05 11:48:54.823700006 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823836 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 11:48:54.823876 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 11:48:54.823905 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1205 11:48:54.826863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:15Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.907717 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b31fb483-2a86-4e31-a611-ef92a9a4840f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452ef64735a1911b994bd2bfbf1702e3ff68648cd0b834c254dd24f0b17cd9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0370cc7bfd0ce95636924667b4e82dfffd0eee7f07b78d683b386bfee5dc47b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1951ae154cb3c61f4cf3cadec35e19591efe1f768993d1e51713195175547b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a3a0d8531f8696597e2389d8cd7b50130de93ac8be572b329ebe2eff8c6b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:15Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.928175 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kwkp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"737ae453-c22e-41ea-a10e-7e8f1f165467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q79tg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kwkp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:15Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.945385 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptcsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d71e854a-fd6a-4efa-9cf4-a5dc75a1901f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40808a5c1a367c907ecf165483a1cc5fc59027c61342399abb612be6fed84fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g4jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d865b70b752e983282ac08c190f254185cbf7bba8d13c0238261fd7b880ac651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g4jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:49:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ptcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:15Z is after 2025-08-24T17:21:41Z" Dec 05 
11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.961084 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd4640d76c7ecaa13523f23448f53bc51afe8279734a6bc254b403cf436fe0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7619d39a9f2a46d92fc3ec5ab36f60ec2af3a789b10decb4cbe1cdaef58d28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:15Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.976629 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96338136-6831-49d0-9eb9-77d1205c6afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b662bafa4fee3b2c7cd96bc49d6054b0fbe349e86ce1d4faf967edbaf8f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb8a3a71da68a51eabe1aaa24fffd1b84c35cb01945642622f4100125ee26223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpgln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:15Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.986107 4763 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.986171 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.986187 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.986212 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.986230 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:15Z","lastTransitionTime":"2025-12-05T11:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:15 crc kubenswrapper[4763]: I1205 11:49:15.988898 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gt7x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2515d84f-5782-48a0-9d7e-9704baebe26c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60a4a422ae6f3c2c05e5dc721674edbb0efda81d198ede925d3dd1cf99e09b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxkb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gt7x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:15Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.004470 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c630804aeb54582259498705c520ab6d2c191844861069ba7090226028a5716f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-92qpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:16Z is after 
2025-08-24T17:21:41Z" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.029004 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78167d26-8492-4035-99de-987c99ab12e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b874a765fa9a075a2662b615419d246c610bf2eea128d4bf53c16bda49f38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd875ae389a8e48a2f958f87ef711a4053a54da61f415ef9a380f8b3de9d7012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12b24f5f9e73a1f4871d8d666bf6aabef79ec707d3df055e3e33a1a158dce464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/
etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25460658bdce8262dc2a0269161e8377dcb37f440c09c8eea9f7b397c9d3a5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8aab7caa6c02035af6faae1817d239b12bc2afcb11c7e14c4aea1df60372dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025
-12-05T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:16Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.046713 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:16Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.058513 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gxbp8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e177b5c0a61aeedb5c6bf5f6340f212506106592ad10807ed29d83a4a6949ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s79r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gxbp8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:16Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.070320 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-x45qv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a135c32b-38e4-43f6-bbb1-d1b8e42156ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hszt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hszt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:49:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-x45qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:16Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:16 crc 
kubenswrapper[4763]: I1205 11:49:16.087841 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:16Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.088420 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.088449 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.088460 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.088479 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.088488 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:16Z","lastTransitionTime":"2025-12-05T11:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.190579 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.190621 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.190636 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.190656 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.190672 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:16Z","lastTransitionTime":"2025-12-05T11:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.293649 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.293703 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.293725 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.293755 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.293815 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:16Z","lastTransitionTime":"2025-12-05T11:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.396700 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.396750 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.396847 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.396876 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.396896 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:16Z","lastTransitionTime":"2025-12-05T11:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.500458 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.500523 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.500540 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.500562 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.500579 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:16Z","lastTransitionTime":"2025-12-05T11:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.602611 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.602655 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.602669 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.602687 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.602701 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:16Z","lastTransitionTime":"2025-12-05T11:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.706139 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.706193 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.706210 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.706234 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.706251 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:16Z","lastTransitionTime":"2025-12-05T11:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.808370 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.808438 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.808461 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.808490 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.808512 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:16Z","lastTransitionTime":"2025-12-05T11:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.911531 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.911581 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.911596 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.911615 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:16 crc kubenswrapper[4763]: I1205 11:49:16.911626 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:16Z","lastTransitionTime":"2025-12-05T11:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.015559 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.015648 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.015691 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.015725 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.015840 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:17Z","lastTransitionTime":"2025-12-05T11:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.119910 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.120001 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.120014 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.120044 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.120062 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:17Z","lastTransitionTime":"2025-12-05T11:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.223843 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.223943 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.223967 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.223999 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.224020 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:17Z","lastTransitionTime":"2025-12-05T11:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.327732 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.327791 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.327805 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.327821 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.327832 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:17Z","lastTransitionTime":"2025-12-05T11:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.434689 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.434791 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.434813 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.434841 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.434870 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:17Z","lastTransitionTime":"2025-12-05T11:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.538381 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.538436 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.538454 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.538519 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.538537 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:17Z","lastTransitionTime":"2025-12-05T11:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.641622 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.641684 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.641705 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.641732 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.641751 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:17Z","lastTransitionTime":"2025-12-05T11:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.745566 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.745900 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.746034 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.746147 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.746262 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:17Z","lastTransitionTime":"2025-12-05T11:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.783315 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:49:17 crc kubenswrapper[4763]: E1205 11:49:17.783456 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.783581 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.783630 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.783856 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:49:17 crc kubenswrapper[4763]: E1205 11:49:17.783853 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:49:17 crc kubenswrapper[4763]: E1205 11:49:17.783918 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:49:17 crc kubenswrapper[4763]: E1205 11:49:17.784583 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.791802 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.791880 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.791899 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.791923 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.791947 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:17Z","lastTransitionTime":"2025-12-05T11:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:17 crc kubenswrapper[4763]: E1205 11:49:17.815279 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeByt
es\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"424a6449-0727-4c68-964f-19010b8ff35b\\\",\\\"systemUUID\\\":\\\"0f54ec5d-9350-4cf9-9a1e-213de5460351\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:17Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.820934 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.820992 4763 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.821015 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.821044 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.821066 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:17Z","lastTransitionTime":"2025-12-05T11:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:17 crc kubenswrapper[4763]: E1205 11:49:17.838204 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"424a6449-0727-4c68-964f-19010b8ff35b\\\",\\\"systemUUID\\\":\\\"0f54ec5d-9350-4cf9-9a1e-213de5460351\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:17Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.843630 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.843900 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.844141 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.844305 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.844459 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:17Z","lastTransitionTime":"2025-12-05T11:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:17 crc kubenswrapper[4763]: E1205 11:49:17.861754 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"424a6449-0727-4c68-964f-19010b8ff35b\\\",\\\"systemUUID\\\":\\\"0f54ec5d-9350-4cf9-9a1e-213de5460351\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:17Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.866263 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.866412 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.866485 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.866561 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.866629 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:17Z","lastTransitionTime":"2025-12-05T11:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:17 crc kubenswrapper[4763]: E1205 11:49:17.878215 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"424a6449-0727-4c68-964f-19010b8ff35b\\\",\\\"systemUUID\\\":\\\"0f54ec5d-9350-4cf9-9a1e-213de5460351\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:17Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.882291 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.882410 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.882501 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.882735 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.883025 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:17Z","lastTransitionTime":"2025-12-05T11:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:17 crc kubenswrapper[4763]: E1205 11:49:17.895908 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"424a6449-0727-4c68-964f-19010b8ff35b\\\",\\\"systemUUID\\\":\\\"0f54ec5d-9350-4cf9-9a1e-213de5460351\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:17Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:17 crc kubenswrapper[4763]: E1205 11:49:17.896264 4763 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.897654 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.897694 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.897703 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.897719 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.897729 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:17Z","lastTransitionTime":"2025-12-05T11:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:17 crc kubenswrapper[4763]: I1205 11:49:17.957021 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a135c32b-38e4-43f6-bbb1-d1b8e42156ab-metrics-certs\") pod \"network-metrics-daemon-x45qv\" (UID: \"a135c32b-38e4-43f6-bbb1-d1b8e42156ab\") " pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:49:17 crc kubenswrapper[4763]: E1205 11:49:17.957379 4763 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 11:49:17 crc kubenswrapper[4763]: E1205 11:49:17.957544 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a135c32b-38e4-43f6-bbb1-d1b8e42156ab-metrics-certs podName:a135c32b-38e4-43f6-bbb1-d1b8e42156ab nodeName:}" failed. No retries permitted until 2025-12-05 11:49:25.957507446 +0000 UTC m=+50.450222239 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a135c32b-38e4-43f6-bbb1-d1b8e42156ab-metrics-certs") pod "network-metrics-daemon-x45qv" (UID: "a135c32b-38e4-43f6-bbb1-d1b8e42156ab") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.000585 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.000625 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.000638 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.000652 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.000664 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:18Z","lastTransitionTime":"2025-12-05T11:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.102601 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.102655 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.102675 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.102700 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.102718 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:18Z","lastTransitionTime":"2025-12-05T11:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.205934 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.205998 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.206016 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.206042 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.206072 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:18Z","lastTransitionTime":"2025-12-05T11:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.309003 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.309060 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.309070 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.309089 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.309098 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:18Z","lastTransitionTime":"2025-12-05T11:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.411009 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.411555 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.411622 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.411736 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.411826 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:18Z","lastTransitionTime":"2025-12-05T11:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.515852 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.516245 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.516392 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.516532 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.516689 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:18Z","lastTransitionTime":"2025-12-05T11:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.619711 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.620031 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.620094 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.620165 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.620227 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:18Z","lastTransitionTime":"2025-12-05T11:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.722835 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.722911 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.722938 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.722969 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.722992 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:18Z","lastTransitionTime":"2025-12-05T11:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.826483 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.826706 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.826790 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.826888 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.826954 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:18Z","lastTransitionTime":"2025-12-05T11:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.930158 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.930238 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.930264 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.930296 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:18 crc kubenswrapper[4763]: I1205 11:49:18.930315 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:18Z","lastTransitionTime":"2025-12-05T11:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.036986 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.037687 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.037815 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.037902 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.037990 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:19Z","lastTransitionTime":"2025-12-05T11:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.141018 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.141078 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.141094 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.141117 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.141136 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:19Z","lastTransitionTime":"2025-12-05T11:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.244243 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.244325 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.244336 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.244361 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.244373 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:19Z","lastTransitionTime":"2025-12-05T11:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.346807 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.346848 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.346860 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.346877 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.346888 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:19Z","lastTransitionTime":"2025-12-05T11:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.449498 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.449547 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.449560 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.449579 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.449592 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:19Z","lastTransitionTime":"2025-12-05T11:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.552245 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.552314 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.552328 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.552345 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.552357 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:19Z","lastTransitionTime":"2025-12-05T11:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.656206 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.656289 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.656324 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.656354 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.656375 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:19Z","lastTransitionTime":"2025-12-05T11:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.759495 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.759569 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.759586 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.759610 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.759627 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:19Z","lastTransitionTime":"2025-12-05T11:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.783373 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.783496 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.783505 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:49:19 crc kubenswrapper[4763]: E1205 11:49:19.783583 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:49:19 crc kubenswrapper[4763]: E1205 11:49:19.783689 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.783710 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:49:19 crc kubenswrapper[4763]: E1205 11:49:19.783828 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:49:19 crc kubenswrapper[4763]: E1205 11:49:19.783928 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.862034 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.862365 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.862380 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.862395 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.862408 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:19Z","lastTransitionTime":"2025-12-05T11:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.965046 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.965100 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.965114 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.965128 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:19 crc kubenswrapper[4763]: I1205 11:49:19.965138 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:19Z","lastTransitionTime":"2025-12-05T11:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.069093 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.069244 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.069310 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.069338 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.069397 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:20Z","lastTransitionTime":"2025-12-05T11:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.172063 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.172108 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.172118 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.172134 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.172150 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:20Z","lastTransitionTime":"2025-12-05T11:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.274520 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.274566 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.274582 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.274604 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.274622 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:20Z","lastTransitionTime":"2025-12-05T11:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.377799 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.377858 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.377875 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.377898 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.377915 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:20Z","lastTransitionTime":"2025-12-05T11:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.481663 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.481731 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.481751 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.481829 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.481858 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:20Z","lastTransitionTime":"2025-12-05T11:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.585339 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.585376 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.585387 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.585418 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.585429 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:20Z","lastTransitionTime":"2025-12-05T11:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.687821 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.687882 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.687901 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.687929 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.687947 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:20Z","lastTransitionTime":"2025-12-05T11:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.790536 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.790591 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.790603 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.790619 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.790631 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:20Z","lastTransitionTime":"2025-12-05T11:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.893455 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.893526 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.893537 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.893555 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.893566 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:20Z","lastTransitionTime":"2025-12-05T11:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.997819 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.997913 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.997938 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.997975 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:20 crc kubenswrapper[4763]: I1205 11:49:20.998000 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:20Z","lastTransitionTime":"2025-12-05T11:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.101097 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.101173 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.101196 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.101228 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.101248 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:21Z","lastTransitionTime":"2025-12-05T11:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.204083 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.204148 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.204166 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.204189 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.204207 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:21Z","lastTransitionTime":"2025-12-05T11:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.307323 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.307383 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.307396 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.307414 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.307426 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:21Z","lastTransitionTime":"2025-12-05T11:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.410636 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.410692 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.410704 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.410724 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.410737 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:21Z","lastTransitionTime":"2025-12-05T11:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.513531 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.513595 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.513609 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.513628 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.513645 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:21Z","lastTransitionTime":"2025-12-05T11:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.616339 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.616376 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.616386 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.616401 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.616411 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:21Z","lastTransitionTime":"2025-12-05T11:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.719294 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.719368 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.719387 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.719412 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.719430 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:21Z","lastTransitionTime":"2025-12-05T11:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.783604 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.783812 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:49:21 crc kubenswrapper[4763]: E1205 11:49:21.783979 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.784016 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.784119 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:49:21 crc kubenswrapper[4763]: E1205 11:49:21.784311 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:49:21 crc kubenswrapper[4763]: E1205 11:49:21.784531 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:49:21 crc kubenswrapper[4763]: E1205 11:49:21.784671 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.854368 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.854428 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.854444 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.854469 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.854486 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:21Z","lastTransitionTime":"2025-12-05T11:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.957468 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.957524 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.957538 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.957561 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:21 crc kubenswrapper[4763]: I1205 11:49:21.957579 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:21Z","lastTransitionTime":"2025-12-05T11:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.061051 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.061096 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.061107 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.061122 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.061133 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:22Z","lastTransitionTime":"2025-12-05T11:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.163096 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.163145 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.163161 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.163179 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.163189 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:22Z","lastTransitionTime":"2025-12-05T11:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.265561 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.265605 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.265619 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.265638 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.265654 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:22Z","lastTransitionTime":"2025-12-05T11:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.368519 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.368579 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.368593 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.368611 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.368629 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:22Z","lastTransitionTime":"2025-12-05T11:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.471698 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.471747 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.471823 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.471853 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.471870 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:22Z","lastTransitionTime":"2025-12-05T11:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.575215 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.575272 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.575289 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.575313 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.575332 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:22Z","lastTransitionTime":"2025-12-05T11:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.677477 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.677528 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.677541 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.677562 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.677580 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:22Z","lastTransitionTime":"2025-12-05T11:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.779926 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.780003 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.780025 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.780054 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.780075 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:22Z","lastTransitionTime":"2025-12-05T11:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.883055 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.883104 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.883117 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.883133 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.883147 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:22Z","lastTransitionTime":"2025-12-05T11:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.986268 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.986338 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.986354 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.986372 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:22 crc kubenswrapper[4763]: I1205 11:49:22.986412 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:22Z","lastTransitionTime":"2025-12-05T11:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.089507 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.089549 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.089563 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.089582 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.089596 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:23Z","lastTransitionTime":"2025-12-05T11:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.191607 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.191672 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.191695 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.191727 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.191751 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:23Z","lastTransitionTime":"2025-12-05T11:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.294608 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.294669 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.294686 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.294710 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.294728 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:23Z","lastTransitionTime":"2025-12-05T11:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.396863 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.396927 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.396951 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.396974 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.396992 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:23Z","lastTransitionTime":"2025-12-05T11:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.500010 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.500075 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.500093 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.500122 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.500161 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:23Z","lastTransitionTime":"2025-12-05T11:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.603401 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.603436 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.603446 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.603462 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.603473 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:23Z","lastTransitionTime":"2025-12-05T11:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.706564 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.706625 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.706638 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.706657 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.706670 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:23Z","lastTransitionTime":"2025-12-05T11:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.783925 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.783972 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.784026 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.784147 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:49:23 crc kubenswrapper[4763]: E1205 11:49:23.784136 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:49:23 crc kubenswrapper[4763]: E1205 11:49:23.784261 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:49:23 crc kubenswrapper[4763]: E1205 11:49:23.784312 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:49:23 crc kubenswrapper[4763]: E1205 11:49:23.784353 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.809314 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.809358 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.809367 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.809387 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.809397 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:23Z","lastTransitionTime":"2025-12-05T11:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.912888 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.912938 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.912946 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.912968 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:23 crc kubenswrapper[4763]: I1205 11:49:23.912979 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:23Z","lastTransitionTime":"2025-12-05T11:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.016856 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.016913 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.016923 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.016951 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.016965 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:24Z","lastTransitionTime":"2025-12-05T11:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.119441 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.119507 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.119519 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.119548 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.119564 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:24Z","lastTransitionTime":"2025-12-05T11:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.223172 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.223254 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.223275 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.223305 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.223329 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:24Z","lastTransitionTime":"2025-12-05T11:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.326319 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.326379 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.326396 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.326420 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.326438 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:24Z","lastTransitionTime":"2025-12-05T11:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.429240 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.429302 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.429327 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.429357 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.429378 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:24Z","lastTransitionTime":"2025-12-05T11:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.532140 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.532224 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.532247 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.532277 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.532304 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:24Z","lastTransitionTime":"2025-12-05T11:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.635857 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.635915 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.635935 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.635963 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.635980 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:24Z","lastTransitionTime":"2025-12-05T11:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.738347 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.738437 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.738455 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.738480 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.738499 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:24Z","lastTransitionTime":"2025-12-05T11:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.841087 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.841150 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.841167 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.841191 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.841208 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:24Z","lastTransitionTime":"2025-12-05T11:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.944056 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.944094 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.944117 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.944141 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:24 crc kubenswrapper[4763]: I1205 11:49:24.944154 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:24Z","lastTransitionTime":"2025-12-05T11:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.046855 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.046911 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.046932 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.046960 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.046982 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:25Z","lastTransitionTime":"2025-12-05T11:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.150008 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.150065 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.150081 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.150104 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.150125 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:25Z","lastTransitionTime":"2025-12-05T11:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.252502 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.252549 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.252584 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.252605 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.252619 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:25Z","lastTransitionTime":"2025-12-05T11:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.355979 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.356056 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.356087 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.356141 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.356160 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:25Z","lastTransitionTime":"2025-12-05T11:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.459422 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.459507 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.459524 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.459549 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.459568 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:25Z","lastTransitionTime":"2025-12-05T11:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.562683 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.562754 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.562807 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.562839 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.562862 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:25Z","lastTransitionTime":"2025-12-05T11:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.665603 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.665664 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.665683 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.665707 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.665724 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:25Z","lastTransitionTime":"2025-12-05T11:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.769003 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.769069 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.769091 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.769118 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.769143 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:25Z","lastTransitionTime":"2025-12-05T11:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.783423 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.783474 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.783524 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:49:25 crc kubenswrapper[4763]: E1205 11:49:25.783591 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.783609 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:49:25 crc kubenswrapper[4763]: E1205 11:49:25.783754 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:49:25 crc kubenswrapper[4763]: E1205 11:49:25.784124 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:49:25 crc kubenswrapper[4763]: E1205 11:49:25.784245 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.802547 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.813648 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T11:48:54Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 11:48:54.822889 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 11:48:54.823032 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 11:48:54.823055 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 11:48:54.823033 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 11:48:54.823371 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764935318\\\\\\\\\\\\\\\" (2025-12-05 11:48:38 +0000 UTC to 2026-01-04 11:48:39 +0000 UTC (now=2025-12-05 11:48:54.823293564 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823498 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\"\\\\nI1205 11:48:54.823746 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764935329\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764935329\\\\\\\\\\\\\\\" (2025-12-05 10:48:49 +0000 UTC to 2026-12-05 10:48:49 +0000 UTC (now=2025-12-05 11:48:54.823700006 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823836 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 11:48:54.823876 1 
genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 11:48:54.823905 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1205 11:48:54.826863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:25Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.817134 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.834270 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafa237bfeb83d5895f8ad73cf881e1fd946ec837402ea271d4aa1b75469da0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:25Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.850902 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:25Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.867285 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38a2b47fe97303fc60f6ed0de5927b73993234170c4fdb02b45f0762978e292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:25Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.872554 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.872611 4763 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.872626 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.872647 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.872666 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:25Z","lastTransitionTime":"2025-12-05T11:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.900010 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b42a5472-7487-4146-87a1-b83999821399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5753659f8a4501c4eb8aee2d090eae61f2d0f6d
9d03baec4f2359d2fecbce39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5753659f8a4501c4eb8aee2d090eae61f2d0f6d9d03baec4f2359d2fecbce39\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T11:49:08Z\\\",\\\"message\\\":\\\" 6186 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1205 11:49:06.861136 6186 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 11:49:06.861199 6186 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 11:49:06.861223 6186 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 11:49:06.861239 6186 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 11:49:06.861255 6186 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 11:49:06.861286 6186 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 11:49:06.861292 6186 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 11:49:06.861309 6186 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 11:49:06.861327 6186 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 11:49:06.861339 6186 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 11:49:06.861363 6186 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 11:49:06.861385 6186 factory.go:656] Stopping watch factory\\\\nI1205 11:49:06.861409 6186 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 11:49:06.861432 6186 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 11:49:06.861458 6186 ovnkube.go:599] Stopped ovnkube\\\\nI1205 11:49:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xbr2p_openshift-ovn-kubernetes(b42a5472-7487-4146-87a1-b83999821399)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbr2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:25Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.922633 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b31fb483-2a86-4e31-a611-ef92a9a4840f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452ef64735a1911b994bd2bfbf1702e3ff68648cd0b834c254dd24f0b17cd9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0370cc7bfd0ce95636924667b4e82dfffd0eee7f07b78d683b386bfee5dc47b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1951ae154cb3c61f4cf3cadec35e19591efe1f768993d1e51713195175547b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a3a0d8531f8696597e2389d8cd7b50130de93ac8be572b329ebe2eff8c6b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:25Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.939650 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kwkp4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"737ae453-c22e-41ea-a10e-7e8f1f165467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q79tg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kwkp4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:25Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.954387 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptcsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d71e854a-fd6a-4efa-9cf4-a5dc75a1901f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40808a5c1a367c907ecf165483a1cc5fc59027c61342399abb612be6fed84fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g4jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d865b70b752e983282ac08c190f254185cbf7bba8d13c0238261fd7b880ac651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g4jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:49:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ptcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:25Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.988149 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.988185 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.988195 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.988211 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.988221 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:25Z","lastTransitionTime":"2025-12-05T11:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:25 crc kubenswrapper[4763]: I1205 11:49:25.999138 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78167d26-8492-4035-99de-987c99ab12e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b874a765fa9a075a2662b615419d246c610bf2eea128d4bf53c16bda49f38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd875ae389a8e48a2f958f87ef711a4053a54da61f415ef9a380f8b3de9d7012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12b24f5f9e73a1f4871d8d666bf6aabef79ec707d3df055e3e33a1a158dce464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25460658bdce8262dc2a0269161e8377dcb37f4
40c09c8eea9f7b397c9d3a5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8aab7caa6c02035af6faae1817d239b12bc2afcb11c7e14c4aea1df60372dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:25Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.020451 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd4640d76c7ecaa13523f23448f53bc51afe8279734a6bc254b403cf436fe0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7619d39a9f2a46d92fc3ec5ab36f60ec2af3a789b10decb4cbe1cdaef58d28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:26Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.030953 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96338136-6831-49d0-9eb9-77d1205c6afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b662bafa4fee3b2c7cd96bc49d6054b0fbe349e86ce1d4faf967edbaf8f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb8a3a71da68a51eabe1aaa24fffd1b84c35cb01945642622f4100125ee26223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpgln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:26Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.040075 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gt7x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2515d84f-5782-48a0-9d7e-9704baebe26c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60a4a422ae6f3c2c05e5dc721674edbb0efda81d198ede925d3dd1cf99e09b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxkb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gt7x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:26Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.046517 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a135c32b-38e4-43f6-bbb1-d1b8e42156ab-metrics-certs\") pod \"network-metrics-daemon-x45qv\" (UID: \"a135c32b-38e4-43f6-bbb1-d1b8e42156ab\") " pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:49:26 crc kubenswrapper[4763]: E1205 11:49:26.046654 4763 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 11:49:26 crc kubenswrapper[4763]: E1205 11:49:26.046728 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a135c32b-38e4-43f6-bbb1-d1b8e42156ab-metrics-certs podName:a135c32b-38e4-43f6-bbb1-d1b8e42156ab nodeName:}" failed. No retries permitted until 2025-12-05 11:49:42.046713399 +0000 UTC m=+66.539428132 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a135c32b-38e4-43f6-bbb1-d1b8e42156ab-metrics-certs") pod "network-metrics-daemon-x45qv" (UID: "a135c32b-38e4-43f6-bbb1-d1b8e42156ab") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.053848 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c630804aeb54582259498705c520ab6d2c191844861069ba7090226028a5716f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-92qpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:26Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.066839 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:26Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.077189 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:26Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.086612 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gxbp8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e177b5c0a61aeedb5c6bf5f6340f212506106592ad10807ed29d83a4a6949ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-9s79r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gxbp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:26Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.089801 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.089836 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.089844 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.089859 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.089869 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:26Z","lastTransitionTime":"2025-12-05T11:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.097634 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-x45qv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a135c32b-38e4-43f6-bbb1-d1b8e42156ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hszt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hszt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:49:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-x45qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:26Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.109660 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:26Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.121001 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:26Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.130475 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gxbp8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e177b5c0a61aeedb5c6bf5f6340f212506106592ad10807ed29d83a4a6949ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-9s79r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gxbp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:26Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.139424 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-x45qv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a135c32b-38e4-43f6-bbb1-d1b8e42156ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hszt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hszt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:49:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-x45qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:26Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.149218 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T11:48:54Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 11:48:54.822889 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 11:48:54.823032 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 11:48:54.823055 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 11:48:54.823033 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 11:48:54.823371 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764935318\\\\\\\\\\\\\\\" (2025-12-05 11:48:38 +0000 UTC to 2026-01-04 11:48:39 +0000 UTC (now=2025-12-05 11:48:54.823293564 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823498 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\"\\\\nI1205 11:48:54.823746 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764935329\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764935329\\\\\\\\\\\\\\\" (2025-12-05 10:48:49 +0000 UTC to 2026-12-05 10:48:49 +0000 UTC (now=2025-12-05 11:48:54.823700006 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823836 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 11:48:54.823876 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 11:48:54.823905 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1205 11:48:54.826863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:26Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.158226 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04d4b8c3-b148-45da-b0d1-412870026308\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://745228c1f753278be19f5bb7197fd2dd01178d667213f59c8933c18a5d666c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ff65480597f13016cbd96d14661a9cd428eff4ca644c88451ef9c28b1e015f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9316ee7cdcfff2c1c600186568802e8c4d39ddcb029e17d8f109bb9f813e8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54939667d3da9326369dc17207cec7eb95f69fe46040d3dd06d10dd3df8cce0\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54939667d3da9326369dc17207cec7eb95f69fe46040d3dd06d10dd3df8cce0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:26Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.168754 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafa237bfeb83d5895f8ad73cf881e1fd946ec837402ea271d4aa1b75469da0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:26Z is after 
2025-08-24T17:21:41Z" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.180558 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:26Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.191460 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38a2b47fe97303fc60f6ed0de5927b73993234170c4fdb02b45f0762978e292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:26Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.195622 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.195660 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.195670 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.195691 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.195701 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:26Z","lastTransitionTime":"2025-12-05T11:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.220324 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b42a5472-7487-4146-87a1-b83999821399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5753659f8a4501c4eb8aee2d090eae61f2d0f6d9d03baec4f2359d2fecbce39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5753659f8a4501c4eb8aee2d090eae61f2d0f6d9d03baec4f2359d2fecbce39\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T11:49:08Z\\\",\\\"message\\\":\\\" 6186 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1205 11:49:06.861136 6186 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 11:49:06.861199 6186 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 11:49:06.861223 6186 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 11:49:06.861239 6186 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 11:49:06.861255 6186 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 11:49:06.861286 6186 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 11:49:06.861292 6186 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 11:49:06.861309 6186 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 11:49:06.861327 6186 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 11:49:06.861339 6186 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 11:49:06.861363 6186 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 11:49:06.861385 6186 factory.go:656] Stopping watch factory\\\\nI1205 11:49:06.861409 6186 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 11:49:06.861432 6186 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 11:49:06.861458 6186 ovnkube.go:599] Stopped ovnkube\\\\nI1205 11:49:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xbr2p_openshift-ovn-kubernetes(b42a5472-7487-4146-87a1-b83999821399)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbr2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:26Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.232509 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b31fb483-2a86-4e31-a611-ef92a9a4840f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452ef64735a1911b994bd2bfbf1702e3ff68648cd0b834c254dd24f0b17cd9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0370cc7bfd0ce95636924667b4e82dfffd0eee7f07b78d683b386bfee5dc47b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1951ae154cb3c61f4cf3cadec35e19591efe1f768993d1e51713195175547b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a3a0d8531f8696597e2389d8cd7b50130de93ac8be572b329ebe2eff8c6b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:26Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.244451 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kwkp4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"737ae453-c22e-41ea-a10e-7e8f1f165467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q79tg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kwkp4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:26Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.254706 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptcsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d71e854a-fd6a-4efa-9cf4-a5dc75a1901f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40808a5c1a367c907ecf165483a1cc5fc59027c61342399abb612be6fed84fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g4jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d865b70b752e983282ac08c190f254185cbf7bba8d13c0238261fd7b880ac651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g4jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:49:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ptcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:26Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.274080 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78167d26-8492-4035-99de-987c99ab12e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b874a765fa9a075a2662b615419d246c610bf2eea128d4bf53c16bda49f38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd875ae389a8e48a2f958f87ef711a4053a54da61f415ef9a380f8b3de9d7012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12b24f5f9e73a1f4871d8d666bf6aabef79ec707d3df055e3e33a1a158dce464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa
3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25460658bdce8262dc2a0269161e8377dcb37f440c09c8eea9f7b397c9d3a5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8aab7caa6c02035af6faae1817d239b12bc2afcb11c7e14c4aea1df60372dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731473
1ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:26Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.289886 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd4640d76c7ecaa13523f23448f53bc51afe8279734a6bc254b403cf436fe0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7619d39a9f2a46d92fc3ec5ab36f60ec2af3a789b10decb4cbe1cdaef58d28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:26Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.297619 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.297654 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.297664 4763 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.297680 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.297693 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:26Z","lastTransitionTime":"2025-12-05T11:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.301324 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96338136-6831-49d0-9eb9-77d1205c6afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b662bafa4fee3b2c7cd96bc49d6054b0fbe349e86ce1d4faf967edbaf8f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb8a3a71da68a51eabe1aaa24fffd1b84c35cb01945642622f4100125ee26223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpgln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:26Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.313632 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gt7x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2515d84f-5782-48a0-9d7e-9704baebe26c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60a4a422ae6f3c2c05e5dc721674edbb0efda81d198ede925d3dd1cf99e09b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxkb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gt7x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:26Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.326671 4763 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c630804aeb54582259498705c520ab6d2c191844861069ba7090226028a5716f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c
857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-
release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-92qpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:26Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.399861 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.399914 4763 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.399934 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.400257 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.400292 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:26Z","lastTransitionTime":"2025-12-05T11:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.504099 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.504165 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.504187 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.504217 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.504242 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:26Z","lastTransitionTime":"2025-12-05T11:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.606546 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.606611 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.606634 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.606664 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.606684 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:26Z","lastTransitionTime":"2025-12-05T11:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.710245 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.710306 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.710322 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.710349 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.710366 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:26Z","lastTransitionTime":"2025-12-05T11:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.813130 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.813193 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.813211 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.813235 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.813253 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:26Z","lastTransitionTime":"2025-12-05T11:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.917274 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.917334 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.917354 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.917381 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:26 crc kubenswrapper[4763]: I1205 11:49:26.917398 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:26Z","lastTransitionTime":"2025-12-05T11:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.020678 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.020801 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.020826 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.020862 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.020884 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:27Z","lastTransitionTime":"2025-12-05T11:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.123754 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.123861 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.123878 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.123899 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.123917 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:27Z","lastTransitionTime":"2025-12-05T11:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.226457 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.226499 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.226510 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.226528 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.226540 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:27Z","lastTransitionTime":"2025-12-05T11:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.329833 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.329886 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.329898 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.329920 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.329934 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:27Z","lastTransitionTime":"2025-12-05T11:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.432504 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.432529 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.432558 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.432571 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.432581 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:27Z","lastTransitionTime":"2025-12-05T11:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.535510 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.535616 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.535635 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.535661 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.535681 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:27Z","lastTransitionTime":"2025-12-05T11:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.639230 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.639311 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.639329 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.639355 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.639374 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:27Z","lastTransitionTime":"2025-12-05T11:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.664081 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.664252 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:49:27 crc kubenswrapper[4763]: E1205 11:49:27.664280 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:49:59.664248685 +0000 UTC m=+84.156963418 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.664330 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:49:27 crc kubenswrapper[4763]: E1205 11:49:27.664448 4763 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 11:49:27 crc kubenswrapper[4763]: E1205 11:49:27.664474 4763 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 11:49:27 crc kubenswrapper[4763]: E1205 11:49:27.664523 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 11:49:59.664504272 +0000 UTC m=+84.157219005 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 11:49:27 crc kubenswrapper[4763]: E1205 11:49:27.664632 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 11:49:59.664558993 +0000 UTC m=+84.157273756 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.742660 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.742713 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.742726 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.742745 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.742757 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:27Z","lastTransitionTime":"2025-12-05T11:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.765674 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.765897 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:49:27 crc kubenswrapper[4763]: E1205 11:49:27.765964 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 11:49:27 crc kubenswrapper[4763]: E1205 11:49:27.766015 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 11:49:27 crc kubenswrapper[4763]: E1205 11:49:27.766043 4763 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 11:49:27 crc kubenswrapper[4763]: E1205 11:49:27.766136 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
Dec 05 11:49:27 crc kubenswrapper[4763]: E1205 11:49:27.766136 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 11:49:59.766105083 +0000 UTC m=+84.258819856 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 11:49:27 crc kubenswrapper[4763]: E1205 11:49:27.766143 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 05 11:49:27 crc kubenswrapper[4763]: E1205 11:49:27.766202 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 05 11:49:27 crc kubenswrapper[4763]: E1205 11:49:27.766223 4763 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 11:49:27 crc kubenswrapper[4763]: E1205 11:49:27.766290 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 11:49:59.766266427 +0000 UTC m=+84.258981200 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.783800 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.783884 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv"
Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.783895 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 11:49:27 crc kubenswrapper[4763]: E1205 11:49:27.784020 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.783967 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 11:49:27 crc kubenswrapper[4763]: E1205 11:49:27.784066 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 11:49:27 crc kubenswrapper[4763]: E1205 11:49:27.784146 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab"
Dec 05 11:49:27 crc kubenswrapper[4763]: E1205 11:49:27.784257 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.845811 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.845848 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.845858 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.845875 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.845887 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:27Z","lastTransitionTime":"2025-12-05T11:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
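The same handful of failure signatures recur throughout this stretch of the boot: objects "not registered" while the kubelet's informers sync, the missing CNI configuration, and the unregistered hostpath-provisioner CSI driver. When triaging a capture like this, a quick tally is often faster than scrolling; below is a minimal sketch, assuming the excerpt was saved to a file named kubelet.log (a hypothetical name, e.g. via journalctl redirection), with the patterns taken from the entries above:

    import re
    from collections import Counter

    # Tally the recurring kubelet failure signatures in a saved journal excerpt.
    # The file name is an assumption; the patterns come from the log entries above.
    SIGNATURES = [
        r'not found in the list of registered CSI drivers',
        r'object "[^"]+"/"[^"]+" not registered',
        r'no CNI configuration file in /etc/kubernetes/cni/net\.d/',
        r'certificate has expired or is not yet valid',
    ]
    pattern = re.compile("|".join(f"(?:{s})" for s in SIGNATURES))

    counts = Counter()
    with open("kubelet.log", encoding="utf-8") as log:
        for line in log:
            match = pattern.search(line)
            if match:
                counts[match.group(0)] += 1

    for signature, n in counts.most_common():
        print(f"{n:6d}  {signature}")

Because the second pattern matches the namespace/name pair itself, the output also shows which objects (kube-root-ca.crt, openshift-service-ca.crt, and so on) dominate the failures.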
Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.949438 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.949491 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.949504 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.949523 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:27 crc kubenswrapper[4763]: I1205 11:49:27.949534 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:27Z","lastTransitionTime":"2025-12-05T11:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.006918 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.007009 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.007036 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.007069 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.007095 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:28Z","lastTransitionTime":"2025-12-05T11:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:28 crc kubenswrapper[4763]: E1205 11:49:28.022322 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"424a6449-0727-4c68-964f-19010b8ff35b\\\",\\\"systemUUID\\\":\\\"0f54ec5d-9350-4cf9-9a1e-213de5460351\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:28Z is after 2025-08-24T17:21:41Z"
Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.027809 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.027873 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
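The status-patch failure above is the most actionable entry in this stretch: every patch is rejected because the node.network-node-identity.openshift.io webhook at 127.0.0.1:9743 presents a certificate that expired on 2025-08-24, while the node clock reads 2025-12-05. The verdict can be re-derived directly from the two timestamps quoted in the error string; a minimal sketch, using only values copied from the log line:

    from datetime import datetime, timezone

    # Both values are quoted verbatim in the webhook error above.
    now = datetime(2025, 12, 5, 11, 49, 28, tzinfo=timezone.utc)        # "current time"
    not_after = datetime(2025, 8, 24, 17, 21, 41, tzinfo=timezone.utc)  # "is after"

    assert now > not_after, "certificate would still be valid"
    print(f"certificate expired {now - not_after} before this entry")   # ~102 days

Until that certificate is rotated, each node-status update attempt fails the same way, which is why the identical patch error recurs below.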
event="NodeHasNoDiskPressure" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.027888 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.027913 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.027929 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:28Z","lastTransitionTime":"2025-12-05T11:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:28 crc kubenswrapper[4763]: E1205 11:49:28.047154 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"424a6449-0727-4c68-964f-19010b8ff35b\\\",\\\"systemUUID\\\":\\\"0f54ec5d-9350-4cf9-9a1e-213de5460351\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:28Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.052374 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.052455 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.052474 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.052536 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.052555 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:28Z","lastTransitionTime":"2025-12-05T11:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:28 crc kubenswrapper[4763]: E1205 11:49:28.073798 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"424a6449-0727-4c68-964f-19010b8ff35b\\\",\\\"systemUUID\\\":\\\"0f54ec5d-9350-4cf9-9a1e-213de5460351\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:28Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.077655 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.077702 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.077721 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.077746 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.077799 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:28Z","lastTransitionTime":"2025-12-05T11:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:28 crc kubenswrapper[4763]: E1205 11:49:28.090689 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"424a6449-0727-4c68-964f-19010b8ff35b\\\",\\\"systemUUID\\\":\\\"0f54ec5d-9350-4cf9-9a1e-213de5460351\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:28Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.093314 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.093338 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
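Every NodeNotReady record in this stretch cites the same root symptom: kubelet finds no CNI configuration file in /etc/kubernetes/cni/net.d/. Below is a minimal triage sketch (a hypothetical helper, not part of the cluster tooling; the directory path is taken verbatim from the log) that checks whether the network plugin has written its config yet:

```python
#!/usr/bin/env python3
# Check for the condition kubelet keeps reporting in this log:
# "no CNI configuration file in /etc/kubernetes/cni/net.d/".
# Hypothetical triage helper; the path comes straight from the log lines.
from pathlib import Path

CNI_CONF_DIR = Path("/etc/kubernetes/cni/net.d")

def list_cni_configs(conf_dir: Path = CNI_CONF_DIR) -> list[Path]:
    """Return CNI config files (*.conf, *.conflist, *.json) found in conf_dir."""
    if not conf_dir.is_dir():
        print(f"{conf_dir} does not exist")
        return []
    return sorted(p for p in conf_dir.iterdir()
                  if p.suffix in {".conf", ".conflist", ".json"})

if __name__ == "__main__":
    configs = list_cni_configs()
    if configs:
        for p in configs:
            print(p)
    else:
        # Matches the log: the node stays NotReady until the network plugin
        # (ovn-kubernetes, per the pod names further down) writes its config.
        print("no CNI configuration files found; network plugin not ready")
```

Presumably the Ready condition flips to True only once ovn-kubernetes (whose ovnkube-node container is seen starting later in this excerpt) drops its config into that directory. The log continues: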
event="NodeHasNoDiskPressure" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.093347 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.093360 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.093370 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:28Z","lastTransitionTime":"2025-12-05T11:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:28 crc kubenswrapper[4763]: E1205 11:49:28.104455 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"424a6449-0727-4c68-964f-19010b8ff35b\\\",\\\"systemUUID\\\":\\\"0f54ec5d-9350-4cf9-9a1e-213de5460351\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:28Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:28 crc kubenswrapper[4763]: E1205 11:49:28.104586 4763 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.105919 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
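The node-status patch was retried and rejected with the same x509 error each time: the network-node-identity webhook on 127.0.0.1:9743 presents a certificate that expired on 2025-08-24T17:21:41Z, months before the 2025-12-05 timestamps in this log. A short sketch to confirm that from the node itself, assuming the endpoint is reachable locally and the third-party cryptography package is installed:

```python
#!/usr/bin/env python3
# Inspect the certificate served by the webhook the patches fail against.
# Endpoint taken from the log: Post "https://127.0.0.1:9743/node?timeout=10s".
# Sketch only; assumes local reachability and the "cryptography" package.
import ssl
from cryptography import x509

HOST, PORT = "127.0.0.1", 9743

# get_server_certificate fetches the peer's PEM without verifying it --
# deliberate here, since verification is exactly what is failing.
pem = ssl.get_server_certificate((HOST, PORT))
cert = x509.load_pem_x509_certificate(pem.encode())

print("subject:  ", cert.subject.rfc4514_string())
print("notBefore:", cert.not_valid_before)  # newer cryptography releases also
print("notAfter: ", cert.not_valid_after)   # offer not_valid_after_utc
# Per the log, notAfter should print 2025-08-24 17:21:41 -- already expired.
```

With the webhook unable to admit status patches, the kubelet keeps re-recording the same node events: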
event="NodeHasSufficientMemory" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.105939 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.105947 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.105960 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.105970 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:28Z","lastTransitionTime":"2025-12-05T11:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.209226 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.209273 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.209290 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.209311 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.209324 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:28Z","lastTransitionTime":"2025-12-05T11:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.314111 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.314166 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.314177 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.314194 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.314205 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:28Z","lastTransitionTime":"2025-12-05T11:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.417245 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.417316 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.417334 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.417357 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.417374 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:28Z","lastTransitionTime":"2025-12-05T11:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.519476 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.519530 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.519542 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.519564 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.519577 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:28Z","lastTransitionTime":"2025-12-05T11:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.622029 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.622071 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.622086 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.622104 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.622119 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:28Z","lastTransitionTime":"2025-12-05T11:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.724211 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.724273 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.724291 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.724315 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.724336 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:28Z","lastTransitionTime":"2025-12-05T11:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.784201 4763 scope.go:117] "RemoveContainer" containerID="c5753659f8a4501c4eb8aee2d090eae61f2d0f6d9d03baec4f2359d2fecbce39" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.827398 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.827813 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.827835 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.827862 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.827884 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:28Z","lastTransitionTime":"2025-12-05T11:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.931096 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.931138 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.931173 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.931191 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:28 crc kubenswrapper[4763]: I1205 11:49:28.931203 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:28Z","lastTransitionTime":"2025-12-05T11:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.033587 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.033629 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.033638 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.033654 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.033664 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:29Z","lastTransitionTime":"2025-12-05T11:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.122273 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbr2p_b42a5472-7487-4146-87a1-b83999821399/ovnkube-controller/1.log" Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.124747 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" event={"ID":"b42a5472-7487-4146-87a1-b83999821399","Type":"ContainerStarted","Data":"455df6b1dbc6c362e75d34ddc09347b0c7b31571afb56538b9b7768a5abb50d9"} Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.125264 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.135900 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.135937 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.135946 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.135964 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.135974 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:29Z","lastTransitionTime":"2025-12-05T11:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.142926 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b31fb483-2a86-4e31-a611-ef92a9a4840f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452ef64735a1911b994bd2bfbf1702e3ff68648cd0b834c254dd24f0b17cd9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0370cc7bfd0ce95636924667b4e82dfffd0eee7f07b78d683b386bfee5dc47b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1951ae154cb3c61f4cf3cadec35e19591efe1f768993d1e51713195175547b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a3a0d8531f8696597e2389d8cd7b50130de93ac8be572b329ebe2eff8c6b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:29Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.187858 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kwkp4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"737ae453-c22e-41ea-a10e-7e8f1f165467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q79tg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kwkp4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:29Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.202913 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptcsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d71e854a-fd6a-4efa-9cf4-a5dc75a1901f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40808a5c1a367c907ecf165483a1cc5fc59027c61342399abb612be6fed84fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g4jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d865b70b752e983282ac08c190f254185cbf7bba8d13c0238261fd7b880ac651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g4jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:49:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ptcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:29Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.236779 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78167d26-8492-4035-99de-987c99ab12e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b874a765fa9a075a2662b615419d246c610bf2eea128d4bf53c16bda49f38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd875ae389a8e48a2f958f87ef711a4053a54da61f415ef9a380f8b3de9d7012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12b24f5f9e73a1f4871d8d666bf6aabef79ec707d3df055e3e33a1a158dce464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa
3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25460658bdce8262dc2a0269161e8377dcb37f440c09c8eea9f7b397c9d3a5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8aab7caa6c02035af6faae1817d239b12bc2afcb11c7e14c4aea1df60372dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731473
1ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:29Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.238081 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.238111 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.238120 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.238136 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.238147 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:29Z","lastTransitionTime":"2025-12-05T11:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.250940 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd4640d76c7ecaa13523f23448f53bc51afe8279734a6bc254b403cf436fe0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7619d39a9f2a46d92fc3ec5ab36f60ec2af3a789b10decb4cbe1cdaef58d28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:29Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.261108 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96338136-6831-49d0-9eb9-77d1205c6afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b662bafa4fee3b2c7cd96bc49d6054b0fbe349e86ce1d4faf967edbaf8f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb8a3a71da68a51eabe1aaa24fffd1b84c35cb01945642622f4100125ee26223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpgln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:29Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.269107 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-gt7x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2515d84f-5782-48a0-9d7e-9704baebe26c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60a4a422ae6f3c2c05e5dc721674edbb0efda81d198ede925d3dd1cf99e09b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxkb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gt7x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:29Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.280984 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c630804aeb54582259498705c520ab6d2c191844861069ba7090226028a5716f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-92qpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:29Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.294791 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:29Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.305672 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:29Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.315292 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gxbp8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e177b5c0a61aeedb5c6bf5f6340f212506106592ad10807ed29d83a4a6949ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-9s79r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gxbp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:29Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.326302 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-x45qv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a135c32b-38e4-43f6-bbb1-d1b8e42156ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hszt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hszt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:49:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-x45qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:29Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.340044 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.340086 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.340095 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.340111 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.340121 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:29Z","lastTransitionTime":"2025-12-05T11:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.342393 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T11:48:54Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 11:48:54.822889 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 11:48:54.823032 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 11:48:54.823055 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 11:48:54.823033 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 11:48:54.823371 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764935318\\\\\\\\\\\\\\\" (2025-12-05 11:48:38 +0000 UTC to 2026-01-04 11:48:39 +0000 UTC (now=2025-12-05 11:48:54.823293564 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823498 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\"\\\\nI1205 11:48:54.823746 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764935329\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764935329\\\\\\\\\\\\\\\" (2025-12-05 10:48:49 +0000 UTC to 2026-12-05 10:48:49 +0000 UTC (now=2025-12-05 11:48:54.823700006 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823836 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 11:48:54.823876 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 11:48:54.823905 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1205 11:48:54.826863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:29Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.354824 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04d4b8c3-b148-45da-b0d1-412870026308\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://745228c1f753278be19f5bb7197fd2dd01178d667213f59c8933c18a5d666c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ff65480597f13016cbd96d14661a9cd428eff4ca644c88451ef9c28b1e015f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9316ee7cdcfff2c1c600186568802e8c4d39ddcb029e17d8f109bb9f813e8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54939667d3da9326369dc17207cec7eb95f69fe46040d3dd06d10dd3df8cce0\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54939667d3da9326369dc17207cec7eb95f69fe46040d3dd06d10dd3df8cce0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:29Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.366703 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafa237bfeb83d5895f8ad73cf881e1fd946ec837402ea271d4aa1b75469da0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:29Z is after 
2025-08-24T17:21:41Z" Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.377866 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:29Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.389445 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38a2b47fe97303fc60f6ed0de5927b73993234170c4fdb02b45f0762978e292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:29Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.406250 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b42a5472-7487-4146-87a1-b83999821399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://455df6b1dbc6c362e75d34ddc09347b0c7b31571afb56538b9b7768a5abb50d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5753659f8a4501c4eb8aee2d090eae61f2d0f6d9d03baec4f2359d2fecbce39\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T11:49:08Z\\\",\\\"message\\\":\\\" 6186 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1205 11:49:06.861136 6186 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 11:49:06.861199 6186 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 11:49:06.861223 6186 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 11:49:06.861239 6186 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 11:49:06.861255 6186 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 11:49:06.861286 6186 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 11:49:06.861292 6186 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 11:49:06.861309 6186 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 11:49:06.861327 6186 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 11:49:06.861339 6186 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 11:49:06.861363 6186 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 11:49:06.861385 6186 factory.go:656] Stopping watch factory\\\\nI1205 11:49:06.861409 6186 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 11:49:06.861432 6186 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 11:49:06.861458 6186 ovnkube.go:599] Stopped ovnkube\\\\nI1205 11:49:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbr2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:29Z is after 2025-08-24T17:21:41Z"
Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.442076 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.442108 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.442116 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.442129 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.442137 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:29Z","lastTransitionTime":"2025-12-05T11:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.544245 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.544305 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.544320 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.544342 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.544357 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:29Z","lastTransitionTime":"2025-12-05T11:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.646570 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.646622 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.646633 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.646654 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.646665 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:29Z","lastTransitionTime":"2025-12-05T11:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.749171 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.749214 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.749225 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.749240 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.749250 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:29Z","lastTransitionTime":"2025-12-05T11:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.783708 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.783733 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.783738 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.783811 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv"
Dec 05 11:49:29 crc kubenswrapper[4763]: E1205 11:49:29.783922 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 11:49:29 crc kubenswrapper[4763]: E1205 11:49:29.784026 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 11:49:29 crc kubenswrapper[4763]: E1205 11:49:29.784123 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 11:49:29 crc kubenswrapper[4763]: E1205 11:49:29.784269 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab"
Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.852050 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.852093 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.852104 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.852120 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.852132 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:29Z","lastTransitionTime":"2025-12-05T11:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.955053 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.955100 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.955115 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.955133 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:29 crc kubenswrapper[4763]: I1205 11:49:29.955144 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:29Z","lastTransitionTime":"2025-12-05T11:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.057459 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.057525 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.057537 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.057584 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.057597 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:30Z","lastTransitionTime":"2025-12-05T11:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.136888 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbr2p_b42a5472-7487-4146-87a1-b83999821399/ovnkube-controller/2.log"
Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.137967 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbr2p_b42a5472-7487-4146-87a1-b83999821399/ovnkube-controller/1.log"
Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.140783 4763 generic.go:334] "Generic (PLEG): container finished" podID="b42a5472-7487-4146-87a1-b83999821399" containerID="455df6b1dbc6c362e75d34ddc09347b0c7b31571afb56538b9b7768a5abb50d9" exitCode=1
Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.140836 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" event={"ID":"b42a5472-7487-4146-87a1-b83999821399","Type":"ContainerDied","Data":"455df6b1dbc6c362e75d34ddc09347b0c7b31571afb56538b9b7768a5abb50d9"}
Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.140920 4763 scope.go:117] "RemoveContainer" containerID="c5753659f8a4501c4eb8aee2d090eae61f2d0f6d9d03baec4f2359d2fecbce39"
Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.142098 4763 scope.go:117] "RemoveContainer" containerID="455df6b1dbc6c362e75d34ddc09347b0c7b31571afb56538b9b7768a5abb50d9"
Dec 05 11:49:30 crc kubenswrapper[4763]: E1205 11:49:30.142424 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-xbr2p_openshift-ovn-kubernetes(b42a5472-7487-4146-87a1-b83999821399)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" podUID="b42a5472-7487-4146-87a1-b83999821399"
Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.159084 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b31fb483-2a86-4e31-a611-ef92a9a4840f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452ef64735a1911b994bd2bfbf1702e3ff68648cd0b834c254dd24f0b17cd9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0370cc7bfd0ce95636924667b4e82dfffd0eee7f07b78d683b386bfee5dc47b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1951ae154cb3c61f4cf3cadec35e19591efe1f768993d1e51713195175547b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a3a0d8531f8696597e2389d8cd7b50130de93ac8be572b329ebe2eff8c6b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:30Z is after 2025-08-24T17:21:41Z"
Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.160728 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.160775 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.160786 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.160803 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.160812 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:30Z","lastTransitionTime":"2025-12-05T11:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.177125 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kwkp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"737ae453-c22e-41ea-a10e-7e8f1f165467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q79tg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kwkp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:30Z is after 2025-08-24T17:21:41Z"
Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.191480 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptcsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d71e854a-fd6a-4efa-9cf4-a5dc75a1901f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40808a5c1a367c907ecf165483a1cc5fc59027c61342399abb612be6fed84fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g4jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d865b70b752e983282ac08c190f254185cbf7bba8d13c0238261fd7b880ac651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g4jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:49:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ptcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:30Z is after 2025-08-24T17:21:41Z"
Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.214538 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78167d26-8492-4035-99de-987c99ab12e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b874a765fa9a075a2662b615419d246c610bf2eea128d4bf53c16bda49f38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd875ae389a8e48a2f958f87ef711a4053a54da61f415ef9a380f8b3de9d7012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12b24f5f9e73a1f4871d8d666bf6aabef79ec707d3df055e3e33a1a158dce464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25460658bdce8262dc2a0269161e8377dcb37f440c09c8eea9f7b397c9d3a5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8aab7caa6c02035af6faae1817d239b12bc2afcb11c7e14c4aea1df60372dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:30Z is after 2025-08-24T17:21:41Z"
Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.232115 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd4640d76c7ecaa13523f23448f53bc51afe8279734a6bc254b403cf436fe0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7619d39a9f2a46d92fc3ec5ab36f60ec2af3a789b10decb4cbe1cdaef58d28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:30Z is after 2025-08-24T17:21:41Z"
Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.244787 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96338136-6831-49d0-9eb9-77d1205c6afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b662bafa4fee3b2c7cd96bc49d6054b0fbe349e86ce1d4faf967edbaf8f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb8a3a71da68a51eabe1aaa24fffd1b84c35cb01945642622f4100125ee26223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpgln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:30Z is after 2025-08-24T17:21:41Z"
Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.255109 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gt7x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2515d84f-5782-48a0-9d7e-9704baebe26c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60a4a422ae6f3c2c05e5dc721674edbb0efda81d198ede925d3dd1cf99e09b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxkb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gt7x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:30Z is after 2025-08-24T17:21:41Z"
Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.263210 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.263235 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.263242 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.263254 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.263263 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:30Z","lastTransitionTime":"2025-12-05T11:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.273816 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c630804aeb54582259498705c520ab6d2c191844861069ba7090226028a5716f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\
"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-92qpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:30Z is after 
2025-08-24T17:21:41Z" Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.287533 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:30Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.303921 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:30Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.313921 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gxbp8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e177b5c0a61aeedb5c6bf5f6340f212506106592ad10807ed29d83a4a6949ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-9s79r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gxbp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:30Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.323207 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-x45qv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a135c32b-38e4-43f6-bbb1-d1b8e42156ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hszt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hszt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:49:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-x45qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:30Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.339253 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b42a5472-7487-4146-87a1-b83999821399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://455df6b1dbc6c362e75d34ddc09347b0c7b31571
afb56538b9b7768a5abb50d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5753659f8a4501c4eb8aee2d090eae61f2d0f6d9d03baec4f2359d2fecbce39\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T11:49:08Z\\\",\\\"message\\\":\\\" 6186 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1205 11:49:06.861136 6186 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 11:49:06.861199 6186 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 11:49:06.861223 6186 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 11:49:06.861239 6186 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 11:49:06.861255 6186 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 11:49:06.861286 6186 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 11:49:06.861292 6186 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 11:49:06.861309 6186 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 11:49:06.861327 6186 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 11:49:06.861339 6186 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 11:49:06.861363 6186 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 11:49:06.861385 6186 factory.go:656] Stopping watch factory\\\\nI1205 11:49:06.861409 6186 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 11:49:06.861432 6186 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 11:49:06.861458 6186 ovnkube.go:599] Stopped ovnkube\\\\nI1205 11:49:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455df6b1dbc6c362e75d34ddc09347b0c7b31571afb56538b9b7768a5abb50d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T11:49:29Z\\\",\\\"message\\\":\\\"ring]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/downloads\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.213\\\\\\\", Port:80, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 11:49:29.602160 6459 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI1205 11:49:29.602174 6459 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nF1205 11:49:29.602174 6459 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has 
stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\
\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbr2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:30Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.353078 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T11:48:54Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 11:48:54.822889 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 11:48:54.823032 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 11:48:54.823055 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 11:48:54.823033 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 11:48:54.823371 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764935318\\\\\\\\\\\\\\\" (2025-12-05 11:48:38 +0000 UTC to 2026-01-04 11:48:39 +0000 UTC (now=2025-12-05 11:48:54.823293564 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823498 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\"\\\\nI1205 11:48:54.823746 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764935329\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764935329\\\\\\\\\\\\\\\" (2025-12-05 10:48:49 +0000 UTC to 2026-12-05 10:48:49 +0000 UTC (now=2025-12-05 11:48:54.823700006 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823836 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 11:48:54.823876 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 11:48:54.823905 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1205 11:48:54.826863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:30Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.365222 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.365259 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.365271 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.365289 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.365299 4763 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:30Z","lastTransitionTime":"2025-12-05T11:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.367563 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04d4b8c3-b148-45da-b0d1-412870026308\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://745228c1f753278be19f5bb7197fd2dd01178d667213f59c8933c18a5d666c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ff65480597f13016cbd96d14661a9cd428eff4ca644c88451ef9c28b1e015f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9316ee7cdcfff2c1c600186568802e8c4d39ddcb029e17d8f109bb9f813e8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54939667d3da9326369dc17207cec7eb95f69fe46040d3dd06d10dd3df8cce0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54939667d3da9326369dc17207cec7eb95f69fe46040d3dd06d10dd3df8cce0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:30Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.383923 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafa237bfeb83d5895f8ad73cf881e1fd946ec837402ea271d4aa1b75469da0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:30Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.398677 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:30Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.410462 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38a2b47fe97303fc60f6ed0de5927b73993234170c4fdb02b45f0762978e292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:30Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.468416 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.468453 4763 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.468462 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.468477 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.468486 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:30Z","lastTransitionTime":"2025-12-05T11:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.572085 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.572144 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.572161 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.572184 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.572200 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:30Z","lastTransitionTime":"2025-12-05T11:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.675420 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.675475 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.675493 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.675518 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.675536 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:30Z","lastTransitionTime":"2025-12-05T11:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.778917 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.778999 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.779025 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.779054 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.779073 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:30Z","lastTransitionTime":"2025-12-05T11:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.881836 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.881931 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.881969 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.881999 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.882020 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:30Z","lastTransitionTime":"2025-12-05T11:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.985048 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.985145 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.985189 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.985220 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:30 crc kubenswrapper[4763]: I1205 11:49:30.985244 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:30Z","lastTransitionTime":"2025-12-05T11:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.088036 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.088086 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.088104 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.088126 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.088142 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:31Z","lastTransitionTime":"2025-12-05T11:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.147582 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbr2p_b42a5472-7487-4146-87a1-b83999821399/ovnkube-controller/2.log" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.153571 4763 scope.go:117] "RemoveContainer" containerID="455df6b1dbc6c362e75d34ddc09347b0c7b31571afb56538b9b7768a5abb50d9" Dec 05 11:49:31 crc kubenswrapper[4763]: E1205 11:49:31.153870 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-xbr2p_openshift-ovn-kubernetes(b42a5472-7487-4146-87a1-b83999821399)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" podUID="b42a5472-7487-4146-87a1-b83999821399" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.175227 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd4640d76c7ecaa13523f23448f53bc51afe8279734a6bc254b403cf436fe0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7619d39a9f2a46d92fc3ec5ab36f60ec2af3a789b10decb4cbe1cdaef58d28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:31Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.192012 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.192067 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.192083 4763 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.192105 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.192119 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:31Z","lastTransitionTime":"2025-12-05T11:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.193407 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96338136-6831-49d0-9eb9-77d1205c6afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b662bafa4fee3b2c7cd96bc49d6054b0fbe349e86ce1d4faf967edbaf8f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb8a3a71da68a51eabe1aaa24fffd1b84c35cb01945642622f4100125ee26223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpgln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:31Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.205847 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gt7x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2515d84f-5782-48a0-9d7e-9704baebe26c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60a4a422ae6f3c2c05e5dc721674edbb0efda81d198ede925d3dd1cf99e09b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxkb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gt7x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:31Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.222062 4763 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c630804aeb54582259498705c520ab6d2c191844861069ba7090226028a5716f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c
857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-
release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-92qpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:31Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.252522 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78167d26-8492-4035-99de-987c99ab12e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b874a765fa9a075a2662b615419d246c610bf2eea128d4bf53c16bda49f38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd875ae389a8e48a2f958f87ef711a4053a54da61f415ef9a380f8b3de9d7012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12b24f5f9e73a1f4871d8d666bf6aabef79ec707d3df055e3e33a1a158dce464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25460658bdce8262dc2a0269161e8377dcb37f4
40c09c8eea9f7b397c9d3a5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8aab7caa6c02035af6faae1817d239b12bc2afcb11c7e14c4aea1df60372dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:31Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.267645 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:31Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.279946 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gxbp8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e177b5c0a61aeedb5c6bf5f6340f212506106592ad10807ed29d83a4a6949ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s79r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gxbp8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:31Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.290722 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-x45qv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a135c32b-38e4-43f6-bbb1-d1b8e42156ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hszt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hszt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:49:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-x45qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:31Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:31 crc 
kubenswrapper[4763]: I1205 11:49:31.294354 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.294537 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.294679 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.294830 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.294972 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:31Z","lastTransitionTime":"2025-12-05T11:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.301750 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:31Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.315560 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04d4b8c3-b148-45da-b0d1-412870026308\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://745228c1f753278be19f5bb7197fd2dd01178d667213f59c8933c18a5d666c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ff65480597f13016cbd96d14661a9cd428eff4ca644c88451ef9c28b1e015f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9316ee7cdcfff2c1c600186568802e8c4d39ddcb029e17d8f109bb9f813e8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54939667d3da9326369dc17207cec7eb95f69fe46040d3dd06d10dd3df8cce0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54939667d3da9326369dc17207cec7eb95f69fe46040d3dd06d10dd3df8cce0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:31Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.328714 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafa237bfeb83d5895f8ad73cf881e1fd946ec837402ea271d4aa1b75469da0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:31Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.338601 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:31Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.350536 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38a2b47fe97303fc60f6ed0de5927b73993234170c4fdb02b45f0762978e292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:31Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.374005 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b42a5472-7487-4146-87a1-b83999821399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://455df6b1dbc6c362e75d34ddc09347b0c7b31571afb56538b9b7768a5abb50d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455df6b1dbc6c362e75d34ddc09347b0c7b31571afb56538b9b7768a5abb50d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T11:49:29Z\\\",\\\"message\\\":\\\"ring]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/downloads\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.213\\\\\\\", Port:80, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 11:49:29.602160 6459 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI1205 11:49:29.602174 6459 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nF1205 11:49:29.602174 6459 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xbr2p_openshift-ovn-kubernetes(b42a5472-7487-4146-87a1-b83999821399)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbr2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:31Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.388386 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T11:48:54Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 11:48:54.822889 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 11:48:54.823032 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 11:48:54.823055 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 11:48:54.823033 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 11:48:54.823371 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764935318\\\\\\\\\\\\\\\" (2025-12-05 11:48:38 +0000 UTC to 2026-01-04 11:48:39 +0000 UTC (now=2025-12-05 11:48:54.823293564 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823498 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\"\\\\nI1205 11:48:54.823746 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764935329\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764935329\\\\\\\\\\\\\\\" (2025-12-05 10:48:49 +0000 UTC to 2026-12-05 10:48:49 +0000 UTC (now=2025-12-05 11:48:54.823700006 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823836 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 11:48:54.823876 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 11:48:54.823905 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1205 11:48:54.826863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:31Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.397085 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.397127 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.397140 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.397159 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.397173 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:31Z","lastTransitionTime":"2025-12-05T11:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.399500 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b31fb483-2a86-4e31-a611-ef92a9a4840f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452ef64735a1911b994bd2bfbf1702e3ff68648cd0b834c254dd24f0b17cd9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0370cc7bfd0ce95636924667b4e82dfffd0eee7f07b78d683b386bfee5dc47b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35
825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1951ae154cb3c61f4cf3cadec35e19591efe1f768993d1e51713195175547b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a3a0d8531f8696597e2389d8cd7b50130de93ac8be572b329ebe2eff8c6b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:31Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.409417 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kwkp4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"737ae453-c22e-41ea-a10e-7e8f1f165467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q79tg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kwkp4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:31Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.418991 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptcsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d71e854a-fd6a-4efa-9cf4-a5dc75a1901f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40808a5c1a367c907ecf165483a1cc5fc59027c61342399abb612be6fed84fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g4jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d865b70b752e983282ac08c190f254185cbf7bba8d13c0238261fd7b880ac651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g4jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:49:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ptcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:31Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.500352 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.500417 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.500436 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.500460 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.500478 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:31Z","lastTransitionTime":"2025-12-05T11:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.603093 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.603124 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.603132 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.603146 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.603156 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:31Z","lastTransitionTime":"2025-12-05T11:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.705526 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.705571 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.705588 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.705610 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.705628 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:31Z","lastTransitionTime":"2025-12-05T11:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.784099 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv"
Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.784233 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.784112 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 11:49:31 crc kubenswrapper[4763]: E1205 11:49:31.784255 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab"
Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.784307 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 11:49:31 crc kubenswrapper[4763]: E1205 11:49:31.784314 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 11:49:31 crc kubenswrapper[4763]: E1205 11:49:31.784456 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
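
Note: the "No sandbox for pod can be found" and "Error syncing pod, skipping" entries are the downstream effect of the NotReady condition: while the runtime reports NetworkReady=false, the kubelet refuses to create new pod sandboxes, so the four pods named above stay pending until a CNI configuration shows up. A rough Go sketch of the presence check implied by "no CNI configuration file in /etc/kubernetes/cni/net.d/" follows; the extension patterns are illustrative, not a definitive list of what the runtime accepts:

    package main

    import (
        "fmt"
        "path/filepath"
    )

    func main() {
        confDir := "/etc/kubernetes/cni/net.d"
        var found []string
        // Glob only errors on a malformed pattern, which these literals are not.
        for _, pat := range []string{"*.conf", "*.conflist", "*.json"} {
            matches, _ := filepath.Glob(filepath.Join(confDir, pat))
            found = append(found, matches...)
        }
        if len(found) == 0 {
            fmt.Printf("NetworkReady=false: no CNI configuration file in %s\n", confDir)
            return
        }
        fmt.Println("CNI config present:", found)
    }

On this cluster the configuration is presumably written by ovnkube once the crashing ovnkube-controller container stays up, so the condition should clear on its own when the webhook certificate problem is resolved.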
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:49:31 crc kubenswrapper[4763]: E1205 11:49:31.784685 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.808749 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.808835 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.808851 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.808873 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.808889 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:31Z","lastTransitionTime":"2025-12-05T11:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.911753 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.911816 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.911828 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.911847 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:31 crc kubenswrapper[4763]: I1205 11:49:31.911859 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:31Z","lastTransitionTime":"2025-12-05T11:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 05 11:49:32 crc kubenswrapper[4763]: I1205 11:49:32.014185 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:32 crc kubenswrapper[4763]: I1205 11:49:32.014257 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:32 crc kubenswrapper[4763]: I1205 11:49:32.014277 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:32 crc kubenswrapper[4763]: I1205 11:49:32.014301 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:32 crc kubenswrapper[4763]: I1205 11:49:32.014321 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:32Z","lastTransitionTime":"2025-12-05T11:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
[the same five-entry sequence repeats, identical apart from timestamps, at 11:49:32.117418, 11:49:32.220148, 11:49:32.323305, 11:49:32.426371, 11:49:32.529826, 11:49:32.634597, 11:49:32.741455, 11:49:32.844528, 11:49:32.947936, and 11:49:33.051196]
Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.153586 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.153633 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.153644 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.153660 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.153669 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:33Z","lastTransitionTime":"2025-12-05T11:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.256171 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.256213 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.256224 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.256242 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.256255 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:33Z","lastTransitionTime":"2025-12-05T11:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.358703 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.358753 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.358788 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.358807 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.358820 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:33Z","lastTransitionTime":"2025-12-05T11:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.461201 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.461243 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.461255 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.461272 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.461283 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:33Z","lastTransitionTime":"2025-12-05T11:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.563423 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.563464 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.563479 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.563506 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.563518 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:33Z","lastTransitionTime":"2025-12-05T11:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.665947 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.666011 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.666027 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.666047 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.666066 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:33Z","lastTransitionTime":"2025-12-05T11:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.768250 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.768308 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.768324 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.768345 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.768359 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:33Z","lastTransitionTime":"2025-12-05T11:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.783689 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.783778 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.783794 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.783718 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:49:33 crc kubenswrapper[4763]: E1205 11:49:33.783904 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:49:33 crc kubenswrapper[4763]: E1205 11:49:33.784107 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:49:33 crc kubenswrapper[4763]: E1205 11:49:33.784301 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:49:33 crc kubenswrapper[4763]: E1205 11:49:33.784389 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.871054 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.871213 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.871230 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.871254 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.871273 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:33Z","lastTransitionTime":"2025-12-05T11:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.974653 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.974710 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.974718 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.974733 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:33 crc kubenswrapper[4763]: I1205 11:49:33.974744 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:33Z","lastTransitionTime":"2025-12-05T11:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.077819 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.077894 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.077914 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.077941 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.077960 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:34Z","lastTransitionTime":"2025-12-05T11:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.180879 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.180916 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.180945 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.180959 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.180968 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:34Z","lastTransitionTime":"2025-12-05T11:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.283735 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.283825 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.283841 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.283859 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.283871 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:34Z","lastTransitionTime":"2025-12-05T11:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.386647 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.386717 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.386726 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.386755 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.386782 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:34Z","lastTransitionTime":"2025-12-05T11:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.489924 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.489980 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.489996 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.490018 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.490031 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:34Z","lastTransitionTime":"2025-12-05T11:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.593794 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.593857 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.593868 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.593888 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.593900 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:34Z","lastTransitionTime":"2025-12-05T11:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.697192 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.697270 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.697282 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.697306 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.697319 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:34Z","lastTransitionTime":"2025-12-05T11:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.799969 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.800058 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.800102 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.800135 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.800159 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:34Z","lastTransitionTime":"2025-12-05T11:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.903164 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.903207 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.903217 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.903235 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:34 crc kubenswrapper[4763]: I1205 11:49:34.903247 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:34Z","lastTransitionTime":"2025-12-05T11:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.005389 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.005434 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.005446 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.005464 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.005478 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:35Z","lastTransitionTime":"2025-12-05T11:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.108471 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.108542 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.108558 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.108584 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.108602 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:35Z","lastTransitionTime":"2025-12-05T11:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.212408 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.212447 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.212456 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.212469 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.212479 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:35Z","lastTransitionTime":"2025-12-05T11:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.315562 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.315936 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.316101 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.316198 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.316281 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:35Z","lastTransitionTime":"2025-12-05T11:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.418384 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.418439 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.418455 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.418478 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.418497 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:35Z","lastTransitionTime":"2025-12-05T11:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.521178 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.521243 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.521260 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.521282 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.521301 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:35Z","lastTransitionTime":"2025-12-05T11:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.624051 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.624117 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.624140 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.624167 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.624187 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:35Z","lastTransitionTime":"2025-12-05T11:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.726929 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.726999 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.727018 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.727049 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.727066 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:35Z","lastTransitionTime":"2025-12-05T11:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.783120 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:49:35 crc kubenswrapper[4763]: E1205 11:49:35.783328 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.783359 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.783415 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.783475 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:49:35 crc kubenswrapper[4763]: E1205 11:49:35.783647 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:49:35 crc kubenswrapper[4763]: E1205 11:49:35.783797 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:49:35 crc kubenswrapper[4763]: E1205 11:49:35.783932 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.807402 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kwkp4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"737ae453-c22e-41ea-a10e-7e8f1f165467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q79tg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kwkp4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:35Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.822967 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptcsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d71e854a-fd6a-4efa-9cf4-a5dc75a1901f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40808a5c1a367c907ecf165483a1cc5fc59027c61342399abb612be6fed84fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g4jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d865b70b752e983282ac08c190f254185cbf7bba8d13c0238261fd7b880ac651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g4jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:49:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ptcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:35Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.830002 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.830049 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.830066 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.830085 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.830099 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:35Z","lastTransitionTime":"2025-12-05T11:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.840967 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b31fb483-2a86-4e31-a611-ef92a9a4840f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452ef64735a1911b994bd2bfbf1702e3ff68648cd0b834c254dd24f0b17cd9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0370cc7bfd0ce95636924667b4e82dfffd0eee7f07b78d683b386bfee5dc47b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1951ae154cb3c61f4cf3cadec35e19591efe1f768993d1e51713195175547b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a3a0d8531f8696597e2389d8cd7b50130de93ac8be572b329ebe2eff8c6b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:35Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.856159 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96338136-6831-49d0-9eb9-77d1205c6afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b662bafa4fee3b2c7cd96bc49d6054b0fbe349e86ce1d4faf967edbaf8f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb8a3a71da68a51eab
e1aaa24fffd1b84c35cb01945642622f4100125ee26223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpgln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:35Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.870067 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gt7x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2515d84f-5782-48a0-9d7e-9704baebe26c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60a4a422ae6f3c2c05e5dc721674edbb0efda81d198ede925d3dd1cf99e09b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxkb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\
\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gt7x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:35Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.884237 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c630804aeb54582259498705c520ab6d2c191844861069ba7090226028a5716f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"
name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eea
de0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48
:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-92qpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:35Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.907356 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78167d26-8492-4035-99de-987c99ab12e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b874a765fa9a075a2662b615419d246c610bf2eea128d4bf53c16bda49f38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd875ae389a8e48a2f958f87ef711a4053a54da61f415ef9a380f8b3de9d7012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12b24f5f9e73a1f4871d8d666bf6aabef79ec707d3df055e3e33a1a158dce464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25460658bdce8262dc2a0269161e8377dcb37f440c09c8eea9f7b397c9d3a5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8aab7caa6c02035af6faae1817d239b12bc2afcb11c7e14c4aea1df60372dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0
7b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:35Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.922795 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd4640d76c7ecaa13523f23448f53bc51afe8279734a6bc254b403cf436fe0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7619d39a9f2a46d92fc3ec5ab36f60ec2af3a789b10decb4cbe1cdaef58d28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:35Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.933027 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.933082 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.933106 4763 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.933139 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.933163 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:35Z","lastTransitionTime":"2025-12-05T11:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.934080 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gxbp8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e177b5c0a61aeedb5c6bf5f6340f212506106592ad10807ed29d83a4a6949ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s79r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gxbp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:35Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.945295 4763 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/network-metrics-daemon-x45qv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a135c32b-38e4-43f6-bbb1-d1b8e42156ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hszt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hszt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:49:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-x45qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:35Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.956659 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:35Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.968366 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:35Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.980683 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafa237bfeb83d5895f8ad73cf881e1fd946ec837402ea271d4aa1b75469da0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:35Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:35 crc kubenswrapper[4763]: I1205 11:49:35.995463 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:35Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.006952 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38a2b47fe97303fc60f6ed0de5927b73993234170c4fdb02b45f0762978e292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:36Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.027333 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b42a5472-7487-4146-87a1-b83999821399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://455df6b1dbc6c362e75d34ddc09347b0c7b31571afb56538b9b7768a5abb50d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455df6b1dbc6c362e75d34ddc09347b0c7b31571afb56538b9b7768a5abb50d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T11:49:29Z\\\",\\\"message\\\":\\\"ring]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/downloads\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.213\\\\\\\", Port:80, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 11:49:29.602160 6459 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI1205 11:49:29.602174 6459 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nF1205 11:49:29.602174 6459 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xbr2p_openshift-ovn-kubernetes(b42a5472-7487-4146-87a1-b83999821399)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbr2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:36Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.035434 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.035687 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.035794 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.035915 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.035989 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:36Z","lastTransitionTime":"2025-12-05T11:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
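The status patch above embeds a full Kubernetes pod status for ovnkube-node-xbr2p: one container is waiting in CrashLoopBackOff while sbdb is running normally. That crash loop matters for everything that follows, since this pod is what would write the node's CNI configuration. A minimal sketch, assuming the standard containerStatuses shape, for flagging crash-looping containers in a status object of this form (the dict is a hand-trimmed stand-in and the name "ovnkube-controller" is illustrative; the log only names the pod):

    # Hand-trimmed stand-in for the status object embedded in the patch above;
    # "ovnkube-controller" is an illustrative name, the log only names the pod.
    status = {
        "containerStatuses": [
            {"name": "ovnkube-controller",
             "state": {"waiting": {"reason": "CrashLoopBackOff"}}},
            {"name": "sbdb",
             "state": {"running": {"startedAt": "2025-12-05T11:49:00Z"}}},
        ]
    }

    # Flag containers sitting in CrashLoopBackOff.
    for cs in status.get("containerStatuses", []):
        reason = cs.get("state", {}).get("waiting", {}).get("reason")
        if reason == "CrashLoopBackOff":
            print(f"{cs['name']} is crash-looping")
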
Has your network provider started?"} Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.039667 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T11:48:54Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 11:48:54.822889 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 11:48:54.823032 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 11:48:54.823055 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 11:48:54.823033 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 11:48:54.823371 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764935318\\\\\\\\\\\\\\\" (2025-12-05 11:48:38 +0000 UTC to 2026-01-04 11:48:39 +0000 UTC (now=2025-12-05 11:48:54.823293564 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823498 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\"\\\\nI1205 11:48:54.823746 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764935329\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764935329\\\\\\\\\\\\\\\" (2025-12-05 10:48:49 +0000 UTC to 2026-12-05 10:48:49 +0000 UTC (now=2025-12-05 11:48:54.823700006 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823836 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 11:48:54.823876 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 11:48:54.823905 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1205 11:48:54.826863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:36Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.051146 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04d4b8c3-b148-45da-b0d1-412870026308\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://745228c1f753278be19f5bb7197fd2dd01178d667213f59c8933c18a5d666c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ff65480597f13016cbd96d14661a9cd428eff4ca644c88451ef9c28b1e015f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9316ee7cdcfff2c1c600186568802e8c4d39ddcb029e17d8f109bb9f813e8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54939667d3da9326369dc17207cec7eb95f69fe46040d3dd06d10dd3df8cce0\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54939667d3da9326369dc17207cec7eb95f69fe46040d3dd06d10dd3df8cce0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:36Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.138353 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.138690 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.138837 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.138938 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.139034 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:36Z","lastTransitionTime":"2025-12-05T11:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
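Three consecutive status patches (ovnkube-node-xbr2p, kube-apiserver-crc, openshift-kube-scheduler-crc) fail with the identical root cause: the serving certificate of the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 expired on 2025-08-24T17:21:41Z, while the node clock reads 2025-12-05. A quick sketch for confirming the validity window from the node itself, assuming the third-party cryptography package (version 42 or later for the *_utc accessors) and that the endpoint from the log is reachable where this runs:

    import socket
    import ssl

    from cryptography import x509  # third-party; >= 42 for the *_utc accessors

    HOST, PORT = "127.0.0.1", 9743  # webhook endpoint taken from the log lines

    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.check_hostname = False       # inspection only: the cert is expired,
    ctx.verify_mode = ssl.CERT_NONE  # so verification would fail by design

    with socket.create_connection((HOST, PORT), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
            der = tls.getpeercert(binary_form=True)

    cert = x509.load_der_x509_certificate(der)
    print("subject:   ", cert.subject.rfc4514_string())
    print("not before:", cert.not_valid_before_utc)
    print("not after: ", cert.not_valid_after_utc)  # expect 2025-08-24 17:21:41+00:00
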
Has your network provider started?"} Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.242088 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.242460 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.242565 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.242697 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.242732 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:36Z","lastTransitionTime":"2025-12-05T11:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.345337 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.345662 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.345680 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.345703 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.345721 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:36Z","lastTransitionTime":"2025-12-05T11:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.447932 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.447967 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.447980 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.447995 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.448007 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:36Z","lastTransitionTime":"2025-12-05T11:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.551057 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.551089 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.551099 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.551115 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.551128 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:36Z","lastTransitionTime":"2025-12-05T11:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.654098 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.654163 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.654182 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.654237 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.654257 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:36Z","lastTransitionTime":"2025-12-05T11:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.756610 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.756795 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.756885 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.756960 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.757053 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:36Z","lastTransitionTime":"2025-12-05T11:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.860979 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.861030 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.861043 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.861061 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.861073 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:36Z","lastTransitionTime":"2025-12-05T11:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.963494 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.963829 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.963951 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.964082 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:36 crc kubenswrapper[4763]: I1205 11:49:36.964340 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:36Z","lastTransitionTime":"2025-12-05T11:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.067638 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.067682 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.067692 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.067711 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.067722 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:37Z","lastTransitionTime":"2025-12-05T11:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.170961 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.171027 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.171048 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.171073 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.171095 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:37Z","lastTransitionTime":"2025-12-05T11:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.274050 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.274127 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.274150 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.274179 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.274203 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:37Z","lastTransitionTime":"2025-12-05T11:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.377284 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.377338 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.377349 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.377368 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.377379 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:37Z","lastTransitionTime":"2025-12-05T11:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.479652 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.479687 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.479721 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.479742 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.479755 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:37Z","lastTransitionTime":"2025-12-05T11:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.582145 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.582198 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.582213 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.582234 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.582248 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:37Z","lastTransitionTime":"2025-12-05T11:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.685381 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.685421 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.685431 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.685447 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.685456 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:37Z","lastTransitionTime":"2025-12-05T11:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.785015 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.785052 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.785051 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.785037 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:49:37 crc kubenswrapper[4763]: E1205 11:49:37.785173 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:49:37 crc kubenswrapper[4763]: E1205 11:49:37.785216 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:49:37 crc kubenswrapper[4763]: E1205 11:49:37.785354 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:49:37 crc kubenswrapper[4763]: E1205 11:49:37.785507 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.787815 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.787872 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.787894 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.787922 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.787944 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:37Z","lastTransitionTime":"2025-12-05T11:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.890251 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.890294 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.890302 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.890316 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.890325 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:37Z","lastTransitionTime":"2025-12-05T11:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
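From here the kubelet settles into a tight loop: roughly every 100 ms it re-records the same five node events and re-marks the node NotReady because no CNI configuration file exists in /etc/kubernetes/cni/net.d/, and pods that need a new sandbox (network-check-target-xd92c, network-check-source-55646444c4-trplf, network-metrics-daemon-x45qv, networking-console-plugin-85b44fc459-gdk6g) are skipped with the same error. A small sketch that tallies that cadence from a journal export, assuming a plain-text file (the kubelet.log filename and the export command in the comment are illustrative):

    import re
    from collections import Counter

    # klog prefix: I1205 11:49:36.035989 ... followed by the quoted message.
    PATTERN = re.compile(r'I\d{4} (\d{2}:\d{2}:\d{2})\.\d+ .*"Node became not ready"')

    counts = Counter()
    # e.g. journalctl --no-pager > kubelet.log on the node (illustrative)
    with open("kubelet.log", encoding="utf-8") as fh:
        for line in fh:
            m = PATTERN.search(line)
            if m:
                counts[m.group(1)] += 1  # bucket by HH:MM:SS

    for second, n in sorted(counts.items()):
        print(f"{second}  {n}x")
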
Has your network provider started?"} Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.993155 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.993226 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.993244 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.993270 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:37 crc kubenswrapper[4763]: I1205 11:49:37.993292 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:37Z","lastTransitionTime":"2025-12-05T11:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.096267 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.097491 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.097647 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.097828 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.097988 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:38Z","lastTransitionTime":"2025-12-05T11:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.200323 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.200730 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.200990 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.201188 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.201395 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:38Z","lastTransitionTime":"2025-12-05T11:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.305639 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.305740 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.306076 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.306110 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.306133 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:38Z","lastTransitionTime":"2025-12-05T11:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.392221 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.392281 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.392298 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.392323 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.392340 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:38Z","lastTransitionTime":"2025-12-05T11:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:38 crc kubenswrapper[4763]: E1205 11:49:38.409730 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"424a6449-0727-4c68-964f-19010b8ff35b\\\",\\\"systemUUID\\\":\\\"0f54ec5d-9350-4cf9-9a1e-213de5460351\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:38Z is after 
2025-08-24T17:21:41Z" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.413832 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.413867 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.413879 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.413897 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.413909 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:38Z","lastTransitionTime":"2025-12-05T11:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:38 crc kubenswrapper[4763]: E1205 11:49:38.430852 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"424a6449-0727-4c68-964f-19010b8ff35b\\\",\\\"systemUUID\\\":\\\"0f54ec5d-9350-4cf9-9a1e-213de5460351\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:38Z is after 
2025-08-24T17:21:41Z" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.434985 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.435023 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.435033 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.435048 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.435058 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:38Z","lastTransitionTime":"2025-12-05T11:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:38 crc kubenswrapper[4763]: E1205 11:49:38.448240 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status patch payload elided; identical to the 11:49:38.430852 attempt above] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:38Z is after 2025-08-24T17:21:41Z"
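The failing call above is not a reachability problem: the kubelet connects to https://127.0.0.1:9743/node, but the serving certificate of the node.network-node-identity.openshift.io webhook expired on 2025-08-24T17:21:41Z, months before the log's current time of 2025-12-05T11:49:38Z. A minimal Go sketch to confirm this from the node itself (a hypothetical diagnostic, not part of the cluster; the address comes from the Post URL in the error, and InsecureSkipVerify lets the handshake finish so the expired leaf can be read):

package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	// Dial the webhook endpoint named in the kubelet error.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatalf("dial failed: %v", err)
	}
	defer conn.Close()

	// Print the serving certificate's validity window.
	cert := conn.ConnectionState().PeerCertificates[0]
	fmt.Println("notBefore:", cert.NotBefore.UTC().Format(time.RFC3339))
	fmt.Println("notAfter: ", cert.NotAfter.UTC().Format(time.RFC3339))
	fmt.Println("expired:  ", time.Now().UTC().After(cert.NotAfter))
}

Run against this node it should print notAfter: 2025-08-24T17:21:41Z and expired: true; on a CRC guest that has been powered off past its certificates' lifetime this is expected until the cluster's certificate rotation catches up.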
Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.451800 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.451847 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.451859 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.452079 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.452094 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:38Z","lastTransitionTime":"2025-12-05T11:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:38 crc kubenswrapper[4763]: E1205 11:49:38.464903 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status patch payload elided; identical to the 11:49:38.430852 attempt above] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:38Z is after 2025-08-24T17:21:41Z"
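Independently of the webhook, the Ready condition itself is false because the kubelet finds no CNI configuration in /etc/kubernetes/cni/net.d/; on CRC that file is written by the OVN-Kubernetes pods, which cannot start while the control plane is still recovering. A small Go sketch of the same presence check (assuming nothing beyond the directory path named in the NetworkPluginNotReady message):

package main

import (
	"fmt"
	"log"
	"os"
	"path/filepath"
	"strings"
)

func main() {
	const dir = "/etc/kubernetes/cni/net.d" // path from the kubelet message
	entries, err := os.ReadDir(dir)
	if err != nil {
		log.Fatalf("cannot read %s: %v", dir, err)
	}
	found := false
	for _, e := range entries {
		// CNI runtimes pick up .conf, .conflist and .json files.
		switch strings.ToLower(filepath.Ext(e.Name())) {
		case ".conf", ".conflist", ".json":
			fmt.Println("found:", filepath.Join(dir, e.Name()))
			found = true
		}
	}
	if !found {
		fmt.Println("no CNI config; kubelet keeps reporting NetworkReady=false")
	}
}

As long as the directory stays empty, every status sync republishes the NodeNotReady condition seen throughout this log.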
Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.468535 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.468573 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.468587 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.468604 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.468616 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:38Z","lastTransitionTime":"2025-12-05T11:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:38 crc kubenswrapper[4763]: E1205 11:49:38.481124 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status patch payload elided; identical to the 11:49:38.430852 attempt above] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:38Z is after 2025-08-24T17:21:41Z"
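Each of the failed attempts carries the same strategic-merge-patch body: $setElementOrder/conditions fixes the order of the condition list, and each entry under conditions is merged into the existing list by its type key rather than replacing the list wholesale. An illustrative Go sketch of that merge rule (a deliberate simplification of apimachinery's strategic merge patch, not the real implementation):

package main

import "fmt"

type condition map[string]string

// merge applies patch entries over orig, keyed by "type", then orders the
// result according to the $setElementOrder list, as in the patches above.
func merge(orig, patch []condition, order []string) []condition {
	byType := map[string]condition{}
	for _, c := range orig {
		byType[c["type"]] = c
	}
	for _, p := range patch {
		if cur, ok := byType[p["type"]]; ok {
			for k, v := range p {
				cur[k] = v // update fields of the existing condition
			}
		} else {
			byType[p["type"]] = p // new condition type
		}
	}
	out := make([]condition, 0, len(order))
	for _, t := range order {
		if c, ok := byType[t]; ok {
			out = append(out, c)
		}
	}
	return out
}

func main() {
	orig := []condition{{"type": "Ready", "status": "True"}}
	patch := []condition{
		{"type": "MemoryPressure", "status": "False"},
		{"type": "Ready", "status": "False", "reason": "KubeletNotReady"},
	}
	order := []string{"MemoryPressure", "DiskPressure", "PIDPressure", "Ready"}
	for _, c := range merge(orig, patch, order) {
		fmt.Println(c)
	}
}

This is why the patch can restate the full images list and all four conditions on every retry: the server merges by key, so resending unchanged entries is harmless.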
Dec 05 11:49:38 crc kubenswrapper[4763]: E1205 11:49:38.481307 4763 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.482715 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.482742 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.482751 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.482813 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.482825 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:38Z","lastTransitionTime":"2025-12-05T11:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.585737 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.585829 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.585848 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.585873 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.585889 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:38Z","lastTransitionTime":"2025-12-05T11:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
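The "update node status exceeds retry count" entry marks the kubelet giving up for this sync iteration after five consecutive patch failures (the last of which is the 11:49:38.481124 attempt above); the loop simply starts over on the next node-status interval. A condensed sketch of that bounded-retry shape (the constant name and value mirror nodeStatusUpdateRetry = 5 in the kubelet source as of recent releases; the failing call is a stand-in):

package main

import (
	"errors"
	"fmt"
)

const nodeStatusUpdateRetry = 5 // kubelet's retry budget per sync iteration

// patchNodeStatus stands in for the PATCH that fails in the log: the webhook's
// serving certificate is expired, so every attempt returns the same error.
func patchNodeStatus() error {
	return errors.New(`failed calling webhook "node.network-node-identity.openshift.io": certificate has expired`)
}

func tryUpdateNodeStatus() error {
	for i := 0; i < nodeStatusUpdateRetry; i++ {
		err := patchNodeStatus()
		if err == nil {
			return nil
		}
		fmt.Printf("Error updating node status, will retry: %v\n", err)
	}
	return errors.New("update node status exceeds retry count")
}

func main() {
	if err := tryUpdateNodeStatus(); err != nil {
		fmt.Println(err)
	}
}

Because each retry hits the same expired-certificate webhook, the budget is always exhausted; rotating or regenerating the webhook certificate (or, on CRC, letting the built-in certificate recovery run) is the fix, not a larger retry count.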
Has your network provider started?"} Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.689939 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.689996 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.690012 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.690039 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.690057 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:38Z","lastTransitionTime":"2025-12-05T11:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.794104 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.794158 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.794174 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.794197 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.794221 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:38Z","lastTransitionTime":"2025-12-05T11:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.897387 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.897448 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.897463 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.897485 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:38 crc kubenswrapper[4763]: I1205 11:49:38.897499 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:38Z","lastTransitionTime":"2025-12-05T11:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.000566 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.000633 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.000651 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.000673 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.000693 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:39Z","lastTransitionTime":"2025-12-05T11:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.103653 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.103721 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.103735 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.103778 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.103799 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:39Z","lastTransitionTime":"2025-12-05T11:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.206108 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.206581 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.206663 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.206749 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.206857 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:39Z","lastTransitionTime":"2025-12-05T11:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.309991 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.310032 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.310044 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.310059 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.310070 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:39Z","lastTransitionTime":"2025-12-05T11:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.412709 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.412745 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.412768 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.412783 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.412793 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:39Z","lastTransitionTime":"2025-12-05T11:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.515015 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.515045 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.515055 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.515067 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.515076 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:39Z","lastTransitionTime":"2025-12-05T11:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.617672 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.617731 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.617742 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.617777 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.617794 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:39Z","lastTransitionTime":"2025-12-05T11:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.719542 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.719596 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.719604 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.719619 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.719630 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:39Z","lastTransitionTime":"2025-12-05T11:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.783426 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.783502 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.783559 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.783720 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:49:39 crc kubenswrapper[4763]: E1205 11:49:39.783715 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:49:39 crc kubenswrapper[4763]: E1205 11:49:39.783876 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:49:39 crc kubenswrapper[4763]: E1205 11:49:39.784076 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:49:39 crc kubenswrapper[4763]: E1205 11:49:39.784198 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.822073 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.822111 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.822120 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.822133 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.822144 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:39Z","lastTransitionTime":"2025-12-05T11:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.924598 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.924652 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.924668 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.924693 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:39 crc kubenswrapper[4763]: I1205 11:49:39.924710 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:39Z","lastTransitionTime":"2025-12-05T11:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.028848 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.028878 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.028886 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.028901 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.028910 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:40Z","lastTransitionTime":"2025-12-05T11:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.131469 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.131500 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.131509 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.131524 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.131534 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:40Z","lastTransitionTime":"2025-12-05T11:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.234520 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.234563 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.234580 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.234605 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.234621 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:40Z","lastTransitionTime":"2025-12-05T11:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.337223 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.337551 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.337682 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.337827 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.338067 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:40Z","lastTransitionTime":"2025-12-05T11:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.440362 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.440396 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.440404 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.440418 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.440427 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:40Z","lastTransitionTime":"2025-12-05T11:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.542237 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.542472 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.542580 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.542683 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.542804 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:40Z","lastTransitionTime":"2025-12-05T11:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.645096 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.645132 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.645141 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.645157 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.645167 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:40Z","lastTransitionTime":"2025-12-05T11:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.747466 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.747529 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.747551 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.747578 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.747598 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:40Z","lastTransitionTime":"2025-12-05T11:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.850660 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.850718 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.850734 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.850789 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.850808 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:40Z","lastTransitionTime":"2025-12-05T11:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.953824 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.953880 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.953888 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.953900 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:40 crc kubenswrapper[4763]: I1205 11:49:40.953927 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:40Z","lastTransitionTime":"2025-12-05T11:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.056559 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.056606 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.056614 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.056629 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.056639 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:41Z","lastTransitionTime":"2025-12-05T11:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.160222 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.160272 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.160283 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.160302 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.160315 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:41Z","lastTransitionTime":"2025-12-05T11:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.262976 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.263358 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.263607 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.263838 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.264040 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:41Z","lastTransitionTime":"2025-12-05T11:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.366880 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.366923 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.366934 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.366951 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.366962 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:41Z","lastTransitionTime":"2025-12-05T11:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.469665 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.469905 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.469987 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.470053 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.470107 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:41Z","lastTransitionTime":"2025-12-05T11:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.571989 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.572167 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.572233 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.572296 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.572355 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:41Z","lastTransitionTime":"2025-12-05T11:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.674113 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.674323 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.674409 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.674500 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.674672 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:41Z","lastTransitionTime":"2025-12-05T11:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.777506 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.777628 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.777645 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.777715 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.777728 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:41Z","lastTransitionTime":"2025-12-05T11:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.783270 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:49:41 crc kubenswrapper[4763]: E1205 11:49:41.783600 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.783790 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.783846 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:49:41 crc kubenswrapper[4763]: E1205 11:49:41.783910 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:49:41 crc kubenswrapper[4763]: E1205 11:49:41.783935 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.784239 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:49:41 crc kubenswrapper[4763]: E1205 11:49:41.784598 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.880531 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.880558 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.880567 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.880579 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.880589 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:41Z","lastTransitionTime":"2025-12-05T11:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.982884 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.982920 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.982931 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.982946 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:41 crc kubenswrapper[4763]: I1205 11:49:41.982958 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:41Z","lastTransitionTime":"2025-12-05T11:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.085226 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.085294 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.085306 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.085325 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.085336 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:42Z","lastTransitionTime":"2025-12-05T11:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.130265 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a135c32b-38e4-43f6-bbb1-d1b8e42156ab-metrics-certs\") pod \"network-metrics-daemon-x45qv\" (UID: \"a135c32b-38e4-43f6-bbb1-d1b8e42156ab\") " pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:49:42 crc kubenswrapper[4763]: E1205 11:49:42.130468 4763 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 11:49:42 crc kubenswrapper[4763]: E1205 11:49:42.130733 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a135c32b-38e4-43f6-bbb1-d1b8e42156ab-metrics-certs podName:a135c32b-38e4-43f6-bbb1-d1b8e42156ab nodeName:}" failed. No retries permitted until 2025-12-05 11:50:14.130711771 +0000 UTC m=+98.623426634 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a135c32b-38e4-43f6-bbb1-d1b8e42156ab-metrics-certs") pod "network-metrics-daemon-x45qv" (UID: "a135c32b-38e4-43f6-bbb1-d1b8e42156ab") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.188942 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.188995 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.189010 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.189036 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.189050 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:42Z","lastTransitionTime":"2025-12-05T11:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.192145 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kwkp4_737ae453-c22e-41ea-a10e-7e8f1f165467/kube-multus/0.log" Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.192447 4763 generic.go:334] "Generic (PLEG): container finished" podID="737ae453-c22e-41ea-a10e-7e8f1f165467" containerID="a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe" exitCode=1 Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.192570 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kwkp4" event={"ID":"737ae453-c22e-41ea-a10e-7e8f1f165467","Type":"ContainerDied","Data":"a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe"} Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.193385 4763 scope.go:117] "RemoveContainer" containerID="a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe" Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.209402 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b31fb483-2a86-4e31-a611-ef92a9a4840f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452ef64735a1911b994bd2bfbf1702e3ff68648cd0b834c254dd24f0b17cd9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0370cc7bfd0ce95636924667b4e82dfffd0eee7f07b78d683b386bfee5dc47b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1951ae154cb3c61f4cf3cadec35e19591efe1f768993d1e51713195175547b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a3a0d8531f8696597e2389d8cd7b50130de93ac8be572b329ebe2eff8c6b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:42Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.223403 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kwkp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"737ae453-c22e-41ea-a10e-7e8f1f165467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T11:49:41Z\\\",\\\"message\\\":\\\"2025-12-05T11:48:56+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a7099953-0e08-436a-aafd-3159a86e6eef\\\\n2025-12-05T11:48:56+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a7099953-0e08-436a-aafd-3159a86e6eef to /host/opt/cni/bin/\\\\n2025-12-05T11:48:56Z [verbose] multus-daemon started\\\\n2025-12-05T11:48:56Z [verbose] Readiness Indicator file check\\\\n2025-12-05T11:49:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q79tg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kwkp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:42Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.235201 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptcsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d71e854a-fd6a-4efa-9cf4-a5dc75a1901f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40808a5c1a367c907ecf165483a1cc5fc59027c61342399abb612be6fed84fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g4jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d865b70b752e983282ac08c190f254185cbf7bba8d13c0238261fd7b880ac651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g4jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:49:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ptcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:42Z is after 2025-08-24T17:21:41Z" Dec 05 
11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.252966 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78167d26-8492-4035-99de-987c99ab12e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b874a765fa9a075a2662b615419d246c610bf2eea128d4bf53c16bda49f38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd875ae389a8e48a2f958f87ef711a4053a54da61f415ef9a380f8b3de9d7012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12b24f5f9e73a1f4871d8d666bf6aabef79ec707d3df055e3e33a1a158dce464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25460658bdce8262dc2a0269161e8377dcb37f440c09c8eea9f7b397c9d3a5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8aab7caa6c02035af6faae1817d239b12bc2afcb11c7e14c4aea1df60372dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:38Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:42Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.265408 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd4640d76c7ecaa13523f23448f53bc51afe8279734a6bc254b403cf436fe0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7619d39a9f2a46d92fc3ec5ab36f60ec2af3a789b10decb4cbe1cdaef58d28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:42Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.277085 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96338136-6831-49d0-9eb9-77d1205c6afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b662bafa4fee3b2c7cd96bc49d6054b0fbe349e86ce1d4faf967edbaf8f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb8a3a71da68a51eabe1aaa24fffd1b84c35cb01945642622f4100125ee26223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpgln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:42Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.288543 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-gt7x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2515d84f-5782-48a0-9d7e-9704baebe26c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60a4a422ae6f3c2c05e5dc721674edbb0efda81d198ede925d3dd1cf99e09b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxkb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gt7x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:42Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.291781 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.292119 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.292130 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.292150 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.292162 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:42Z","lastTransitionTime":"2025-12-05T11:49:42Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.310301 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c630804aeb54582259498705c520ab6d2c191844861069ba7090226028a5716f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\
"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-92qpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:42Z is after 
2025-08-24T17:21:41Z" Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.323582 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:42Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.335332 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:42Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.344634 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gxbp8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e177b5c0a61aeedb5c6bf5f6340f212506106592ad10807ed29d83a4a6949ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-9s79r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gxbp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:42Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.353931 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-x45qv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a135c32b-38e4-43f6-bbb1-d1b8e42156ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hszt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hszt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:49:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-x45qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:42Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.368505 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T11:48:54Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 11:48:54.822889 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 11:48:54.823032 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 11:48:54.823055 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 11:48:54.823033 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 11:48:54.823371 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764935318\\\\\\\\\\\\\\\" (2025-12-05 11:48:38 +0000 UTC to 2026-01-04 11:48:39 +0000 UTC (now=2025-12-05 11:48:54.823293564 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823498 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\"\\\\nI1205 11:48:54.823746 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764935329\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764935329\\\\\\\\\\\\\\\" (2025-12-05 10:48:49 +0000 UTC to 2026-12-05 10:48:49 +0000 UTC (now=2025-12-05 11:48:54.823700006 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823836 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 11:48:54.823876 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 11:48:54.823905 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1205 11:48:54.826863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:42Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.379722 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04d4b8c3-b148-45da-b0d1-412870026308\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://745228c1f753278be19f5bb7197fd2dd01178d667213f59c8933c18a5d666c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ff65480597f13016cbd96d14661a9cd428eff4ca644c88451ef9c28b1e015f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9316ee7cdcfff2c1c600186568802e8c4d39ddcb029e17d8f109bb9f813e8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54939667d3da9326369dc17207cec7eb95f69fe46040d3dd06d10dd3df8cce0\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54939667d3da9326369dc17207cec7eb95f69fe46040d3dd06d10dd3df8cce0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:42Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.394371 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafa237bfeb83d5895f8ad73cf881e1fd946ec837402ea271d4aa1b75469da0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:42Z is after 
2025-08-24T17:21:41Z" Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.394624 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.394641 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.394650 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.394663 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.394672 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:42Z","lastTransitionTime":"2025-12-05T11:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.407350 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:42Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.422208 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38a2b47fe97303fc60f6ed0de5927b73993234170c4fdb02b45f0762978e292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:42Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.441815 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b42a5472-7487-4146-87a1-b83999821399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://455df6b1dbc6c362e75d34ddc09347b0c7b31571afb56538b9b7768a5abb50d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455df6b1dbc6c362e75d34ddc09347b0c7b31571afb56538b9b7768a5abb50d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T11:49:29Z\\\",\\\"message\\\":\\\"ring]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/downloads\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.213\\\\\\\", Port:80, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 11:49:29.602160 6459 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI1205 11:49:29.602174 6459 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nF1205 11:49:29.602174 6459 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xbr2p_openshift-ovn-kubernetes(b42a5472-7487-4146-87a1-b83999821399)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbr2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:42Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.497203 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.497257 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.497268 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.497318 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.497336 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:42Z","lastTransitionTime":"2025-12-05T11:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.600265 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.600311 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.600321 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.600340 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.600351 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:42Z","lastTransitionTime":"2025-12-05T11:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.702990 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.703052 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.703073 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.703101 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.703121 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:42Z","lastTransitionTime":"2025-12-05T11:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.806100 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.806154 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.806171 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.806194 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.806211 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:42Z","lastTransitionTime":"2025-12-05T11:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.910225 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.910264 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.910274 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.910287 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:42 crc kubenswrapper[4763]: I1205 11:49:42.910296 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:42Z","lastTransitionTime":"2025-12-05T11:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.012899 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.012941 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.012953 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.012969 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.012980 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:43Z","lastTransitionTime":"2025-12-05T11:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.117371 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.117404 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.117414 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.117429 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.117438 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:43Z","lastTransitionTime":"2025-12-05T11:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.198844 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kwkp4_737ae453-c22e-41ea-a10e-7e8f1f165467/kube-multus/0.log"
Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.198905 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kwkp4" event={"ID":"737ae453-c22e-41ea-a10e-7e8f1f165467","Type":"ContainerStarted","Data":"a6e4880a3acae535beea88e11057496d98b2ddd8d2161f75c52dd4527425a83d"}
Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.220308 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.220353 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.220365 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.220386 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.220400 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:43Z","lastTransitionTime":"2025-12-05T11:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.221587 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78167d26-8492-4035-99de-987c99ab12e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b874a765fa9a075a2662b615419d246c610bf2eea128d4bf53c16bda49f38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd875ae389a8e48a2f958f87ef711a4053a54da61f415ef9a380f8b3de9d7012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12b24f5f9e73a1f4871d8d666bf6aabef79ec707d3df055e3e33a1a158dce464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25460658bdce8262dc2a0269161e8377dcb37f4
40c09c8eea9f7b397c9d3a5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8aab7caa6c02035af6faae1817d239b12bc2afcb11c7e14c4aea1df60372dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:43Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.237592 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd4640d76c7ecaa13523f23448f53bc51afe8279734a6bc254b403cf436fe0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7619d39a9f2a46d92fc3ec5ab36f60ec2af3a789b10decb4cbe1cdaef58d28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:43Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.249852 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96338136-6831-49d0-9eb9-77d1205c6afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b662bafa4fee3b2c7cd96bc49d6054b0fbe349e86ce1d4faf967edbaf8f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb8a3a71da68a51eabe1aaa24fffd1b84c35cb01945642622f4100125ee26223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpgln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:43Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.261978 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gt7x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2515d84f-5782-48a0-9d7e-9704baebe26c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60a4a422ae6f3c2c05e5dc721674edbb0efda81d198ede925d3dd1cf99e09b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxkb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gt7x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:43Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.278414 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c630804aeb54582259498705c520ab6d2c191844861069ba7090226028a5716f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibi
n\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-92qpt\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:43Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.290247 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:43Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.302324 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:43Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.311836 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gxbp8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e177b5c0a61aeedb5c6bf5f6340f212506106592ad10807ed29d83a4a6949ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s79r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gxbp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:43Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.321809 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-x45qv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a135c32b-38e4-43f6-bbb1-d1b8e42156ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hszt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hszt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:49:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-x45qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:43Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.322951 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.323112 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.323258 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.323393 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.323508 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:43Z","lastTransitionTime":"2025-12-05T11:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.334286 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T11:48:54Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 11:48:54.822889 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 11:48:54.823032 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 11:48:54.823055 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 11:48:54.823033 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 11:48:54.823371 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764935318\\\\\\\\\\\\\\\" (2025-12-05 11:48:38 +0000 UTC to 2026-01-04 11:48:39 +0000 UTC (now=2025-12-05 11:48:54.823293564 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823498 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\"\\\\nI1205 11:48:54.823746 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764935329\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764935329\\\\\\\\\\\\\\\" (2025-12-05 10:48:49 +0000 UTC to 2026-12-05 10:48:49 +0000 UTC (now=2025-12-05 11:48:54.823700006 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823836 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 11:48:54.823876 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 11:48:54.823905 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1205 11:48:54.826863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:43Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.344847 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04d4b8c3-b148-45da-b0d1-412870026308\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://745228c1f753278be19f5bb7197fd2dd01178d667213f59c8933c18a5d666c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ff65480597f13016cbd96d14661a9cd428eff4ca644c88451ef9c28b1e015f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9316ee7cdcfff2c1c600186568802e8c4d39ddcb029e17d8f109bb9f813e8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54939667d3da9326369dc17207cec7eb95f69fe46040d3dd06d10dd3df8cce0\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54939667d3da9326369dc17207cec7eb95f69fe46040d3dd06d10dd3df8cce0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:43Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.359814 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafa237bfeb83d5895f8ad73cf881e1fd946ec837402ea271d4aa1b75469da0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:43Z is after 
2025-08-24T17:21:41Z" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.376705 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:43Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.390794 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38a2b47fe97303fc60f6ed0de5927b73993234170c4fdb02b45f0762978e292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:43Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.409238 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b42a5472-7487-4146-87a1-b83999821399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://455df6b1dbc6c362e75d34ddc09347b0c7b31571
afb56538b9b7768a5abb50d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455df6b1dbc6c362e75d34ddc09347b0c7b31571afb56538b9b7768a5abb50d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T11:49:29Z\\\",\\\"message\\\":\\\"ring]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/downloads\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.213\\\\\\\", Port:80, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 11:49:29.602160 6459 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI1205 11:49:29.602174 6459 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nF1205 11:49:29.602174 6459 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xbr2p_openshift-ovn-kubernetes(b42a5472-7487-4146-87a1-b83999821399)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbr2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:43Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.422377 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b31fb483-2a86-4e31-a611-ef92a9a4840f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452ef64735a1911b994bd2bfbf1702e3ff68648cd0b834c254dd24f0b17cd9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0370cc7bfd0ce95636924667b4e82dfffd0eee7f07b78d683b386bfee5dc47b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1951ae154cb3c61f4cf3cadec35e19591efe1f768993d1e51713195175547b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a3a0d8531f8696597e2389d8cd7b50130de93ac8be572b329ebe2eff8c6b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:43Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.426261 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.426308 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.426324 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.426344 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.426358 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:43Z","lastTransitionTime":"2025-12-05T11:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.436728 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kwkp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"737ae453-c22e-41ea-a10e-7e8f1f165467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e4880a3acae535beea88e11057496d98b2ddd8d2161f75c52dd4527425a83d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T11:49:41Z\\\",\\\"message\\\":\\\"2025-12-05T11:48:56+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a7099953-0e08-436a-aafd-3159a86e6eef\\\\n2025-12-05T11:48:56+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a7099953-0e08-436a-aafd-3159a86e6eef to /host/opt/cni/bin/\\\\n2025-12-05T11:48:56Z [verbose] multus-daemon started\\\\n2025-12-05T11:48:56Z [verbose] Readiness Indicator file check\\\\n2025-12-05T11:49:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q79tg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kwkp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:43Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.450826 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptcsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d71e854a-fd6a-4efa-9cf4-a5dc75a1901f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40808a5c1a367c907ecf165483a1cc5fc59027c61342399abb612be6fed84fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g4jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d865b70b752e983282ac08c190f254185cbf7bba8d13c0238261fd7b880ac651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g4jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:49:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ptcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:43Z is after 2025-08-24T17:21:41Z" Dec 05 
11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.528743 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.528800 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.528809 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.528822 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.528830 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:43Z","lastTransitionTime":"2025-12-05T11:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.631018 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.631069 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.631081 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.631098 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.631109 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:43Z","lastTransitionTime":"2025-12-05T11:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.734015 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.734084 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.734104 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.734164 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.734195 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:43Z","lastTransitionTime":"2025-12-05T11:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.783323 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.783378 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.783391 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.783430 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:49:43 crc kubenswrapper[4763]: E1205 11:49:43.783798 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:49:43 crc kubenswrapper[4763]: E1205 11:49:43.783872 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:49:43 crc kubenswrapper[4763]: E1205 11:49:43.783930 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:49:43 crc kubenswrapper[4763]: E1205 11:49:43.783681 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
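
[Editor's sketch] The "Failed to update status for pod" records above all fail for one reason: each status patch must pass the pod.network-node-identity.openshift.io admission webhook at https://127.0.0.1:9743, and that webhook's serving certificate expired on 2025-08-24T17:21:41Z, long before the node's clock of 2025-12-05. Below is a minimal standalone Go sketch of the same x509 validity check the TLS handshake performs; this is a diagnostic aid, not kubelet code, and the certificate path is a hypothetical placeholder.

// certcheck.go: reproduce the validity test behind
// "x509: certificate has expired or is not yet valid".
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"log"
	"os"
	"time"
)

func main() {
	// Hypothetical path; point this at the webhook's actual serving certificate.
	pemBytes, err := os.ReadFile("/tmp/webhook-serving.crt")
	if err != nil {
		log.Fatal(err)
	}
	block, _ := pem.Decode(pemBytes)
	if block == nil {
		log.Fatal("no PEM block found")
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		log.Fatal(err)
	}
	now := time.Now().UTC()
	fmt.Printf("NotBefore=%s NotAfter=%s now=%s\n", cert.NotBefore, cert.NotAfter, now)
	switch {
	case now.Before(cert.NotBefore):
		fmt.Println("certificate is not yet valid")
	case now.After(cert.NotAfter):
		// This is the branch the handshake in the log above keeps hitting.
		fmt.Println("certificate has expired")
	default:
		fmt.Println("certificate is within its validity window")
	}
}

Run against the webhook's serving certificate, this prints the same "expired" verdict as the handshake error repeated throughout the log.
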
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.836987 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.837028 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.837038 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.837056 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.837069 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:43Z","lastTransitionTime":"2025-12-05T11:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.939968 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.940044 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.940069 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.940101 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:43 crc kubenswrapper[4763]: I1205 11:49:43.940124 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:43Z","lastTransitionTime":"2025-12-05T11:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.042883 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.042922 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.042932 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.042950 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.042961 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:44Z","lastTransitionTime":"2025-12-05T11:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.145217 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.145257 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.145265 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.145280 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.145290 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:44Z","lastTransitionTime":"2025-12-05T11:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.248694 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.248737 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.248749 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.248784 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.248797 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:44Z","lastTransitionTime":"2025-12-05T11:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.350892 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.350960 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.350979 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.351006 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.351024 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:44Z","lastTransitionTime":"2025-12-05T11:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.454161 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.454222 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.454238 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.454267 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.454285 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:44Z","lastTransitionTime":"2025-12-05T11:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.556279 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.556308 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.556317 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.556330 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.556339 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:44Z","lastTransitionTime":"2025-12-05T11:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.658956 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.659008 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.659038 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.659055 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.659064 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:44Z","lastTransitionTime":"2025-12-05T11:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.761469 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.761521 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.761537 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.761559 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.761574 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:44Z","lastTransitionTime":"2025-12-05T11:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.784628 4763 scope.go:117] "RemoveContainer" containerID="455df6b1dbc6c362e75d34ddc09347b0c7b31571afb56538b9b7768a5abb50d9" Dec 05 11:49:44 crc kubenswrapper[4763]: E1205 11:49:44.784832 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-xbr2p_openshift-ovn-kubernetes(b42a5472-7487-4146-87a1-b83999821399)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" podUID="b42a5472-7487-4146-87a1-b83999821399" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.864174 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.864224 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.864240 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.864260 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.864277 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:44Z","lastTransitionTime":"2025-12-05T11:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.967237 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.967293 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.967311 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.967337 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:44 crc kubenswrapper[4763]: I1205 11:49:44.967353 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:44Z","lastTransitionTime":"2025-12-05T11:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.070055 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.070105 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.070118 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.070138 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.070150 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:45Z","lastTransitionTime":"2025-12-05T11:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.173250 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.173292 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.173305 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.173320 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.173330 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:45Z","lastTransitionTime":"2025-12-05T11:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.275495 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.275535 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.275544 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.275557 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.275566 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:45Z","lastTransitionTime":"2025-12-05T11:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.378049 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.378097 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.378107 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.378128 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.378142 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:45Z","lastTransitionTime":"2025-12-05T11:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.480470 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.480537 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.480561 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.480590 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.480611 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:45Z","lastTransitionTime":"2025-12-05T11:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.582741 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.582821 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.582834 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.582852 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.582863 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:45Z","lastTransitionTime":"2025-12-05T11:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.685655 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.685706 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.685720 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.685792 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.685806 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:45Z","lastTransitionTime":"2025-12-05T11:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.783505 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:49:45 crc kubenswrapper[4763]: E1205 11:49:45.783648 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.783678 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:49:45 crc kubenswrapper[4763]: E1205 11:49:45.783868 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.783899 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.783947 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:49:45 crc kubenswrapper[4763]: E1205 11:49:45.784024 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
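
[Editor's sketch] Alongside the certificate failures, the node stays NotReady because no CNI configuration exists in /etc/kubernetes/cni/net.d/, and the kube-multus container shown earlier exits after waiting for its readiness indicator file, /host/run/multus/cni/net.d/10-ovn-kubernetes.conf, which ovn-kubernetes never writes while its controller crash-loops. A simplified Go stand-in for that PollImmediate-style wait follows; the interval and timeout are illustrative assumptions, not multus's actual settings.

// cniwait.go: simplified readiness-indicator wait, modeled on the
// kube-multus timeout seen in the log above.
package main

import (
	"fmt"
	"os"
	"path/filepath"
	"time"
)

// waitForFile polls until path exists or the timeout elapses,
// like a PollImmediate loop.
func waitForFile(path string, interval, timeout time.Duration) error {
	deadline := time.Now().Add(timeout)
	for {
		if _, err := os.Stat(path); err == nil {
			return nil
		}
		if time.Now().After(deadline) {
			return fmt.Errorf("timed out waiting for %s", path)
		}
		time.Sleep(interval)
	}
}

func main() {
	conf := filepath.Join("/host/run/multus/cni/net.d", "10-ovn-kubernetes.conf")
	if err := waitForFile(conf, time.Second, 45*time.Second); err != nil {
		// Corresponds to "pollimmediate error: timed out waiting for the condition".
		fmt.Println(err)
		os.Exit(1)
	}
	fmt.Println("readiness indicator present; default network is up")
}
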
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:49:45 crc kubenswrapper[4763]: E1205 11:49:45.784211 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.789419 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.789453 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.789462 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.789479 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.789490 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:45Z","lastTransitionTime":"2025-12-05T11:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.802932 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptcsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d71e854a-fd6a-4efa-9cf4-a5dc75a1901f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40808a5c1a367c907ecf165483a1cc5fc59027c61342399abb612be6fed84fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g4jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d865b70b752e983282ac08c190f254185cbf7bba8d13c0238261fd7b880ac651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g4jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:49:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ptcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:45Z is after 2025-08-24T17:21:41Z" Dec 05 
11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.820906 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b31fb483-2a86-4e31-a611-ef92a9a4840f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452ef64735a1911b994bd2bfbf1702e3ff68648cd0b834c254dd24f0b17cd9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0370cc7bfd0ce95636924667b4e82dfffd0eee7f07b78d683b386bfee5dc47b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1951ae154cb3c61f4cf3cadec35e19591efe1f768993d1e51713195175547b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a3a0d8531f8696597e2389d8cd7b50130de93ac8be572b329ebe2eff8c6b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:45Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.836253 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kwkp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"737ae453-c22e-41ea-a10e-7e8f1f165467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e4880a3acae535beea88e11057496d98b2ddd8d2161f75c52dd4527425a83d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T11:49:41Z\\\",\\\"message\\\":\\\"2025-12-05T11:48:56+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a7099953-0e08-436a-aafd-3159a86e6eef\\\\n2025-12-05T11:48:56+00:00 [cnibincopy] 
Successfully moved files in /host/opt/cni/bin/upgrade_a7099953-0e08-436a-aafd-3159a86e6eef to /host/opt/cni/bin/\\\\n2025-12-05T11:48:56Z [verbose] multus-daemon started\\\\n2025-12-05T11:48:56Z [verbose] Readiness Indicator file check\\\\n2025-12-05T11:49:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q79tg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kwkp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:45Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.850045 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gt7x4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2515d84f-5782-48a0-9d7e-9704baebe26c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60a4a422ae6f3c2c05e5dc721674edbb0efda81d198ede925d3dd1cf99e09b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxkb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gt7x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:45Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.871133 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c630804aeb54582259498705c520ab6d2c191844861069ba7090226028a5716f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-92qpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:45Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.893726 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.893780 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:45 crc 
kubenswrapper[4763]: I1205 11:49:45.893790 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.893809 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.893822 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:45Z","lastTransitionTime":"2025-12-05T11:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.894516 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78167d26-8492-4035-99de-987c99ab12e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b874a765fa9a075a2662b615419d246c610bf2eea128d4bf53c16bda49f38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd875ae389a8e48a2f958f87ef711a4053a54da61f415ef9a380f8b3de9d7012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12b24f5f9e73a1f4871d8d666bf6aabef79ec707d3df055e3e33a1a158dce464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25460658bdce8262dc2a0269161e8377dcb37f440c09c8eea9f7b397c9d3a5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8aab7caa6c02035af6faae1817d239b12bc2afcb11c7e14c4aea1df60372dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:45Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.910117 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd4640d76c7ecaa13523f23448f53bc51afe8279734a6bc254b403cf436fe0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7619d39a9f2a46d92fc3ec5ab36f60ec2af3a789b10decb4cbe1cdaef58d28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:45Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.926396 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96338136-6831-49d0-9eb9-77d1205c6afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b662bafa4fee3b2c7cd96bc49d6054b0fbe349e86ce1d4faf967edbaf8f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb8a3a71da68a51eabe1aaa24fffd1b84c35cb01945642622f4100125ee26223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpgln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:45Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.942519 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-x45qv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a135c32b-38e4-43f6-bbb1-d1b8e42156ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hszt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hszt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:49:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-x45qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:45Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.959302 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:45Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.974274 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:45Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.987379 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gxbp8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e177b5c0a61aeedb5c6bf5f6340f212506106592ad10807ed29d83a4a6949ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-9s79r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gxbp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:45Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.996925 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.997000 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.997017 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.997035 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:45 crc kubenswrapper[4763]: I1205 11:49:45.997048 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:45Z","lastTransitionTime":"2025-12-05T11:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.003427 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:46Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.017109 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38a2b47fe97303fc60f6ed0de5927b73993234170c4fdb02b45f0762978e292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:46Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.037155 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b42a5472-7487-4146-87a1-b83999821399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://455df6b1dbc6c362e75d34ddc09347b0c7b31571afb56538b9b7768a5abb50d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455df6b1dbc6c362e75d34ddc09347b0c7b31571afb56538b9b7768a5abb50d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T11:49:29Z\\\",\\\"message\\\":\\\"ring]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/downloads\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.213\\\\\\\", Port:80, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 11:49:29.602160 6459 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI1205 11:49:29.602174 6459 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nF1205 11:49:29.602174 6459 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xbr2p_openshift-ovn-kubernetes(b42a5472-7487-4146-87a1-b83999821399)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbr2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:46Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.051107 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T11:48:54Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 11:48:54.822889 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 11:48:54.823032 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 11:48:54.823055 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 11:48:54.823033 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 11:48:54.823371 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764935318\\\\\\\\\\\\\\\" (2025-12-05 11:48:38 +0000 UTC to 2026-01-04 11:48:39 +0000 UTC (now=2025-12-05 11:48:54.823293564 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823498 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\"\\\\nI1205 11:48:54.823746 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764935329\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764935329\\\\\\\\\\\\\\\" (2025-12-05 10:48:49 +0000 UTC to 2026-12-05 10:48:49 +0000 UTC (now=2025-12-05 11:48:54.823700006 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823836 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 11:48:54.823876 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 11:48:54.823905 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1205 11:48:54.826863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:46Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.062597 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04d4b8c3-b148-45da-b0d1-412870026308\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://745228c1f753278be19f5bb7197fd2dd01178d667213f59c8933c18a5d666c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ff65480597f13016cbd96d14661a9cd428eff4ca644c88451ef9c28b1e015f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9316ee7cdcfff2c1c600186568802e8c4d39ddcb029e17d8f109bb9f813e8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"res
ource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54939667d3da9326369dc17207cec7eb95f69fe46040d3dd06d10dd3df8cce0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54939667d3da9326369dc17207cec7eb95f69fe46040d3dd06d10dd3df8cce0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:46Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.076221 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafa237bfeb83d5895f8ad73cf881e1fd946ec837402ea271d4aa1b75469da0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:46Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.099674 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.099910 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.100018 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.100105 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.100196 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:46Z","lastTransitionTime":"2025-12-05T11:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.202561 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.202617 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.202631 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.202652 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
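
The four "Failed to update status for pod" entries above (ovnkube-node-xbr2p, kube-apiserver-crc, openshift-kube-scheduler-crc, network-operator-58b4c7f79c-55gtf) all fail on the same hop: the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743/pod presents a serving certificate that expired at 2025-08-24T17:21:41Z, while the node clock reads 2025-12-05. A minimal Go sketch of the same comparison, assuming it runs on the node itself; the address comes from the Post URL in the log, and InsecureSkipVerify is set only so the handshake survives long enough to read the expired certificate:

```go
// cert_check.go -- inspect the webhook's serving certificate and compare
// its validity window against the local clock, mirroring the x509 error
// the kubelet logs above.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	// Endpoint taken from the failing Post URL in the kubelet log.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
		InsecureSkipVerify: true, // inspection only: verification would fail on the expired cert
	})
	if err != nil {
		log.Fatalf("dial webhook endpoint: %v", err)
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0] // leaf certificate
	now := time.Now().UTC()
	fmt.Printf("subject:   %s\n", cert.Subject)
	fmt.Printf("notBefore: %s\n", cert.NotBefore.Format(time.RFC3339))
	fmt.Printf("notAfter:  %s\n", cert.NotAfter.Format(time.RFC3339))
	if now.After(cert.NotAfter) {
		// Same shape as the kubelet error: "current time ... is after ..."
		fmt.Printf("expired: current time %s is after %s\n",
			now.Format(time.RFC3339), cert.NotAfter.Format(time.RFC3339))
	}
}
```

If the sketch holds, notAfter should print as the 2025-08-24T17:21:41Z timestamp that recurs in every webhook failure in this log.
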
Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.202662 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:46Z","lastTransitionTime":"2025-12-05T11:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.305908 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.305961 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.305972 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.305987 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.305998 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:46Z","lastTransitionTime":"2025-12-05T11:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.409912 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.410143 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.410235 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.410361 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.410435 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:46Z","lastTransitionTime":"2025-12-05T11:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.512798 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.512860 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.512874 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.512899 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.512913 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:46Z","lastTransitionTime":"2025-12-05T11:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.616064 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.616127 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.616141 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.616160 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.616174 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:46Z","lastTransitionTime":"2025-12-05T11:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.725738 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.725893 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.725911 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.725939 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.725955 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:46Z","lastTransitionTime":"2025-12-05T11:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.828669 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.828719 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.828729 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.828746 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.828774 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:46Z","lastTransitionTime":"2025-12-05T11:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.931615 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.931681 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.931693 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.931715 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:46 crc kubenswrapper[4763]: I1205 11:49:46.931730 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:46Z","lastTransitionTime":"2025-12-05T11:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.035329 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.035375 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.035387 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.035406 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.035418 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:47Z","lastTransitionTime":"2025-12-05T11:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.137991 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.138045 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.138057 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.138073 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.138086 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:47Z","lastTransitionTime":"2025-12-05T11:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.240496 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.240535 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.240546 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.240563 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.240575 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:47Z","lastTransitionTime":"2025-12-05T11:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.344081 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.344131 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.344142 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.344160 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.344172 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:47Z","lastTransitionTime":"2025-12-05T11:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.447717 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.447953 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.447976 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.448003 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.448021 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:47Z","lastTransitionTime":"2025-12-05T11:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.551269 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.551335 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.551345 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.551359 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.551368 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:47Z","lastTransitionTime":"2025-12-05T11:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.654119 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.654468 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.654597 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.654687 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.654810 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:47Z","lastTransitionTime":"2025-12-05T11:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.757938 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.757984 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.757996 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.758020 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.758036 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:47Z","lastTransitionTime":"2025-12-05T11:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.783810 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.783861 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.783886 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.783852 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:49:47 crc kubenswrapper[4763]: E1205 11:49:47.784048 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:49:47 crc kubenswrapper[4763]: E1205 11:49:47.784127 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:49:47 crc kubenswrapper[4763]: E1205 11:49:47.784266 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:49:47 crc kubenswrapper[4763]: E1205 11:49:47.784435 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.861303 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.861389 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.861403 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.861430 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.861447 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:47Z","lastTransitionTime":"2025-12-05T11:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.964633 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.964690 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.964706 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.964729 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
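
The NotReady heartbeats and the four "Error syncing pod, skipping" entries above all reduce to one condition: no CNI configuration file in /etc/kubernetes/cni/net.d/, consistent with the ovnkube-node pod shown in CrashLoopBackOff earlier in the log. A small Go sketch of the equivalent check; treating .conf, .conflist, and .json as the loadable extensions is an assumption about libcni's conventions, not something this log states:

```go
// cni_check.go -- report whether any CNI network config exists in the
// directory the kubelet error names.
package main

import (
	"fmt"
	"log"
	"os"
	"path/filepath"
)

func main() {
	const confDir = "/etc/kubernetes/cni/net.d/" // path from the kubelet message
	entries, err := os.ReadDir(confDir)
	if err != nil {
		log.Fatalf("read %s: %v", confDir, err)
	}
	var found []string
	for _, e := range entries {
		// Assumed loadable extensions (libcni convention).
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			found = append(found, e.Name())
		}
	}
	if len(found) == 0 {
		fmt.Printf("no CNI configuration file in %s -- matches the NetworkPluginNotReady message\n", confDir)
		return
	}
	fmt.Printf("CNI config present: %v\n", found)
}
```

Once the network provider writes a config into that directory, the runtime should report NetworkReady=true and the Ready condition above should clear.
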
Dec 05 11:49:47 crc kubenswrapper[4763]: I1205 11:49:47.964745 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:47Z","lastTransitionTime":"2025-12-05T11:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.068338 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.068380 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.068394 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.068412 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.068423 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:48Z","lastTransitionTime":"2025-12-05T11:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.171626 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.171678 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.171690 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.171709 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.171722 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:48Z","lastTransitionTime":"2025-12-05T11:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.273968 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.274297 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.274426 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.274508 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.274570 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:48Z","lastTransitionTime":"2025-12-05T11:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.378445 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.378486 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.378497 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.378561 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.378579 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:48Z","lastTransitionTime":"2025-12-05T11:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.482742 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.482790 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.482799 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.482813 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.482822 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:48Z","lastTransitionTime":"2025-12-05T11:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.586336 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.586415 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.586433 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.586461 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.586480 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:48Z","lastTransitionTime":"2025-12-05T11:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.689851 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.690170 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.690368 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.690530 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.690652 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:48Z","lastTransitionTime":"2025-12-05T11:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.793564 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.793958 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.794109 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.794406 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.794537 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:48Z","lastTransitionTime":"2025-12-05T11:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.874666 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.874746 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.874809 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.874840 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.874877 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:48Z","lastTransitionTime":"2025-12-05T11:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:48 crc kubenswrapper[4763]: E1205 11:49:48.893700 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"424a6449-0727-4c68-964f-19010b8ff35b\\\",\\\"systemUUID\\\":\\\"0f54ec5d-9350-4cf9-9a1e-213de5460351\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:48Z is after 
2025-08-24T17:21:41Z" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.899431 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.899510 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.899524 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.899551 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.899565 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:48Z","lastTransitionTime":"2025-12-05T11:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:48 crc kubenswrapper[4763]: E1205 11:49:48.918137 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
2025-08-24T17:21:41Z"
Dec 05 11:49:48 crc kubenswrapper[4763]: E1205 11:49:48.993943 4763 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.998230 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.998294 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.998308 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.998336 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:48 crc kubenswrapper[4763]: I1205 11:49:48.998351 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:48Z","lastTransitionTime":"2025-12-05T11:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:49:49 crc kubenswrapper[4763]: I1205 11:49:49.100644 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:49:49 crc kubenswrapper[4763]: I1205 11:49:49.100681 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:49:49 crc kubenswrapper[4763]: I1205 11:49:49.100708 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:49:49 crc kubenswrapper[4763]: I1205 11:49:49.100733 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:49:49 crc kubenswrapper[4763]: I1205 11:49:49.100750 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:49Z","lastTransitionTime":"2025-12-05T11:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
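The webhook failure above is an ordinary x509 validity check: the serving certificate behind https://127.0.0.1:9743 expired on 2025-08-24T17:21:41Z, so every node-status patch is rejected until the certificate is rotated. As a minimal sketch (not the kubelet's own code), the same NotBefore/NotAfter window check can be reproduced in Go against a PEM file; the file path below is a placeholder for illustration, not one taken from this log:

package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"log"
	"os"
	"time"
)

func main() {
	// Placeholder path; substitute the webhook's actual serving certificate.
	data, err := os.ReadFile("/tmp/webhook-serving.crt")
	if err != nil {
		log.Fatal(err)
	}
	block, _ := pem.Decode(data)
	if block == nil || block.Type != "CERTIFICATE" {
		log.Fatal("no PEM certificate found")
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		log.Fatal(err)
	}
	now := time.Now().UTC()
	// The same window check that yields
	// "x509: certificate has expired or is not yet valid".
	switch {
	case now.Before(cert.NotBefore):
		fmt.Printf("not yet valid: current time %s is before %s\n",
			now.Format(time.RFC3339), cert.NotBefore.Format(time.RFC3339))
	case now.After(cert.NotAfter):
		fmt.Printf("expired: current time %s is after %s\n",
			now.Format(time.RFC3339), cert.NotAfter.Format(time.RFC3339))
	default:
		fmt.Printf("valid until %s\n", cert.NotAfter.Format(time.RFC3339))
	}
}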
[... the identical "Recording event message for node" (NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, NodeNotReady) and "Node became not ready" records repeat at roughly 100 ms intervals from 11:49:49.203752 through 11:49:49.717465; the condition payload is unchanged apart from its timestamps ...]
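Every "Node became not ready" record carries the same Ready condition object that the kubelet is trying to patch onto the Node; only the two timestamps advance from line to line. A small self-contained Go sketch (a local struct and the standard library only; the canonical type is NodeCondition in k8s.io/api/core/v1) decodes the payload exactly as it appears in these records:

package main

import (
	"encoding/json"
	"fmt"
	"log"
)

// Condition mirrors only the fields visible in the log payload.
type Condition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	// Payload copied verbatim from the 11:49:48 record above.
	raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:48Z","lastTransitionTime":"2025-12-05T11:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`
	var c Condition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		log.Fatal(err)
	}
	fmt.Printf("%s=%s (%s): %s\n", c.Type, c.Status, c.Reason, c.Message)
}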
Dec 05 11:49:49 crc kubenswrapper[4763]: I1205 11:49:49.783247 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 11:49:49 crc kubenswrapper[4763]: I1205 11:49:49.783356 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 11:49:49 crc kubenswrapper[4763]: I1205 11:49:49.783375 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv"
Dec 05 11:49:49 crc kubenswrapper[4763]: E1205 11:49:49.783457 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 11:49:49 crc kubenswrapper[4763]: I1205 11:49:49.783502 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 11:49:49 crc kubenswrapper[4763]: E1205 11:49:49.783567 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 11:49:49 crc kubenswrapper[4763]: E1205 11:49:49.783709 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab"
Dec 05 11:49:49 crc kubenswrapper[4763]: E1205 11:49:49.783846 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
[... the NodeNotReady event/condition cycle resumes at 11:49:49.819874 and repeats at roughly 100 ms intervals through 11:49:51.777124 ...]
Dec 05 11:49:51 crc kubenswrapper[4763]: I1205 11:49:51.783293 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 11:49:51 crc kubenswrapper[4763]: I1205 11:49:51.783340 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv"
Dec 05 11:49:51 crc kubenswrapper[4763]: I1205 11:49:51.783348 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 11:49:51 crc kubenswrapper[4763]: E1205 11:49:51.783399 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 11:49:51 crc kubenswrapper[4763]: I1205 11:49:51.783416 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 11:49:51 crc kubenswrapper[4763]: E1205 11:49:51.783539 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab"
Dec 05 11:49:51 crc kubenswrapper[4763]: E1205 11:49:51.783662 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 11:49:51 crc kubenswrapper[4763]: E1205 11:49:51.783772 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
[... the cycle resumes at 11:49:51.879396 and repeats at roughly 100 ms intervals through 11:49:53.741336 ...]
Has your network provider started?"} Dec 05 11:49:53 crc kubenswrapper[4763]: I1205 11:49:53.637923 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:53 crc kubenswrapper[4763]: I1205 11:49:53.638018 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:53 crc kubenswrapper[4763]: I1205 11:49:53.638042 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:53 crc kubenswrapper[4763]: I1205 11:49:53.638073 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:53 crc kubenswrapper[4763]: I1205 11:49:53.638093 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:53Z","lastTransitionTime":"2025-12-05T11:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:53 crc kubenswrapper[4763]: I1205 11:49:53.741232 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:53 crc kubenswrapper[4763]: I1205 11:49:53.741289 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:53 crc kubenswrapper[4763]: I1205 11:49:53.741301 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:53 crc kubenswrapper[4763]: I1205 11:49:53.741323 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:53 crc kubenswrapper[4763]: I1205 11:49:53.741336 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:53Z","lastTransitionTime":"2025-12-05T11:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:53 crc kubenswrapper[4763]: I1205 11:49:53.783951 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:49:53 crc kubenswrapper[4763]: I1205 11:49:53.784015 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:49:53 crc kubenswrapper[4763]: I1205 11:49:53.783960 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:49:53 crc kubenswrapper[4763]: I1205 11:49:53.783946 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:49:53 crc kubenswrapper[4763]: E1205 11:49:53.784151 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:49:53 crc kubenswrapper[4763]: E1205 11:49:53.784304 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:49:53 crc kubenswrapper[4763]: E1205 11:49:53.784460 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:49:53 crc kubenswrapper[4763]: E1205 11:49:53.784791 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:49:53 crc kubenswrapper[4763]: I1205 11:49:53.844955 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:53 crc kubenswrapper[4763]: I1205 11:49:53.845111 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:53 crc kubenswrapper[4763]: I1205 11:49:53.845135 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:53 crc kubenswrapper[4763]: I1205 11:49:53.845166 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:53 crc kubenswrapper[4763]: I1205 11:49:53.845190 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:53Z","lastTransitionTime":"2025-12-05T11:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:53 crc kubenswrapper[4763]: I1205 11:49:53.948257 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:53 crc kubenswrapper[4763]: I1205 11:49:53.948365 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:53 crc kubenswrapper[4763]: I1205 11:49:53.948384 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:53 crc kubenswrapper[4763]: I1205 11:49:53.948452 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:53 crc kubenswrapper[4763]: I1205 11:49:53.948472 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:53Z","lastTransitionTime":"2025-12-05T11:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.052189 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.052237 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.052250 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.052269 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.052283 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:54Z","lastTransitionTime":"2025-12-05T11:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.155512 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.155582 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.155613 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.155645 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.155671 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:54Z","lastTransitionTime":"2025-12-05T11:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.258330 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.258446 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.258470 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.258500 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.258521 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:54Z","lastTransitionTime":"2025-12-05T11:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.362110 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.362213 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.362255 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.362290 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.362314 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:54Z","lastTransitionTime":"2025-12-05T11:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.465158 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.465229 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.465246 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.465270 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.465289 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:54Z","lastTransitionTime":"2025-12-05T11:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.568913 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.568983 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.569006 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.569036 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.569062 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:54Z","lastTransitionTime":"2025-12-05T11:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.671740 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.671854 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.671874 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.671912 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.671929 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:54Z","lastTransitionTime":"2025-12-05T11:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.774864 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.774929 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.774952 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.774999 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.775025 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:54Z","lastTransitionTime":"2025-12-05T11:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.878041 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.878111 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.878125 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.878141 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.878154 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:54Z","lastTransitionTime":"2025-12-05T11:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.980648 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.980686 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.980697 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.980714 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:54 crc kubenswrapper[4763]: I1205 11:49:54.980727 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:54Z","lastTransitionTime":"2025-12-05T11:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.083541 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.083592 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.083603 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.083626 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.083638 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:55Z","lastTransitionTime":"2025-12-05T11:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.186655 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.186709 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.186724 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.186746 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.186785 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:55Z","lastTransitionTime":"2025-12-05T11:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.289350 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.289466 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.289485 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.289514 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.289529 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:55Z","lastTransitionTime":"2025-12-05T11:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.393243 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.393281 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.393291 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.393307 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.393318 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:55Z","lastTransitionTime":"2025-12-05T11:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.495926 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.495984 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.496000 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.496024 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.496082 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:55Z","lastTransitionTime":"2025-12-05T11:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.599321 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.600229 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.600307 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.600433 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.600519 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:55Z","lastTransitionTime":"2025-12-05T11:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.704325 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.704630 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.704737 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.704846 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.704915 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:55Z","lastTransitionTime":"2025-12-05T11:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.782973 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.783037 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:49:55 crc kubenswrapper[4763]: E1205 11:49:55.783106 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.783144 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:49:55 crc kubenswrapper[4763]: E1205 11:49:55.783265 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:49:55 crc kubenswrapper[4763]: E1205 11:49:55.783374 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.783705 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:49:55 crc kubenswrapper[4763]: E1205 11:49:55.783847 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.800044 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38a2b47fe97303fc60f6ed0de5927b73993234170c4fdb02b45f0762978e292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:55Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.806689 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.806743 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.806794 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.806819 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.806836 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:55Z","lastTransitionTime":"2025-12-05T11:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.824049 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b42a5472-7487-4146-87a1-b83999821399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://455df6b1dbc6c362e75d34ddc09347b0c7b31571afb56538b9b7768a5abb50d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455df6b1dbc6c362e75d34ddc09347b0c7b31571afb56538b9b7768a5abb50d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T11:49:29Z\\\",\\\"message\\\":\\\"ring]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/downloads\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.213\\\\\\\", Port:80, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 11:49:29.602160 6459 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI1205 11:49:29.602174 6459 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nF1205 11:49:29.602174 6459 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call 
webh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-xbr2p_openshift-ovn-kubernetes(b42a5472-7487-4146-87a1-b83999821399)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiv
eReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbr2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:55Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.846409 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T11:48:54Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 11:48:54.822889 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 11:48:54.823032 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 11:48:54.823055 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 11:48:54.823033 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 11:48:54.823371 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764935318\\\\\\\\\\\\\\\" (2025-12-05 11:48:38 +0000 UTC to 2026-01-04 11:48:39 +0000 UTC (now=2025-12-05 11:48:54.823293564 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823498 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\"\\\\nI1205 11:48:54.823746 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764935329\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764935329\\\\\\\\\\\\\\\" (2025-12-05 10:48:49 +0000 UTC to 2026-12-05 10:48:49 +0000 UTC (now=2025-12-05 11:48:54.823700006 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823836 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 11:48:54.823876 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 11:48:54.823905 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1205 11:48:54.826863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:55Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.861201 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04d4b8c3-b148-45da-b0d1-412870026308\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://745228c1f753278be19f5bb7197fd2dd01178d667213f59c8933c18a5d666c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ff65480597f13016cbd96d14661a9cd428eff4ca644c88451ef9c28b1e015f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9316ee7cdcfff2c1c600186568802e8c4d39ddcb029e17d8f109bb9f813e8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54939667d3da9326369dc17207cec7eb95f69fe46040d3dd06d10dd3df8cce0\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54939667d3da9326369dc17207cec7eb95f69fe46040d3dd06d10dd3df8cce0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:55Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.878366 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafa237bfeb83d5895f8ad73cf881e1fd946ec837402ea271d4aa1b75469da0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:55Z is after 
2025-08-24T17:21:41Z" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.894119 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:55Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.908689 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.908730 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.908742 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.908779 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.908791 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:55Z","lastTransitionTime":"2025-12-05T11:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.913069 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b31fb483-2a86-4e31-a611-ef92a9a4840f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452ef64735a1911b994bd2bfbf1702e3ff68648cd0b834c254dd24f0b17cd9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0370cc7bfd0ce95636924667b4e82dfffd0eee7f07b78d683b386bfee5dc47b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1951ae154cb3c61f4cf3cadec35e19591efe1f768993d1e51713195175547b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"reso
urce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a3a0d8531f8696597e2389d8cd7b50130de93ac8be572b329ebe2eff8c6b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:55Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.926424 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kwkp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"737ae453-c22e-41ea-a10e-7e8f1f165467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e4880a3acae535beea88e11057496d98b2ddd8d2161f75c52dd4527425a83d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T11:49:41Z\\\",\\\"message\\\":\\\"2025-12-05T11:48:56+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_a7099953-0e08-436a-aafd-3159a86e6eef\\\\n2025-12-05T11:48:56+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a7099953-0e08-436a-aafd-3159a86e6eef to /host/opt/cni/bin/\\\\n2025-12-05T11:48:56Z [verbose] multus-daemon started\\\\n2025-12-05T11:48:56Z [verbose] Readiness Indicator file check\\\\n2025-12-05T11:49:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q79tg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kwkp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:55Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.942053 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptcsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d71e854a-fd6a-4efa-9cf4-a5dc75a1901f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40808a5c1a367c907ecf165483a1cc5fc59027c61342399abb612be6fed84fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g4jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d865b70b752e983282ac08c190f254185cbf7bba8d13c0238261fd7b880ac651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g4jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:49:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ptcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:55Z is after 2025-08-24T17:21:41Z" Dec 05 
11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.961058 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c630804aeb54582259498705c520ab6d2c191844861069ba7090226028a5716f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-92qpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:55Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.984531 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78167d26-8492-4035-99de-987c99ab12e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b874a765fa9a075a2662b615419d246c610bf2eea128d4bf53c16bda49f38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd875ae389a8e48a2f958f87ef711a4053a54da61f415ef9a380f8b3de9d7012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12b24f5f9e73a1f4871d8d666bf6aabef79ec707d3df055e3e33a1a158dce464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25460658bdce8262dc2a0269161e8377
dcb37f440c09c8eea9f7b397c9d3a5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8aab7caa6c02035af6faae1817d239b12bc2afcb11c7e14c4aea1df60372dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8af
b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:55Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:55 crc kubenswrapper[4763]: I1205 11:49:55.997260 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd4640d76c7ecaa13523f23448f53bc51afe8279734a6bc254b403cf436fe0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7619d39a9f2a46d92fc3ec5ab36f60ec2af3a789b10decb4cbe1cdaef58d28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:55Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.007891 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96338136-6831-49d0-9eb9-77d1205c6afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b662bafa4fee3b2c7cd96bc49d6054b0fbe349e86ce1d4faf967edbaf8f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb8a3a71da68a51eabe1aaa24fffd1b84c35cb01945642622f4100125ee26223\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpgln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:56Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.011202 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.011351 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.011416 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.011492 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.011562 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:56Z","lastTransitionTime":"2025-12-05T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.016942 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gt7x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2515d84f-5782-48a0-9d7e-9704baebe26c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60a4a422ae6f3c2c05e5dc721674edbb0efda81d198ede925d3dd1cf99e09b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxkb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gt7x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:56Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.030157 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:56Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.042856 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:56Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.052643 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gxbp8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e177b5c0a61aeedb5c6bf5f6340f212506106592ad10807ed29d83a4a6949ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9s79r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gxbp8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:56Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.062317 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-x45qv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a135c32b-38e4-43f6-bbb1-d1b8e42156ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hszt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hszt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:49:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-x45qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:56Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:56 crc 
kubenswrapper[4763]: I1205 11:49:56.113875 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.114165 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.114229 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.114301 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.114364 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:56Z","lastTransitionTime":"2025-12-05T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.217164 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.217242 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.217265 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.217294 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.217315 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:56Z","lastTransitionTime":"2025-12-05T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.320081 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.320155 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.320176 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.320207 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.320229 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:56Z","lastTransitionTime":"2025-12-05T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.422749 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.422798 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.422809 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.422825 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.422836 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:56Z","lastTransitionTime":"2025-12-05T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.525016 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.525270 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.525296 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.525324 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.525348 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:56Z","lastTransitionTime":"2025-12-05T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.628682 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.628741 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.628753 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.628797 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.628815 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:56Z","lastTransitionTime":"2025-12-05T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.731652 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.731693 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.731704 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.731720 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.731732 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:56Z","lastTransitionTime":"2025-12-05T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.835334 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.835379 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.835391 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.835411 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.835424 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:56Z","lastTransitionTime":"2025-12-05T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.938983 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.939033 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.939043 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.939060 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:56 crc kubenswrapper[4763]: I1205 11:49:56.939071 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:56Z","lastTransitionTime":"2025-12-05T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.042167 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.042214 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.042233 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.042255 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.042272 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:57Z","lastTransitionTime":"2025-12-05T11:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.144633 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.144690 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.144706 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.144728 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.144743 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:57Z","lastTransitionTime":"2025-12-05T11:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.246858 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.246909 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.246919 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.246935 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.246946 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:57Z","lastTransitionTime":"2025-12-05T11:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.350841 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.350920 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.350940 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.350974 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.350993 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:57Z","lastTransitionTime":"2025-12-05T11:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.454431 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.455038 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.455270 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.455496 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.455719 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:57Z","lastTransitionTime":"2025-12-05T11:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.559251 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.559502 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.559535 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.559574 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.559601 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:57Z","lastTransitionTime":"2025-12-05T11:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.662432 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.662503 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.662521 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.662550 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.662568 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:57Z","lastTransitionTime":"2025-12-05T11:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.765873 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.765933 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.765955 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.765985 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.766008 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:57Z","lastTransitionTime":"2025-12-05T11:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.783576 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.783634 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.783647 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:49:57 crc kubenswrapper[4763]: E1205 11:49:57.783745 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:49:57 crc kubenswrapper[4763]: E1205 11:49:57.783880 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.783910 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:49:57 crc kubenswrapper[4763]: E1205 11:49:57.784014 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:49:57 crc kubenswrapper[4763]: E1205 11:49:57.784126 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.869559 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.869611 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.869627 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.869653 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.869671 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:57Z","lastTransitionTime":"2025-12-05T11:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.972454 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.972512 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.972531 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.972555 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:57 crc kubenswrapper[4763]: I1205 11:49:57.972572 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:57Z","lastTransitionTime":"2025-12-05T11:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.075147 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.075202 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.075223 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.075253 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.075274 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:58Z","lastTransitionTime":"2025-12-05T11:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.178517 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.178580 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.178602 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.178632 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.178655 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:58Z","lastTransitionTime":"2025-12-05T11:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.281563 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.281640 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.281655 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.281671 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.281682 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:58Z","lastTransitionTime":"2025-12-05T11:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.384395 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.384445 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.384459 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.384476 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.384488 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:58Z","lastTransitionTime":"2025-12-05T11:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.486528 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.486567 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.486578 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.486597 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.486609 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:58Z","lastTransitionTime":"2025-12-05T11:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.589436 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.589466 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.589475 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.589487 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.589501 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:58Z","lastTransitionTime":"2025-12-05T11:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.693052 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.693093 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.693103 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.693135 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.693145 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:58Z","lastTransitionTime":"2025-12-05T11:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.784892 4763 scope.go:117] "RemoveContainer" containerID="455df6b1dbc6c362e75d34ddc09347b0c7b31571afb56538b9b7768a5abb50d9" Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.795566 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.795623 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.795639 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.795669 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.795686 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:58Z","lastTransitionTime":"2025-12-05T11:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.899131 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.899187 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.899205 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.899237 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:58 crc kubenswrapper[4763]: I1205 11:49:58.899261 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:58Z","lastTransitionTime":"2025-12-05T11:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.001928 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.002352 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.002580 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.002860 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.003080 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:59Z","lastTransitionTime":"2025-12-05T11:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.106113 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.106176 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.106193 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.106216 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.106233 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:59Z","lastTransitionTime":"2025-12-05T11:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.210297 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.210468 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.210576 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.210614 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.210643 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:59Z","lastTransitionTime":"2025-12-05T11:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.258950 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.259015 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.259038 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.259058 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.259075 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:59Z","lastTransitionTime":"2025-12-05T11:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:59 crc kubenswrapper[4763]: E1205 11:49:59.279751 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"424a6449-0727-4c68-964f-19010b8ff35b\\\",\\\"systemUUID\\\":\\\"0f54ec5d-9350-4cf9-9a1e-213de5460351\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:59Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.285332 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.285400 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.285424 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.285454 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.285475 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:59Z","lastTransitionTime":"2025-12-05T11:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:59 crc kubenswrapper[4763]: E1205 11:49:59.305714 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"424a6449-0727-4c68-964f-19010b8ff35b\\\",\\\"systemUUID\\\":\\\"0f54ec5d-9350-4cf9-9a1e-213de5460351\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:59Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.310180 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.310246 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
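Every retry in this burst fails identically: the kubelet's status PATCH for node crc is intercepted by the node.network-node-identity.openshift.io webhook, and the certificate that webhook serves on 127.0.0.1:9743 expired on 2025-08-24T17:21:41Z while the node clock reads 2025-12-05. The patch payload itself (elided above as {…}; it repeats the same conditions, allocatable/capacity figures, image list, and nodeInfo verbatim on every attempt) is not the problem. A minimal way to confirm the expiry from the node, assuming the openssl CLI is available there (host and port are taken from the Post URL in the error line):

    # Print the identity and validity window of the cert the webhook endpoint serves
    openssl s_client -connect 127.0.0.1:9743 </dev/null 2>/dev/null \
      | openssl x509 -noout -subject -dates

If notAfter prints as Aug 24 17:21:41 2025 GMT, the failure is the certificate itself rather than clock skew on the node.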
event="NodeHasNoDiskPressure" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.310269 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.310299 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.310322 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:59Z","lastTransitionTime":"2025-12-05T11:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:59 crc kubenswrapper[4763]: E1205 11:49:59.330843 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"424a6449-0727-4c68-964f-19010b8ff35b\\\",\\\"systemUUID\\\":\\\"0f54ec5d-9350-4cf9-9a1e-213de5460351\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:59Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.335803 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.335856 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
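Interleaved with the patch retries, the kubelet keeps republishing Ready=False, and the condition message names a second, independent symptom: no CNI configuration file exists in /etc/kubernetes/cni/net.d/, so the container runtime network is not ready. On OpenShift that file is normally written by the network provider (OVN-Kubernetes) once its pods come up, which they cannot do while node updates are being rejected. A quick check of both ends, assuming oc access and the usual openshift-ovn-kubernetes namespace:

    # Directory named in the kubelet message; empty means no provider has written a config yet
    ls -l /etc/kubernetes/cni/net.d/
    # Are the network provider pods running at all? (namespace is the OVN-Kubernetes default)
    oc get pods -n openshift-ovn-kubernetes -o wide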
event="NodeHasNoDiskPressure" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.335872 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.335895 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.335915 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:59Z","lastTransitionTime":"2025-12-05T11:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:59 crc kubenswrapper[4763]: E1205 11:49:59.355331 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"424a6449-0727-4c68-964f-19010b8ff35b\\\",\\\"systemUUID\\\":\\\"0f54ec5d-9350-4cf9-9a1e-213de5460351\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:59Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.360227 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.360278 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
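The kubelet does not loop on this indefinitely: the upstream kubelet attempts the status update a fixed number of times per sync (nodeStatusUpdateRetry, five attempts) and then gives up until the next sync period, which is exactly the "update node status exceeds retry count" line recorded below. The flapping conditions can also be read back from the API side; a sketch, assuming oc still has working credentials against this cluster:

    # One TYPE=STATUS pair per node condition; expect Ready=False while the CNI config is absent
    oc get node crc -o jsonpath='{range .status.conditions[*]}{.type}={.status}{"\n"}{end}'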
event="NodeHasNoDiskPressure" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.360294 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.360318 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.360336 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:59Z","lastTransitionTime":"2025-12-05T11:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:59 crc kubenswrapper[4763]: E1205 11:49:59.376246 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"424a6449-0727-4c68-964f-19010b8ff35b\\\",\\\"systemUUID\\\":\\\"0f54ec5d-9350-4cf9-9a1e-213de5460351\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:49:59Z is after 2025-08-24T17:21:41Z" Dec 05 11:49:59 crc kubenswrapper[4763]: E1205 11:49:59.377098 4763 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.378674 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.378715 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.378730 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.378749 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.378782 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:59Z","lastTransitionTime":"2025-12-05T11:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.481854 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.481924 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.481941 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.481969 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.481987 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:59Z","lastTransitionTime":"2025-12-05T11:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.584719 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.584807 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.584825 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.584853 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.584870 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:59Z","lastTransitionTime":"2025-12-05T11:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.687837 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.687913 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.687939 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.687970 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.687991 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:59Z","lastTransitionTime":"2025-12-05T11:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.719353 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 11:49:59 crc kubenswrapper[4763]: E1205 11:49:59.719517 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:03.7194805 +0000 UTC m=+148.212195253 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.719609 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.719713 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:49:59 crc kubenswrapper[4763]: E1205 11:49:59.719928 4763 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 11:49:59 crc kubenswrapper[4763]: E1205 11:49:59.720020 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 11:51:03.719999154 +0000 UTC m=+148.212713917 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 11:49:59 crc kubenswrapper[4763]: E1205 11:49:59.720093 4763 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 11:49:59 crc kubenswrapper[4763]: E1205 11:49:59.720217 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 11:51:03.720196166 +0000 UTC m=+148.212910899 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.783740 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.783863 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.783908 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:49:59 crc kubenswrapper[4763]: E1205 11:49:59.784072 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.784164 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:49:59 crc kubenswrapper[4763]: E1205 11:49:59.784360 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:49:59 crc kubenswrapper[4763]: E1205 11:49:59.784494 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:49:59 crc kubenswrapper[4763]: E1205 11:49:59.784621 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.790562 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.790819 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.790900 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.790973 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.791043 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:59Z","lastTransitionTime":"2025-12-05T11:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.820959 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:49:59 crc kubenswrapper[4763]: E1205 11:49:59.821180 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 11:49:59 crc kubenswrapper[4763]: E1205 11:49:59.821232 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 11:49:59 crc kubenswrapper[4763]: E1205 11:49:59.821253 4763 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 11:49:59 crc kubenswrapper[4763]: E1205 11:49:59.821327 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 11:51:03.821303089 +0000 UTC m=+148.314017852 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 11:49:59 crc kubenswrapper[4763]: E1205 11:49:59.821364 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 11:49:59 crc kubenswrapper[4763]: E1205 11:49:59.821383 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 11:49:59 crc kubenswrapper[4763]: E1205 11:49:59.821392 4763 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 11:49:59 crc kubenswrapper[4763]: E1205 11:49:59.821431 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 11:51:03.82142169 +0000 UTC m=+148.314136413 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.821213 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.894221 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.894265 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.894277 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.894295 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.894306 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:59Z","lastTransitionTime":"2025-12-05T11:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.996562 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.996622 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.996632 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.996645 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:49:59 crc kubenswrapper[4763]: I1205 11:49:59.996654 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:49:59Z","lastTransitionTime":"2025-12-05T11:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.099956 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.099997 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.100009 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.100026 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.100042 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:00Z","lastTransitionTime":"2025-12-05T11:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.202521 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.202560 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.202571 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.202587 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.202597 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:00Z","lastTransitionTime":"2025-12-05T11:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.257876 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbr2p_b42a5472-7487-4146-87a1-b83999821399/ovnkube-controller/2.log" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.261095 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" event={"ID":"b42a5472-7487-4146-87a1-b83999821399","Type":"ContainerStarted","Data":"93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba"} Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.261618 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.287858 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b42a5472-7487-4146-87a1-b83999821399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ce5fa785d1ff2598ce1df34c5b549cb2cc1963
773e55f3e1e1dea2e187b7ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455df6b1dbc6c362e75d34ddc09347b0c7b31571afb56538b9b7768a5abb50d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T11:49:29Z\\\",\\\"message\\\":\\\"ring]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/downloads\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.213\\\\\\\", Port:80, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 11:49:29.602160 6459 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI1205 11:49:29.602174 6459 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nF1205 11:49:29.602174 6459 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call 
webh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"c
ontainerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbr2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:00Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.302185 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T11:48:54Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 11:48:54.822889 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 11:48:54.823032 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 11:48:54.823055 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 11:48:54.823033 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 11:48:54.823371 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764935318\\\\\\\\\\\\\\\" (2025-12-05 11:48:38 +0000 UTC to 2026-01-04 11:48:39 +0000 UTC (now=2025-12-05 11:48:54.823293564 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823498 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\"\\\\nI1205 11:48:54.823746 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764935329\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764935329\\\\\\\\\\\\\\\" (2025-12-05 10:48:49 +0000 UTC to 2026-12-05 10:48:49 +0000 UTC (now=2025-12-05 11:48:54.823700006 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823836 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 11:48:54.823876 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 11:48:54.823905 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1205 11:48:54.826863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:00Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.304902 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.304940 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.304953 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.304971 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.304982 4763 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:00Z","lastTransitionTime":"2025-12-05T11:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.313219 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04d4b8c3-b148-45da-b0d1-412870026308\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://745228c1f753278be19f5bb7197fd2dd01178d667213f59c8933c18a5d666c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ff65480597f13016cbd96d14661a9cd428eff4ca644c88451ef9c28b1e015f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9316ee7cdcfff2c1c600186568802e8c4d39ddcb029e17d8f109bb9f813e8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54939667d3da9326369dc17207cec7eb95f69fe46040d3dd06d10dd3df8cce0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54939667d3da9326369dc17207cec7eb95f69fe46040d3dd06d10dd3df8cce0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:00Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.324062 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafa237bfeb83d5895f8ad73cf881e1fd946ec837402ea271d4aa1b75469da0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:00Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.333920 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:00Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.345949 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38a2b47fe97303fc60f6ed0de5927b73993234170c4fdb02b45f0762978e292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:00Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.356509 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b31fb483-2a86-4e31-a611-ef92a9a4840f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452ef64735a1911b994bd2bfbf1702e3ff68648cd0b834c254dd24f0b17cd9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0370cc7bfd0ce95636924667b4e82dfffd0eee7f07b78d683b386bfee5dc47b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1951ae154cb3c61f4cf3cadec35e19591efe1f768993d1e51713195175547b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a3a0d8531f8696597e2389d8cd7b50130de93ac8be572b329ebe2eff8c6b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:00Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.369516 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kwkp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"737ae453-c22e-41ea-a10e-7e8f1f165467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e4880a3acae535beea88e11057496d98b2ddd8d2161f75c52dd4527425a83d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T11:49:41Z\\\",\\\"message\\\":\\\"2025-12-05T11:48:56+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a7099953-0e08-436a-aafd-3159a86e6eef\\\\n2025-12-05T11:48:56+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a7099953-0e08-436a-aafd-3159a86e6eef to /host/opt/cni/bin/\\\\n2025-12-05T11:48:56Z [verbose] multus-daemon started\\\\n2025-12-05T11:48:56Z [verbose] Readiness 
Indicator file check\\\\n2025-12-05T11:49:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q79tg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kwkp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:00Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.382803 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptcsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d71e854a-fd6a-4efa-9cf4-a5dc75a1901f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40808a5c1a367c907ecf165483a1cc5fc59027c61342399abb612be6fed84fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g4jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d865b70b752e983282ac08c190f254185cbf7bba8d13c0238261fd7b880ac651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g4jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:49:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ptcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:00Z is after 2025-08-24T17:21:41Z" Dec 05 
11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.407387 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.407432 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.407446 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.407466 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.407480 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:00Z","lastTransitionTime":"2025-12-05T11:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.416285 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78167d26-8492-4035-99de-987c99ab12e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b874a765fa9a075a2662b615419d246c610bf2eea128d4bf53c16bda49f38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd875ae389a8e48a2f958f87ef711a4053a54da61f415ef9a380f8b3de9d7012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9009
2272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12b24f5f9e73a1f4871d8d666bf6aabef79ec707d3df055e3e33a1a158dce464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25460658bdce8262dc2a0269161e8377dcb37f440c09c8eea9f7b397c9d3a5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8aab7caa6c02035af6faae1817d239b12bc2afcb11c7e14c4aea1df60372dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:00Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.430368 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd4640d76c7ecaa13523f23448f53bc51afe8279734a6bc254b403cf436fe0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7619d39a9f2a46d92fc3ec5ab36f60ec2af3a789b10decb4cbe1cdaef58d28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:00Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.443093 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96338136-6831-49d0-9eb9-77d1205c6afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b662bafa4fee3b2c7cd96bc49d6054b0fbe349e86ce1d4faf967edbaf8f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb8a3a71da68a51eabe1aaa24fffd1b84c35cb01945642622f4100125ee26223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpgln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:00Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.453714 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-gt7x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2515d84f-5782-48a0-9d7e-9704baebe26c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60a4a422ae6f3c2c05e5dc721674edbb0efda81d198ede925d3dd1cf99e09b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxkb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gt7x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:00Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.468974 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c630804aeb54582259498705c520ab6d2c191844861069ba7090226028a5716f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-92qpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:00Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.482726 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:00Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.496538 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:00Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.509287 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gxbp8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e177b5c0a61aeedb5c6bf5f6340f212506106592ad10807ed29d83a4a6949ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-9s79r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gxbp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:00Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.510337 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.510370 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.510392 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.510408 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.510420 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:00Z","lastTransitionTime":"2025-12-05T11:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.522912 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-x45qv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a135c32b-38e4-43f6-bbb1-d1b8e42156ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hszt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hszt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:49:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-x45qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:00Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.612721 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.612801 4763 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.612814 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.612832 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.612843 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:00Z","lastTransitionTime":"2025-12-05T11:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.714997 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.715044 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.715059 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.715081 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.715095 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:00Z","lastTransitionTime":"2025-12-05T11:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.817303 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.817348 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.817357 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.817375 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.817383 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:00Z","lastTransitionTime":"2025-12-05T11:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.920847 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.920914 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.920951 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.920982 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:50:00 crc kubenswrapper[4763]: I1205 11:50:00.921005 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:00Z","lastTransitionTime":"2025-12-05T11:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.023885 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.023931 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.023941 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.023955 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.023965 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:01Z","lastTransitionTime":"2025-12-05T11:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.127179 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.127269 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.127293 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.127329 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.127352 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:01Z","lastTransitionTime":"2025-12-05T11:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
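The five-event cycle above repeats at roughly 100 ms intervals: the kubelet keeps re-publishing node conditions while the container runtime answers that no CNI network configuration has been written to /etc/kubernetes/cni/net.d/ yet. A minimal Go sketch of that kind of readiness probe follows; the directory path is taken from the log message, while hasCNIConfig and its file-extension filter are illustrative stand-ins, not the actual kubelet/CRI-O implementation.

```go
// Minimal sketch of the readiness test behind "no CNI configuration file":
// the runtime reports NetworkReady=false until at least one CNI config
// file appears in the conf dir. Illustrative only, under the assumptions
// stated above.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// hasCNIConfig reports whether dir contains at least one CNI network
// configuration file (.conf, .conflist, or .json, per CNI convention).
func hasCNIConfig(dir string) (bool, error) {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return false, err
	}
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			return true, nil
		}
	}
	return false, nil
}

func main() {
	ok, err := hasCNIConfig("/etc/kubernetes/cni/net.d")
	if err != nil || !ok {
		// Mirrors the condition the kubelet keeps republishing above.
		fmt.Println("NetworkReady=false reason:NetworkPluginNotReady (no CNI configuration file)")
		return
	}
	fmt.Println("NetworkReady=true")
}
```

Run against the node's conf dir, such a check keeps failing until the network plugin (here, OVN-Kubernetes) writes its config file, which is exactly why the NotReady condition persists below.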
Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.229356 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.229417 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.229429 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.229449 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.229463 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:01Z","lastTransitionTime":"2025-12-05T11:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.265742 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbr2p_b42a5472-7487-4146-87a1-b83999821399/ovnkube-controller/3.log"
Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.266445 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbr2p_b42a5472-7487-4146-87a1-b83999821399/ovnkube-controller/2.log"
Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.268973 4763 generic.go:334] "Generic (PLEG): container finished" podID="b42a5472-7487-4146-87a1-b83999821399" containerID="93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba" exitCode=1
Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.269006 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" event={"ID":"b42a5472-7487-4146-87a1-b83999821399","Type":"ContainerDied","Data":"93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba"}
Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.269058 4763 scope.go:117] "RemoveContainer" containerID="455df6b1dbc6c362e75d34ddc09347b0c7b31571afb56538b9b7768a5abb50d9"
Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.270026 4763 scope.go:117] "RemoveContainer" containerID="93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba"
Dec 05 11:50:01 crc kubenswrapper[4763]: E1205 11:50:01.273008 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xbr2p_openshift-ovn-kubernetes(b42a5472-7487-4146-87a1-b83999821399)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" podUID="b42a5472-7487-4146-87a1-b83999821399"
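The "Failed to update status for pod" entries before and after this point all share one root cause: the kubelet's status PATCH is intercepted by the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743, whose serving certificate expired on 2025-08-24T17:21:41Z while the node clock reads 2025-12-05, so the TLS handshake fails before any patch is applied. A small Go diagnostic sketch follows, assuming only the endpoint from the log; it is illustrative, not an OpenShift tool. It fetches the peer certificate without verifying it and prints the same before/after comparison the kubelet logs.

```go
// Diagnostic sketch for the recurring webhook failure: inspect the serving
// certificate's validity window at the logged endpoint. Illustrative only.
package main

import (
	"crypto/tls"
	"fmt"
	"os"
	"time"
)

func main() {
	// Skip verification on purpose: the point is to inspect the expired
	// certificate, which a verifying handshake would reject outright.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		fmt.Fprintln(os.Stderr, "dial:", err)
		os.Exit(1)
	}
	defer conn.Close()

	// First peer certificate is the leaf presented by the webhook server.
	cert := conn.ConnectionState().PeerCertificates[0]
	now := time.Now().UTC()
	fmt.Printf("notBefore: %s\n", cert.NotBefore.Format(time.RFC3339))
	fmt.Printf("notAfter:  %s\n", cert.NotAfter.Format(time.RFC3339))
	switch {
	case now.After(cert.NotAfter):
		// The case in this log: 2025-12-05T11:50:01Z is after 2025-08-24T17:21:41Z.
		fmt.Printf("expired: current time %s is after %s\n", now.Format(time.RFC3339), cert.NotAfter.Format(time.RFC3339))
	case now.Before(cert.NotBefore):
		fmt.Printf("not yet valid: current time %s is before %s\n", now.Format(time.RFC3339), cert.NotBefore.Format(time.RFC3339))
	default:
		fmt.Println("certificate is within its validity window")
	}
}
```

Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.285255 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptcsb" err="failed to patch status 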
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d71e854a-fd6a-4efa-9cf4-a5dc75a1901f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40808a5c1a367c907ecf165483a1cc5fc59027c61342399abb612be6fed84fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g4jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d865b70b752e983282ac08c190f254185cbf7bba8d13c0238261fd7b880ac651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g4jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:49:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ptcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:01Z is after 2025-08-24T17:21:41Z" Dec 05 
11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.296669 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b31fb483-2a86-4e31-a611-ef92a9a4840f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452ef64735a1911b994bd2bfbf1702e3ff68648cd0b834c254dd24f0b17cd9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0370cc7bfd0ce95636924667b4e82dfffd0eee7f07b78d683b386bfee5dc47b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1951ae154cb3c61f4cf3cadec35e19591efe1f768993d1e51713195175547b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a3a0d8531f8696597e2389d8cd7b50130de93ac8be572b329ebe2eff8c6b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.307530 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kwkp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"737ae453-c22e-41ea-a10e-7e8f1f165467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e4880a3acae535beea88e11057496d98b2ddd8d2161f75c52dd4527425a83d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T11:49:41Z\\\",\\\"message\\\":\\\"2025-12-05T11:48:56+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a7099953-0e08-436a-aafd-3159a86e6eef\\\\n2025-12-05T11:48:56+00:00 [cnibincopy] 
Successfully moved files in /host/opt/cni/bin/upgrade_a7099953-0e08-436a-aafd-3159a86e6eef to /host/opt/cni/bin/\\\\n2025-12-05T11:48:56Z [verbose] multus-daemon started\\\\n2025-12-05T11:48:56Z [verbose] Readiness Indicator file check\\\\n2025-12-05T11:49:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q79tg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kwkp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.318451 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gt7x4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2515d84f-5782-48a0-9d7e-9704baebe26c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60a4a422ae6f3c2c05e5dc721674edbb0efda81d198ede925d3dd1cf99e09b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxkb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gt7x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.331923 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.331959 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.331972 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.331987 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.331999 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:01Z","lastTransitionTime":"2025-12-05T11:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.332462 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c630804aeb54582259498705c520ab6d2c191844861069ba7090226028a5716f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e02e58
2837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
ntrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-92qpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.351120 4763 
status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78167d26-8492-4035-99de-987c99ab12e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b874a765fa9a075a2662b615419d246c610bf2eea128d4bf53c16bda49f38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd875ae389a8e48a2f958f87ef711a4053a54da61f415ef9a380f8b3de9d7012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12b24f5f9e73a1f4871d8d666bf6aabef79ec707d3df055e3e33a1a158dce464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-
certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25460658bdce8262dc2a0269161e8377dcb37f440c09c8eea9f7b397c9d3a5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8aab7caa6c02035af6faae1817d239b12bc2afcb11c7e14c4aea1df60372dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:
48:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.363542 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd4640d76c7ecaa13523f23448f53bc51afe8279734a6bc254b403cf436fe0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7619d39a9f2a46d92fc3ec5ab36f60ec2af3a789b10decb4cbe1cdaef58d28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.380247 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96338136-6831-49d0-9eb9-77d1205c6afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b662bafa4fee3b2c7cd96bc49d6054b0fbe349e86ce1d4faf967edbaf8f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb8a3a71da68a51eabe1aaa24fffd1b84c35cb01945642622f4100125ee26223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpgln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.389694 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-x45qv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a135c32b-38e4-43f6-bbb1-d1b8e42156ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hszt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hszt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:49:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-x45qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.401567 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.413933 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.424051 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gxbp8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e177b5c0a61aeedb5c6bf5f6340f212506106592ad10807ed29d83a4a6949ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-9s79r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gxbp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.434061 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.434106 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.434118 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.434151 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.434167 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:01Z","lastTransitionTime":"2025-12-05T11:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.436911 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.448706 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38a2b47fe97303fc60f6ed0de5927b73993234170c4fdb02b45f0762978e292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.470755 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b42a5472-7487-4146-87a1-b83999821399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455df6b1dbc6c362e75d34ddc09347b0c7b31571afb56538b9b7768a5abb50d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T11:49:29Z\\\",\\\"message\\\":\\\"ring]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/downloads\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.213\\\\\\\", Port:80, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 11:49:29.602160 6459 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI1205 11:49:29.602174 6459 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nF1205 11:49:29.602174 6459 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T11:50:00Z\\\",\\\"message\\\":\\\"trics for network=default are: map[]\\\\nI1205 11:50:00.504070 6870 
services_controller.go:443] Built service openshift-config-operator/metrics LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.161\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1205 11:50:00.503104 6870 services_controller.go:360] Finished syncing service ovn-kubernetes-control-plane on namespace openshift-ovn-kubernetes for network=default : 11.05µs\\\\nI1205 11:50:00.504110 6870 services_controller.go:444] Built service openshift-config-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nF1205 11:50:00.504112 6870 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc\\\",\\\"image\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbr2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.485352 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T11:48:54Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 11:48:54.822889 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 11:48:54.823032 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 11:48:54.823055 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 11:48:54.823033 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 11:48:54.823371 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764935318\\\\\\\\\\\\\\\" (2025-12-05 11:48:38 +0000 UTC to 2026-01-04 11:48:39 +0000 UTC (now=2025-12-05 11:48:54.823293564 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823498 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\"\\\\nI1205 11:48:54.823746 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764935329\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764935329\\\\\\\\\\\\\\\" (2025-12-05 10:48:49 +0000 UTC to 2026-12-05 10:48:49 +0000 UTC (now=2025-12-05 11:48:54.823700006 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823836 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 11:48:54.823876 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 11:48:54.823905 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1205 11:48:54.826863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.497979 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04d4b8c3-b148-45da-b0d1-412870026308\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://745228c1f753278be19f5bb7197fd2dd01178d667213f59c8933c18a5d666c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ff65480597f13016cbd96d14661a9cd428eff4ca644c88451ef9c28b1e015f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9316ee7cdcfff2c1c600186568802e8c4d39ddcb029e17d8f109bb9f813e8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54939667d3da9326369dc17207cec7eb95f69fe46040d3dd06d10dd3df8cce0\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54939667d3da9326369dc17207cec7eb95f69fe46040d3dd06d10dd3df8cce0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:01Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.509289 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafa237bfeb83d5895f8ad73cf881e1fd946ec837402ea271d4aa1b75469da0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:01Z is after 
2025-08-24T17:21:41Z" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.536952 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.537003 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.537016 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.537034 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.537048 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:01Z","lastTransitionTime":"2025-12-05T11:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.639846 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.639914 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.639934 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.639961 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.639985 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:01Z","lastTransitionTime":"2025-12-05T11:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.742102 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.742158 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.742175 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.742196 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.742210 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:01Z","lastTransitionTime":"2025-12-05T11:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.783350 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.783564 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.783591 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:50:01 crc kubenswrapper[4763]: E1205 11:50:01.783822 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.783865 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:50:01 crc kubenswrapper[4763]: E1205 11:50:01.783994 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:50:01 crc kubenswrapper[4763]: E1205 11:50:01.784155 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:50:01 crc kubenswrapper[4763]: E1205 11:50:01.784307 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.844596 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.844640 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.844653 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.844670 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.844682 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:01Z","lastTransitionTime":"2025-12-05T11:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.947410 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.947453 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.947462 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.947478 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:01 crc kubenswrapper[4763]: I1205 11:50:01.947487 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:01Z","lastTransitionTime":"2025-12-05T11:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.050030 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.050072 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.050081 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.050100 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.050111 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:02Z","lastTransitionTime":"2025-12-05T11:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.152937 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.153009 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.153041 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.153071 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.153092 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:02Z","lastTransitionTime":"2025-12-05T11:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.255332 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.255386 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.255399 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.255417 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.255434 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:02Z","lastTransitionTime":"2025-12-05T11:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.275076 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbr2p_b42a5472-7487-4146-87a1-b83999821399/ovnkube-controller/3.log" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.282403 4763 scope.go:117] "RemoveContainer" containerID="93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba" Dec 05 11:50:02 crc kubenswrapper[4763]: E1205 11:50:02.282620 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xbr2p_openshift-ovn-kubernetes(b42a5472-7487-4146-87a1-b83999821399)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" podUID="b42a5472-7487-4146-87a1-b83999821399" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.302141 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b31fb483-2a86-4e31-a611-ef92a9a4840f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452ef64735a1911b994bd2bfbf1702e3ff68648cd0b834c254dd24f0b17cd9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0370cc7bfd0ce95636924667b4e82dfffd0eee7f07b78d683b386bfee5dc47b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":
\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1951ae154cb3c61f4cf3cadec35e19591efe1f768993d1e51713195175547b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a3a0d8531f8696597e2389d8cd7b50130de93ac8be572b329ebe2eff8c6b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:02Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.322124 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kwkp4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"737ae453-c22e-41ea-a10e-7e8f1f165467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e4880a3acae535beea88e11057496d98b2ddd8d2161f75c52dd4527425a83d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T11:49:41Z\\\",\\\"message\\\":\\\"2025-12-05T11:48:56+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a7099953-0e08-436a-aafd-3159a86e6eef\\\\n2025-12-05T11:48:56+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a7099953-0e08-436a-aafd-3159a86e6eef to /host/opt/cni/bin/\\\\n2025-12-05T11:48:56Z [verbose] multus-daemon started\\\\n2025-12-05T11:48:56Z [verbose] Readiness Indicator file check\\\\n2025-12-05T11:49:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q79tg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kwkp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:02Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.339822 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptcsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d71e854a-fd6a-4efa-9cf4-a5dc75a1901f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40808a5c1a367c907ecf165483a1cc5fc59027c61342399abb612be6fed84fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g4jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d865b70b752e983282ac08c190f254185cbf7bba8d13c0238261fd7b880ac651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9g4jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:49:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ptcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:02Z is after 2025-08-24T17:21:41Z" Dec 05 
11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.359211 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.359307 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.359358 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.359397 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.359411 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:02Z","lastTransitionTime":"2025-12-05T11:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.361637 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-92qpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb92473b-8e13-46cd-9c26-9ef67d1d6e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c630804aeb54582259498705c520ab6d2c191844861069ba7090226028a5716f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17983755afff88fdeb55e97265e8b33c9f058bc88d38cf9740cba99a11c28afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e02e582837d7836f9f67667b9cff2823bc06265ece22895bf9279467816552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dad2dc2557bb3e2aca5e7e7f3a9605cbae6c45ed3d77b32cd38fbae3f7c617c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://125bf0986a0c010496180a171c3ea3b7fae59878fc53cb6bf5fc1894f82fe622\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e95f44498a7dd0ee50dbd86b90f9baa623558c259fd0d8ed02e8d7ada6b99e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4edc68a2019dd5ccfa39b9e940e04bb872938196a9b47dd24485c4d69ec048a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:49:02Z\\\",\
\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpjxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-92qpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:02Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.397676 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78167d26-8492-4035-99de-987c99ab12e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b874a765fa9a075a2662b615419d246c610bf2eea128d4bf53c16bda49f38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd875ae389a8e48a2f958f87ef711a4053a54da61f415ef9a380f8b3de9d7012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runnin
g\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12b24f5f9e73a1f4871d8d666bf6aabef79ec707d3df055e3e33a1a158dce464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25460658bdce8262dc2a0269161e8377dcb37f440c09c8eea9f7b397c9d3a5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8aab7caa6c02035af6faae1817d239b12bc2afcb11c7e14c4aea1df60372dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd284c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa5ae2befb7d771af5f326bd28
4c6812da0030029e3d0e61b1def9e500d22a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7458ffcc6665bf373a83cddf1620b981647b463402eac7f2c4c550307cbc756\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8160f911206a4ac631cfcd40dce62f6ebecd598ab00ecd9822593fd20a8afb5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:02Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.423146 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd4640d76c7ecaa13523f23448f53bc51afe8279734a6bc254b403cf436fe0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7619d39a9f2a46d92fc3ec5ab36f60ec2af3a789b10decb4cbe1cdaef58d28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:02Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.443085 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96338136-6831-49d0-9eb9-77d1205c6afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b662bafa4fee3b2c7cd96bc49d6054b0fbe349e86ce1d4faf967edbaf8f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb8a3a71da68a51eabe1aaa24fffd1b84c35cb01945642622f4100125ee26223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgs2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpgln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:02Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.459733 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-gt7x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2515d84f-5782-48a0-9d7e-9704baebe26c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60a4a422ae6f3c2c05e5dc721674edbb0efda81d198ede925d3dd1cf99e09b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxkb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gt7x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:02Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.462665 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.462906 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.463067 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.463230 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.463394 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:02Z","lastTransitionTime":"2025-12-05T11:50:02Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.480264 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:02Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.501402 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:02Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.517908 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gxbp8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae86f1f4-06e2-47ef-80e3-f692e44cce3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e177b5c0a61aeedb5c6bf5f6340f212506106592ad10807ed29d83a4a6949ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-9s79r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gxbp8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:02Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.539356 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-x45qv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a135c32b-38e4-43f6-bbb1-d1b8e42156ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hszt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hszt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:49:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-x45qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:02Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.558959 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38a2b47fe97303fc60f6ed0de5927b73993234170c4fdb02b45f0762978e292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:02Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.566672 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.566733 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.566751 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.567087 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.567108 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:02Z","lastTransitionTime":"2025-12-05T11:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.587521 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b42a5472-7487-4146-87a1-b83999821399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T11:50:00Z\\\",\\\"message\\\":\\\"trics for network=default are: map[]\\\\nI1205 11:50:00.504070 6870 services_controller.go:443] Built service openshift-config-operator/metrics LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.161\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1205 11:50:00.503104 6870 services_controller.go:360] Finished syncing service ovn-kubernetes-control-plane on namespace openshift-ovn-kubernetes for network=default : 11.05µs\\\\nI1205 11:50:00.504110 6870 services_controller.go:444] Built service openshift-config-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nF1205 11:50:00.504112 6870 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:49:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xbr2p_openshift-ovn-kubernetes(b42a5472-7487-4146-87a1-b83999821399)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nq2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbr2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:02Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.612248 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T11:48:54Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 11:48:54.822889 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 11:48:54.823032 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 11:48:54.823055 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 11:48:54.823033 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 11:48:54.823371 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764935318\\\\\\\\\\\\\\\" (2025-12-05 11:48:38 +0000 UTC to 2026-01-04 11:48:39 +0000 UTC (now=2025-12-05 11:48:54.823293564 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823498 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3383997532/tls.crt::/tmp/serving-cert-3383997532/tls.key\\\\\\\"\\\\nI1205 11:48:54.823746 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764935329\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764935329\\\\\\\\\\\\\\\" (2025-12-05 10:48:49 +0000 UTC to 2026-12-05 10:48:49 +0000 UTC (now=2025-12-05 11:48:54.823700006 +0000 UTC))\\\\\\\"\\\\nI1205 11:48:54.823836 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1205 11:48:54.823876 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1205 11:48:54.823905 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1205 11:48:54.826863 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:02Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.634412 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04d4b8c3-b148-45da-b0d1-412870026308\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://745228c1f753278be19f5bb7197fd2dd01178d667213f59c8933c18a5d666c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ff65480597f13016cbd96d14661a9cd428eff4ca644c88451ef9c28b1e015f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9316ee7cdcfff2c1c600186568802e8c4d39ddcb029e17d8f109bb9f813e8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"res
ource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54939667d3da9326369dc17207cec7eb95f69fe46040d3dd06d10dd3df8cce0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54939667d3da9326369dc17207cec7eb95f69fe46040d3dd06d10dd3df8cce0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T11:48:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T11:48:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T11:48:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:02Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.648733 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafa237bfeb83d5895f8ad73cf881e1fd946ec837402ea271d4aa1b75469da0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T11:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:02Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.663389 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T11:48:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T11:50:02Z is after 2025-08-24T17:21:41Z" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.670870 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.670907 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.670919 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.670942 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.670956 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:02Z","lastTransitionTime":"2025-12-05T11:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.773936 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.773991 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.774000 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.774013 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.774022 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:02Z","lastTransitionTime":"2025-12-05T11:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.797266 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.876665 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.876711 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.876723 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.876743 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.876755 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:02Z","lastTransitionTime":"2025-12-05T11:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.979844 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.979874 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.979882 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.979895 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:02 crc kubenswrapper[4763]: I1205 11:50:02.979902 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:02Z","lastTransitionTime":"2025-12-05T11:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.082919 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.082967 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.082978 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.082997 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.083012 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:03Z","lastTransitionTime":"2025-12-05T11:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.186273 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.186344 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.186365 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.186392 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.186409 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:03Z","lastTransitionTime":"2025-12-05T11:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.289542 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.289606 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.289634 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.289666 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.289688 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:03Z","lastTransitionTime":"2025-12-05T11:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.391936 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.391972 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.391984 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.391999 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.392069 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:03Z","lastTransitionTime":"2025-12-05T11:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.495409 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.495447 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.495461 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.495480 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.495494 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:03Z","lastTransitionTime":"2025-12-05T11:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.598523 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.598564 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.598576 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.598593 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.598606 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:03Z","lastTransitionTime":"2025-12-05T11:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.700822 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.700869 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.700895 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.700913 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.700924 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:03Z","lastTransitionTime":"2025-12-05T11:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.784022 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.784054 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.784301 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:50:03 crc kubenswrapper[4763]: E1205 11:50:03.784392 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:50:03 crc kubenswrapper[4763]: E1205 11:50:03.784442 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:50:03 crc kubenswrapper[4763]: E1205 11:50:03.784543 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.783950 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:50:03 crc kubenswrapper[4763]: E1205 11:50:03.785256 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.803143 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.803172 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.803181 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.803194 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.803204 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:03Z","lastTransitionTime":"2025-12-05T11:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.905531 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.905556 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.905564 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.905578 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:03 crc kubenswrapper[4763]: I1205 11:50:03.905587 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:03Z","lastTransitionTime":"2025-12-05T11:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.008996 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.009114 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.009151 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.009184 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.009203 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:04Z","lastTransitionTime":"2025-12-05T11:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.113009 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.113050 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.113059 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.113076 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.113085 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:04Z","lastTransitionTime":"2025-12-05T11:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.215309 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.215403 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.215421 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.215446 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.215463 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:04Z","lastTransitionTime":"2025-12-05T11:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.317924 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.318024 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.318052 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.318085 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.318112 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:04Z","lastTransitionTime":"2025-12-05T11:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.421072 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.421131 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.421148 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.421172 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.421190 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:04Z","lastTransitionTime":"2025-12-05T11:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.524906 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.525005 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.525024 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.525050 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.525069 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:04Z","lastTransitionTime":"2025-12-05T11:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.627388 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.627425 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.627439 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.627460 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.627472 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:04Z","lastTransitionTime":"2025-12-05T11:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.730906 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.731004 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.731015 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.731035 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.731047 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:04Z","lastTransitionTime":"2025-12-05T11:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.833212 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.833307 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.833332 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.833359 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.833380 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:04Z","lastTransitionTime":"2025-12-05T11:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.936316 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.936374 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.936391 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.936417 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:04 crc kubenswrapper[4763]: I1205 11:50:04.936429 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:04Z","lastTransitionTime":"2025-12-05T11:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.039609 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.039676 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.039695 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.039715 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.039730 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:05Z","lastTransitionTime":"2025-12-05T11:50:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.142674 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.142711 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.142723 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.142742 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.142754 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:05Z","lastTransitionTime":"2025-12-05T11:50:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.245075 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.245111 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.245139 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.245153 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.245161 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:05Z","lastTransitionTime":"2025-12-05T11:50:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.348018 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.348481 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.348643 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.348868 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.349065 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:05Z","lastTransitionTime":"2025-12-05T11:50:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.451481 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.451544 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.451561 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.451585 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.451598 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:05Z","lastTransitionTime":"2025-12-05T11:50:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.554370 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.554445 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.554463 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.554504 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.554542 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:05Z","lastTransitionTime":"2025-12-05T11:50:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.657922 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.658160 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.658293 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.658386 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.658462 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:05Z","lastTransitionTime":"2025-12-05T11:50:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.760811 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.761005 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.761020 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.761036 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.761048 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:05Z","lastTransitionTime":"2025-12-05T11:50:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.783903 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.784242 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.784441 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:50:05 crc kubenswrapper[4763]: E1205 11:50:05.784553 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:50:05 crc kubenswrapper[4763]: E1205 11:50:05.784400 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.784471 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:50:05 crc kubenswrapper[4763]: E1205 11:50:05.785211 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:50:05 crc kubenswrapper[4763]: E1205 11:50:05.785249 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.825535 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=68.825519446 podStartE2EDuration="1m8.825519446s" podCreationTimestamp="2025-12-05 11:48:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:50:05.825256044 +0000 UTC m=+90.317970777" watchObservedRunningTime="2025-12-05 11:50:05.825519446 +0000 UTC m=+90.318234169" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.864413 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.864451 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.864462 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.864478 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.864490 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:05Z","lastTransitionTime":"2025-12-05T11:50:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.876910 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podStartSLOduration=71.876890584 podStartE2EDuration="1m11.876890584s" podCreationTimestamp="2025-12-05 11:48:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:50:05.865250586 +0000 UTC m=+90.357965309" watchObservedRunningTime="2025-12-05 11:50:05.876890584 +0000 UTC m=+90.369605327" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.877088 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-gt7x4" podStartSLOduration=71.877082315 podStartE2EDuration="1m11.877082315s" podCreationTimestamp="2025-12-05 11:48:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:50:05.8764374 +0000 UTC m=+90.369152133" watchObservedRunningTime="2025-12-05 11:50:05.877082315 +0000 UTC m=+90.369797038" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.916382 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-92qpt" podStartSLOduration=70.916366331 podStartE2EDuration="1m10.916366331s" podCreationTimestamp="2025-12-05 11:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:50:05.897546989 +0000 UTC m=+90.390261722" watchObservedRunningTime="2025-12-05 11:50:05.916366331 +0000 UTC m=+90.409081064" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.950168 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-gxbp8" podStartSLOduration=71.950151996 podStartE2EDuration="1m11.950151996s" podCreationTimestamp="2025-12-05 11:48:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:50:05.948044671 +0000 UTC m=+90.440759394" watchObservedRunningTime="2025-12-05 11:50:05.950151996 +0000 UTC m=+90.442866719" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.967041 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.967070 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.967078 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.967091 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.967100 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:05Z","lastTransitionTime":"2025-12-05T11:50:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.984857 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=71.984834888 podStartE2EDuration="1m11.984834888s" podCreationTimestamp="2025-12-05 11:48:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:50:05.984184213 +0000 UTC m=+90.476898936" watchObservedRunningTime="2025-12-05 11:50:05.984834888 +0000 UTC m=+90.477549621" Dec 05 11:50:05 crc kubenswrapper[4763]: I1205 11:50:05.997820 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=40.997799196 podStartE2EDuration="40.997799196s" podCreationTimestamp="2025-12-05 11:49:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:50:05.996547537 +0000 UTC m=+90.489262270" watchObservedRunningTime="2025-12-05 11:50:05.997799196 +0000 UTC m=+90.490513919" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.069382 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.069414 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.069423 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.069434 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.069443 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:06Z","lastTransitionTime":"2025-12-05T11:50:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.083771 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=4.083737835 podStartE2EDuration="4.083737835s" podCreationTimestamp="2025-12-05 11:50:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:50:06.083340722 +0000 UTC m=+90.576055475" watchObservedRunningTime="2025-12-05 11:50:06.083737835 +0000 UTC m=+90.576452568" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.100698 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=69.100682723 podStartE2EDuration="1m9.100682723s" podCreationTimestamp="2025-12-05 11:48:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:50:06.099950277 +0000 UTC m=+90.592665030" watchObservedRunningTime="2025-12-05 11:50:06.100682723 +0000 UTC m=+90.593397466" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.126183 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-kwkp4" podStartSLOduration=72.126168145 podStartE2EDuration="1m12.126168145s" podCreationTimestamp="2025-12-05 11:48:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:50:06.115716346 +0000 UTC m=+90.608431069" watchObservedRunningTime="2025-12-05 11:50:06.126168145 +0000 UTC m=+90.618882868" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.126594 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptcsb" podStartSLOduration=71.126590948 podStartE2EDuration="1m11.126590948s" podCreationTimestamp="2025-12-05 11:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:50:06.12549923 +0000 UTC m=+90.618213983" watchObservedRunningTime="2025-12-05 11:50:06.126590948 +0000 UTC m=+90.619305671" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.172125 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.172154 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.172163 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.172177 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.172187 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:06Z","lastTransitionTime":"2025-12-05T11:50:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.274618 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.274657 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.274665 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.274680 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.274688 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:06Z","lastTransitionTime":"2025-12-05T11:50:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.377581 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.377658 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.377681 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.377711 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.377733 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:06Z","lastTransitionTime":"2025-12-05T11:50:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.480517 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.480582 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.480600 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.480627 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.480644 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:06Z","lastTransitionTime":"2025-12-05T11:50:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.583879 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.583946 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.583968 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.583997 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.584045 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:06Z","lastTransitionTime":"2025-12-05T11:50:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.687865 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.688318 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.688493 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.688687 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.688884 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:06Z","lastTransitionTime":"2025-12-05T11:50:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.791716 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.791795 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.791815 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.791838 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.791855 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:06Z","lastTransitionTime":"2025-12-05T11:50:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.894586 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.894621 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.894635 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.894655 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.894679 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:06Z","lastTransitionTime":"2025-12-05T11:50:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.997527 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.997575 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.997583 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.997597 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:06 crc kubenswrapper[4763]: I1205 11:50:06.997615 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:06Z","lastTransitionTime":"2025-12-05T11:50:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.101678 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.101822 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.101844 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.101872 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.101894 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:07Z","lastTransitionTime":"2025-12-05T11:50:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.204095 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.204145 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.204160 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.204180 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.204196 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:07Z","lastTransitionTime":"2025-12-05T11:50:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.307058 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.307147 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.307175 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.307208 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.307231 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:07Z","lastTransitionTime":"2025-12-05T11:50:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.410113 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.410211 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.410232 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.410267 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.410290 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:07Z","lastTransitionTime":"2025-12-05T11:50:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.513057 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.513096 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.513106 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.513119 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.513130 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:07Z","lastTransitionTime":"2025-12-05T11:50:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.616504 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.616874 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.617029 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.617161 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.617344 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:07Z","lastTransitionTime":"2025-12-05T11:50:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.720833 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.720870 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.720885 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.720919 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.720939 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:07Z","lastTransitionTime":"2025-12-05T11:50:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.783604 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:50:07 crc kubenswrapper[4763]: E1205 11:50:07.783789 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.783971 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:50:07 crc kubenswrapper[4763]: E1205 11:50:07.784051 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.784076 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:50:07 crc kubenswrapper[4763]: E1205 11:50:07.784151 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.784152 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:50:07 crc kubenswrapper[4763]: E1205 11:50:07.784361 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.824155 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.824251 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.824274 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.824300 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.824317 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:07Z","lastTransitionTime":"2025-12-05T11:50:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.927599 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.927666 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.927684 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.927711 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:07 crc kubenswrapper[4763]: I1205 11:50:07.927728 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:07Z","lastTransitionTime":"2025-12-05T11:50:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.031543 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.031605 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.031616 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.031636 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.031645 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:08Z","lastTransitionTime":"2025-12-05T11:50:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.135297 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.135350 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.135368 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.135391 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.135408 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:08Z","lastTransitionTime":"2025-12-05T11:50:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.238324 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.238399 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.238422 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.238451 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.238475 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:08Z","lastTransitionTime":"2025-12-05T11:50:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.340390 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.340437 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.340446 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.340463 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.340477 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:08Z","lastTransitionTime":"2025-12-05T11:50:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.443588 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.443676 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.443700 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.443732 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.443752 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:08Z","lastTransitionTime":"2025-12-05T11:50:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.546148 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.546231 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.546252 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.546279 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.546301 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:08Z","lastTransitionTime":"2025-12-05T11:50:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.649256 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.649353 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.649381 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.649411 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.649449 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:08Z","lastTransitionTime":"2025-12-05T11:50:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.752874 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.752934 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.752945 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.752970 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.752988 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:08Z","lastTransitionTime":"2025-12-05T11:50:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.855722 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.855838 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.855860 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.855892 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.855914 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:08Z","lastTransitionTime":"2025-12-05T11:50:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.959686 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.959835 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.959887 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.959927 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:50:08 crc kubenswrapper[4763]: I1205 11:50:08.959954 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:08Z","lastTransitionTime":"2025-12-05T11:50:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.063235 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.063300 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.063314 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.063339 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.063356 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:09Z","lastTransitionTime":"2025-12-05T11:50:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.166299 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.166393 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.166423 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.166454 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.166474 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:09Z","lastTransitionTime":"2025-12-05T11:50:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.269553 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.269637 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.269657 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.269694 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.269719 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:09Z","lastTransitionTime":"2025-12-05T11:50:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.372837 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.372902 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.372925 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.372954 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.372975 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:09Z","lastTransitionTime":"2025-12-05T11:50:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.476193 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.476272 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.476294 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.476327 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.476349 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:09Z","lastTransitionTime":"2025-12-05T11:50:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.579154 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.579217 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.579238 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.579263 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.579281 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:09Z","lastTransitionTime":"2025-12-05T11:50:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.682425 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.682478 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.682491 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.682510 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.682527 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:09Z","lastTransitionTime":"2025-12-05T11:50:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.685975 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.686105 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.686229 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.686342 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.686432 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:09Z","lastTransitionTime":"2025-12-05T11:50:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
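The setters.go:603 entries above embed the node's Ready condition as a JSON object with the fields type, status, lastHeartbeatTime, lastTransitionTime, reason, and message. A minimal Go sketch (illustrative only, not kubelet source; the struct mirrors just the fields visible in the log) that decodes one of these payloads:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// NodeCondition mirrors the fields visible in the setters.go:603 entries
// above (a subset of the upstream v1.NodeCondition type).
type NodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	// Condition payload copied from one of the log entries above.
	raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T11:50:08Z","lastTransitionTime":"2025-12-05T11:50:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`

	var c NodeCondition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		panic(err)
	}
	fmt.Printf("%s=%s (%s): %s\n", c.Type, c.Status, c.Reason, c.Message)
}
```

Running it prints Ready=False (KubeletNotReady) followed by the CNI message, matching the condition the kubelet keeps re-recording above.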
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.742919 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-ls9d9"]
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.743328 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ls9d9"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.746830 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.747463 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.747584 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.747860 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.783334 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.783462 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 11:50:09 crc kubenswrapper[4763]: E1205 11:50:09.783487 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.783602 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv"
Dec 05 11:50:09 crc kubenswrapper[4763]: E1205 11:50:09.783714 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 11:50:09 crc kubenswrapper[4763]: E1205 11:50:09.783818 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.783854 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 11:50:09 crc kubenswrapper[4763]: E1205 11:50:09.783987 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.834665 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/db7c77c9-eb28-4165-babd-f875364112aa-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-ls9d9\" (UID: \"db7c77c9-eb28-4165-babd-f875364112aa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ls9d9"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.834755 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/db7c77c9-eb28-4165-babd-f875364112aa-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-ls9d9\" (UID: \"db7c77c9-eb28-4165-babd-f875364112aa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ls9d9"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.835524 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db7c77c9-eb28-4165-babd-f875364112aa-service-ca\") pod \"cluster-version-operator-5c965bbfc6-ls9d9\" (UID: \"db7c77c9-eb28-4165-babd-f875364112aa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ls9d9"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.835666 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/db7c77c9-eb28-4165-babd-f875364112aa-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-ls9d9\" (UID: \"db7c77c9-eb28-4165-babd-f875364112aa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ls9d9"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.835711 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db7c77c9-eb28-4165-babd-f875364112aa-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-ls9d9\" (UID: \"db7c77c9-eb28-4165-babd-f875364112aa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ls9d9"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.936948 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/db7c77c9-eb28-4165-babd-f875364112aa-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-ls9d9\" (UID: \"db7c77c9-eb28-4165-babd-f875364112aa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ls9d9"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.937026 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db7c77c9-eb28-4165-babd-f875364112aa-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-ls9d9\" (UID: \"db7c77c9-eb28-4165-babd-f875364112aa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ls9d9"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.937087 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/db7c77c9-eb28-4165-babd-f875364112aa-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-ls9d9\" (UID: \"db7c77c9-eb28-4165-babd-f875364112aa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ls9d9"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.937122 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/db7c77c9-eb28-4165-babd-f875364112aa-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-ls9d9\" (UID: \"db7c77c9-eb28-4165-babd-f875364112aa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ls9d9"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.937149 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db7c77c9-eb28-4165-babd-f875364112aa-service-ca\") pod \"cluster-version-operator-5c965bbfc6-ls9d9\" (UID: \"db7c77c9-eb28-4165-babd-f875364112aa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ls9d9"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.937507 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/db7c77c9-eb28-4165-babd-f875364112aa-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-ls9d9\" (UID: \"db7c77c9-eb28-4165-babd-f875364112aa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ls9d9"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.937750 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/db7c77c9-eb28-4165-babd-f875364112aa-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-ls9d9\" (UID: \"db7c77c9-eb28-4165-babd-f875364112aa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ls9d9"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.938119 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db7c77c9-eb28-4165-babd-f875364112aa-service-ca\") pod \"cluster-version-operator-5c965bbfc6-ls9d9\" (UID: \"db7c77c9-eb28-4165-babd-f875364112aa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ls9d9"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.945654 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db7c77c9-eb28-4165-babd-f875364112aa-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-ls9d9\" (UID: \"db7c77c9-eb28-4165-babd-f875364112aa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ls9d9"
Dec 05 11:50:09 crc kubenswrapper[4763]: I1205 11:50:09.958648 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/db7c77c9-eb28-4165-babd-f875364112aa-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-ls9d9\" (UID: \"db7c77c9-eb28-4165-babd-f875364112aa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ls9d9"
Dec 05 11:50:10 crc kubenswrapper[4763]: I1205 11:50:10.059177 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ls9d9"
Dec 05 11:50:10 crc kubenswrapper[4763]: I1205 11:50:10.310136 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ls9d9" event={"ID":"db7c77c9-eb28-4165-babd-f875364112aa","Type":"ContainerStarted","Data":"538e3693ab27714ce9e5adfd8bd1aee67b5d13d41c502a34932173d32ea3a057"}
Dec 05 11:50:10 crc kubenswrapper[4763]: I1205 11:50:10.310182 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ls9d9" event={"ID":"db7c77c9-eb28-4165-babd-f875364112aa","Type":"ContainerStarted","Data":"0c602cd2e3425f5d0ea5de45dafeef412a6241eee2f280c4860ba8f191ee8fc8"}
Dec 05 11:50:11 crc kubenswrapper[4763]: I1205 11:50:11.784032 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 11:50:11 crc kubenswrapper[4763]: I1205 11:50:11.784136 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 11:50:11 crc kubenswrapper[4763]: I1205 11:50:11.784184 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 11:50:11 crc kubenswrapper[4763]: E1205 11:50:11.784292 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 11:50:11 crc kubenswrapper[4763]: E1205 11:50:11.784509 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 11:50:11 crc kubenswrapper[4763]: E1205 11:50:11.784576 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 11:50:11 crc kubenswrapper[4763]: I1205 11:50:11.784990 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv"
Dec 05 11:50:11 crc kubenswrapper[4763]: E1205 11:50:11.785211 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab"
Dec 05 11:50:13 crc kubenswrapper[4763]: I1205 11:50:13.783318 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 11:50:13 crc kubenswrapper[4763]: E1205 11:50:13.783950 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 11:50:13 crc kubenswrapper[4763]: I1205 11:50:13.783409 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 11:50:13 crc kubenswrapper[4763]: I1205 11:50:13.784076 4763 scope.go:117] "RemoveContainer" containerID="93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba"
Dec 05 11:50:13 crc kubenswrapper[4763]: E1205 11:50:13.784119 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 11:50:13 crc kubenswrapper[4763]: I1205 11:50:13.783724 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv"
Dec 05 11:50:13 crc kubenswrapper[4763]: E1205 11:50:13.784210 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xbr2p_openshift-ovn-kubernetes(b42a5472-7487-4146-87a1-b83999821399)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" podUID="b42a5472-7487-4146-87a1-b83999821399"
Dec 05 11:50:13 crc kubenswrapper[4763]: I1205 11:50:13.783389 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 11:50:13 crc kubenswrapper[4763]: E1205 11:50:13.784271 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab"
Dec 05 11:50:13 crc kubenswrapper[4763]: E1205 11:50:13.784406 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:50:14 crc kubenswrapper[4763]: I1205 11:50:14.184079 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a135c32b-38e4-43f6-bbb1-d1b8e42156ab-metrics-certs\") pod \"network-metrics-daemon-x45qv\" (UID: \"a135c32b-38e4-43f6-bbb1-d1b8e42156ab\") " pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:50:14 crc kubenswrapper[4763]: E1205 11:50:14.184283 4763 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 11:50:14 crc kubenswrapper[4763]: E1205 11:50:14.184406 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a135c32b-38e4-43f6-bbb1-d1b8e42156ab-metrics-certs podName:a135c32b-38e4-43f6-bbb1-d1b8e42156ab nodeName:}" failed. No retries permitted until 2025-12-05 11:51:18.184378819 +0000 UTC m=+162.677093582 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a135c32b-38e4-43f6-bbb1-d1b8e42156ab-metrics-certs") pod "network-metrics-daemon-x45qv" (UID: "a135c32b-38e4-43f6-bbb1-d1b8e42156ab") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 11:50:15 crc kubenswrapper[4763]: I1205 11:50:15.783732 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:50:15 crc kubenswrapper[4763]: E1205 11:50:15.783976 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:50:15 crc kubenswrapper[4763]: I1205 11:50:15.784245 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:50:15 crc kubenswrapper[4763]: I1205 11:50:15.784270 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:50:15 crc kubenswrapper[4763]: I1205 11:50:15.784345 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:50:15 crc kubenswrapper[4763]: E1205 11:50:15.786240 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:50:15 crc kubenswrapper[4763]: E1205 11:50:15.786389 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:50:15 crc kubenswrapper[4763]: E1205 11:50:15.786568 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:50:17 crc kubenswrapper[4763]: I1205 11:50:17.783678 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:50:17 crc kubenswrapper[4763]: E1205 11:50:17.783838 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:50:17 crc kubenswrapper[4763]: I1205 11:50:17.783920 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:50:17 crc kubenswrapper[4763]: I1205 11:50:17.783931 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:50:17 crc kubenswrapper[4763]: E1205 11:50:17.783976 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:50:17 crc kubenswrapper[4763]: E1205 11:50:17.784111 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:50:17 crc kubenswrapper[4763]: I1205 11:50:17.784378 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:50:17 crc kubenswrapper[4763]: E1205 11:50:17.784543 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:50:19 crc kubenswrapper[4763]: I1205 11:50:19.783194 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:50:19 crc kubenswrapper[4763]: I1205 11:50:19.783221 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:50:19 crc kubenswrapper[4763]: E1205 11:50:19.783345 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:50:19 crc kubenswrapper[4763]: I1205 11:50:19.783371 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:50:19 crc kubenswrapper[4763]: I1205 11:50:19.783418 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:50:19 crc kubenswrapper[4763]: E1205 11:50:19.783504 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:50:19 crc kubenswrapper[4763]: E1205 11:50:19.783588 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:50:19 crc kubenswrapper[4763]: E1205 11:50:19.783660 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:50:21 crc kubenswrapper[4763]: I1205 11:50:21.784084 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:50:21 crc kubenswrapper[4763]: I1205 11:50:21.784110 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:50:21 crc kubenswrapper[4763]: I1205 11:50:21.784186 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:50:21 crc kubenswrapper[4763]: I1205 11:50:21.784193 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:50:21 crc kubenswrapper[4763]: E1205 11:50:21.784389 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:50:21 crc kubenswrapper[4763]: E1205 11:50:21.784552 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:50:21 crc kubenswrapper[4763]: E1205 11:50:21.784965 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:50:21 crc kubenswrapper[4763]: E1205 11:50:21.785053 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:50:23 crc kubenswrapper[4763]: I1205 11:50:23.783821 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:50:23 crc kubenswrapper[4763]: I1205 11:50:23.784001 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:50:23 crc kubenswrapper[4763]: E1205 11:50:23.784050 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:50:23 crc kubenswrapper[4763]: I1205 11:50:23.784117 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:50:23 crc kubenswrapper[4763]: I1205 11:50:23.783832 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:50:23 crc kubenswrapper[4763]: E1205 11:50:23.784245 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:50:23 crc kubenswrapper[4763]: E1205 11:50:23.784336 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:50:23 crc kubenswrapper[4763]: E1205 11:50:23.784410 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:50:25 crc kubenswrapper[4763]: I1205 11:50:25.783799 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:50:25 crc kubenswrapper[4763]: I1205 11:50:25.783858 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:50:25 crc kubenswrapper[4763]: I1205 11:50:25.783897 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:50:25 crc kubenswrapper[4763]: E1205 11:50:25.785636 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:50:25 crc kubenswrapper[4763]: I1205 11:50:25.785711 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:50:25 crc kubenswrapper[4763]: E1205 11:50:25.785872 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:50:25 crc kubenswrapper[4763]: E1205 11:50:25.786137 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:50:25 crc kubenswrapper[4763]: E1205 11:50:25.786248 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:50:27 crc kubenswrapper[4763]: I1205 11:50:27.783661 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:50:27 crc kubenswrapper[4763]: E1205 11:50:27.784531 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:50:27 crc kubenswrapper[4763]: I1205 11:50:27.783886 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:50:27 crc kubenswrapper[4763]: I1205 11:50:27.783907 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:50:27 crc kubenswrapper[4763]: I1205 11:50:27.783672 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:50:27 crc kubenswrapper[4763]: E1205 11:50:27.784947 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:50:27 crc kubenswrapper[4763]: E1205 11:50:27.785006 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:50:27 crc kubenswrapper[4763]: E1205 11:50:27.785054 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:50:28 crc kubenswrapper[4763]: I1205 11:50:28.370078 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kwkp4_737ae453-c22e-41ea-a10e-7e8f1f165467/kube-multus/1.log" Dec 05 11:50:28 crc kubenswrapper[4763]: I1205 11:50:28.370707 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kwkp4_737ae453-c22e-41ea-a10e-7e8f1f165467/kube-multus/0.log" Dec 05 11:50:28 crc kubenswrapper[4763]: I1205 11:50:28.370980 4763 generic.go:334] "Generic (PLEG): container finished" podID="737ae453-c22e-41ea-a10e-7e8f1f165467" containerID="a6e4880a3acae535beea88e11057496d98b2ddd8d2161f75c52dd4527425a83d" exitCode=1 Dec 05 11:50:28 crc kubenswrapper[4763]: I1205 11:50:28.371048 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kwkp4" event={"ID":"737ae453-c22e-41ea-a10e-7e8f1f165467","Type":"ContainerDied","Data":"a6e4880a3acae535beea88e11057496d98b2ddd8d2161f75c52dd4527425a83d"} Dec 05 11:50:28 crc kubenswrapper[4763]: I1205 11:50:28.371265 4763 scope.go:117] "RemoveContainer" containerID="a3e15b77f46c346d83067726607bdf734df780fdf85659f04be9f0b28faa25fe" Dec 05 11:50:28 crc kubenswrapper[4763]: I1205 11:50:28.371781 4763 scope.go:117] "RemoveContainer" containerID="a6e4880a3acae535beea88e11057496d98b2ddd8d2161f75c52dd4527425a83d" Dec 05 11:50:28 crc kubenswrapper[4763]: E1205 11:50:28.372048 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-kwkp4_openshift-multus(737ae453-c22e-41ea-a10e-7e8f1f165467)\"" pod="openshift-multus/multus-kwkp4" podUID="737ae453-c22e-41ea-a10e-7e8f1f165467" Dec 05 11:50:28 crc kubenswrapper[4763]: I1205 11:50:28.390194 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ls9d9" podStartSLOduration=94.3901766 podStartE2EDuration="1m34.3901766s" podCreationTimestamp="2025-12-05 11:48:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:50:10.323638688 +0000 UTC m=+94.816353421" watchObservedRunningTime="2025-12-05 11:50:28.3901766 +0000 UTC m=+112.882891323" Dec 05 11:50:28 crc kubenswrapper[4763]: I1205 11:50:28.784930 4763 scope.go:117] "RemoveContainer" containerID="93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba" Dec 05 11:50:28 crc kubenswrapper[4763]: E1205 11:50:28.785194 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xbr2p_openshift-ovn-kubernetes(b42a5472-7487-4146-87a1-b83999821399)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" 
podUID="b42a5472-7487-4146-87a1-b83999821399" Dec 05 11:50:29 crc kubenswrapper[4763]: I1205 11:50:29.376301 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kwkp4_737ae453-c22e-41ea-a10e-7e8f1f165467/kube-multus/1.log" Dec 05 11:50:29 crc kubenswrapper[4763]: I1205 11:50:29.783687 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:50:29 crc kubenswrapper[4763]: I1205 11:50:29.783786 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:50:29 crc kubenswrapper[4763]: E1205 11:50:29.783847 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:50:29 crc kubenswrapper[4763]: E1205 11:50:29.783954 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:50:29 crc kubenswrapper[4763]: I1205 11:50:29.784033 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:50:29 crc kubenswrapper[4763]: E1205 11:50:29.784097 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:50:29 crc kubenswrapper[4763]: I1205 11:50:29.784116 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:50:29 crc kubenswrapper[4763]: E1205 11:50:29.784163 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:50:31 crc kubenswrapper[4763]: I1205 11:50:31.783960 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:50:31 crc kubenswrapper[4763]: I1205 11:50:31.784081 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:50:31 crc kubenswrapper[4763]: E1205 11:50:31.784167 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:50:31 crc kubenswrapper[4763]: I1205 11:50:31.784240 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:50:31 crc kubenswrapper[4763]: E1205 11:50:31.784348 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:50:31 crc kubenswrapper[4763]: E1205 11:50:31.784619 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:50:31 crc kubenswrapper[4763]: I1205 11:50:31.784680 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:50:31 crc kubenswrapper[4763]: E1205 11:50:31.784876 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:50:33 crc kubenswrapper[4763]: I1205 11:50:33.783696 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:50:33 crc kubenswrapper[4763]: I1205 11:50:33.783777 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:50:33 crc kubenswrapper[4763]: I1205 11:50:33.783861 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:50:33 crc kubenswrapper[4763]: E1205 11:50:33.783927 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:50:33 crc kubenswrapper[4763]: E1205 11:50:33.784086 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:50:33 crc kubenswrapper[4763]: E1205 11:50:33.784250 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:50:33 crc kubenswrapper[4763]: I1205 11:50:33.784279 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:50:33 crc kubenswrapper[4763]: E1205 11:50:33.784380 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:50:35 crc kubenswrapper[4763]: E1205 11:50:35.740868 4763 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 05 11:50:35 crc kubenswrapper[4763]: I1205 11:50:35.783790 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:50:35 crc kubenswrapper[4763]: I1205 11:50:35.783853 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:50:35 crc kubenswrapper[4763]: I1205 11:50:35.783806 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:50:35 crc kubenswrapper[4763]: I1205 11:50:35.784891 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:50:35 crc kubenswrapper[4763]: E1205 11:50:35.784879 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:50:35 crc kubenswrapper[4763]: E1205 11:50:35.785011 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:50:35 crc kubenswrapper[4763]: E1205 11:50:35.785094 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:50:35 crc kubenswrapper[4763]: E1205 11:50:35.785160 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:50:35 crc kubenswrapper[4763]: E1205 11:50:35.941863 4763 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 05 11:50:37 crc kubenswrapper[4763]: I1205 11:50:37.783894 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:50:37 crc kubenswrapper[4763]: I1205 11:50:37.783937 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:50:37 crc kubenswrapper[4763]: I1205 11:50:37.783999 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:50:37 crc kubenswrapper[4763]: I1205 11:50:37.784326 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:50:37 crc kubenswrapper[4763]: E1205 11:50:37.784855 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:50:37 crc kubenswrapper[4763]: E1205 11:50:37.785082 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:50:37 crc kubenswrapper[4763]: E1205 11:50:37.785180 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:50:37 crc kubenswrapper[4763]: E1205 11:50:37.785227 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:50:39 crc kubenswrapper[4763]: I1205 11:50:39.783920 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:50:39 crc kubenswrapper[4763]: I1205 11:50:39.783979 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:50:39 crc kubenswrapper[4763]: I1205 11:50:39.784031 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:50:39 crc kubenswrapper[4763]: I1205 11:50:39.784061 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:50:39 crc kubenswrapper[4763]: E1205 11:50:39.784085 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:50:39 crc kubenswrapper[4763]: E1205 11:50:39.784185 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:50:39 crc kubenswrapper[4763]: E1205 11:50:39.784265 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:50:39 crc kubenswrapper[4763]: E1205 11:50:39.784378 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:50:40 crc kubenswrapper[4763]: E1205 11:50:40.946686 4763 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 05 11:50:41 crc kubenswrapper[4763]: I1205 11:50:41.784005 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:50:41 crc kubenswrapper[4763]: I1205 11:50:41.784034 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:50:41 crc kubenswrapper[4763]: I1205 11:50:41.784087 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:50:41 crc kubenswrapper[4763]: I1205 11:50:41.784023 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:50:41 crc kubenswrapper[4763]: E1205 11:50:41.784155 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:50:41 crc kubenswrapper[4763]: E1205 11:50:41.784350 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:50:41 crc kubenswrapper[4763]: E1205 11:50:41.784487 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:50:41 crc kubenswrapper[4763]: E1205 11:50:41.784558 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:50:43 crc kubenswrapper[4763]: I1205 11:50:43.783371 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:50:43 crc kubenswrapper[4763]: I1205 11:50:43.783493 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:50:43 crc kubenswrapper[4763]: E1205 11:50:43.783703 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:50:43 crc kubenswrapper[4763]: I1205 11:50:43.783806 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:50:43 crc kubenswrapper[4763]: I1205 11:50:43.784316 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:50:43 crc kubenswrapper[4763]: E1205 11:50:43.784412 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:50:43 crc kubenswrapper[4763]: E1205 11:50:43.784593 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:50:43 crc kubenswrapper[4763]: I1205 11:50:43.785035 4763 scope.go:117] "RemoveContainer" containerID="a6e4880a3acae535beea88e11057496d98b2ddd8d2161f75c52dd4527425a83d" Dec 05 11:50:43 crc kubenswrapper[4763]: I1205 11:50:43.785081 4763 scope.go:117] "RemoveContainer" containerID="93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba" Dec 05 11:50:43 crc kubenswrapper[4763]: E1205 11:50:43.785422 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
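The two-second cycle above is the kubelet re-queueing the same four pods while the node's CNI configuration is still missing: sandbox creation is refused until a network plugin (here, ovnkube-node) writes a config file into /etc/kubernetes/cni/net.d/. As a rough sketch of what that readiness condition amounts to, the Go program below scans the directory named in the error for the file extensions the CNI library conventionally accepts (*.conf, *.conflist, *.json). This is an illustration of the check, not kubelet source code, and the extension list is an assumption here.

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// Directory named in the repeated kubelet error above.
	confDir := "/etc/kubernetes/cni/net.d"

	entries, err := os.ReadDir(confDir)
	if err != nil {
		fmt.Println("cannot read CNI conf dir:", err)
		return
	}

	// Collect candidate network configs; CNI conventionally loads
	// *.conf, *.conflist and *.json files (assumed here).
	var found []string
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			found = append(found, e.Name())
		}
	}

	if len(found) == 0 {
		// This is the state the log shows: NetworkReady=false.
		fmt.Println("no CNI configuration file found; network not ready")
		return
	}
	fmt.Println("CNI configs present:", found)
}
```

Once ovnkube-node writes its config, this check would start succeeding, which matches the transition visible in the entries that follow.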
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:50:44 crc kubenswrapper[4763]: I1205 11:50:44.431344 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kwkp4_737ae453-c22e-41ea-a10e-7e8f1f165467/kube-multus/1.log" Dec 05 11:50:44 crc kubenswrapper[4763]: I1205 11:50:44.431405 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kwkp4" event={"ID":"737ae453-c22e-41ea-a10e-7e8f1f165467","Type":"ContainerStarted","Data":"6c5a2cf91a9ab67900794000630b79f7786d85e8c1fe93bc6af0f40b8aca502e"} Dec 05 11:50:44 crc kubenswrapper[4763]: I1205 11:50:44.433987 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbr2p_b42a5472-7487-4146-87a1-b83999821399/ovnkube-controller/3.log" Dec 05 11:50:44 crc kubenswrapper[4763]: I1205 11:50:44.437734 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" event={"ID":"b42a5472-7487-4146-87a1-b83999821399","Type":"ContainerStarted","Data":"b3fc27cc1da481ddaa2ea4cfbd2bc5d4bcc918ba8e2ebab75436094c247f255b"} Dec 05 11:50:44 crc kubenswrapper[4763]: I1205 11:50:44.438480 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:50:44 crc kubenswrapper[4763]: I1205 11:50:44.496028 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" podStartSLOduration=109.495996581 podStartE2EDuration="1m49.495996581s" podCreationTimestamp="2025-12-05 11:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:50:44.494372779 +0000 UTC m=+128.987087502" watchObservedRunningTime="2025-12-05 11:50:44.495996581 +0000 UTC m=+128.988711384" Dec 05 11:50:44 crc kubenswrapper[4763]: I1205 11:50:44.764654 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-x45qv"] Dec 05 11:50:44 crc kubenswrapper[4763]: I1205 11:50:44.764825 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:50:44 crc kubenswrapper[4763]: E1205 11:50:44.764962 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:50:45 crc kubenswrapper[4763]: I1205 11:50:45.783179 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:50:45 crc kubenswrapper[4763]: I1205 11:50:45.783179 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:50:45 crc kubenswrapper[4763]: E1205 11:50:45.784580 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:50:45 crc kubenswrapper[4763]: I1205 11:50:45.784611 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:50:45 crc kubenswrapper[4763]: E1205 11:50:45.784853 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:50:45 crc kubenswrapper[4763]: E1205 11:50:45.784918 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:50:45 crc kubenswrapper[4763]: E1205 11:50:45.947389 4763 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 05 11:50:46 crc kubenswrapper[4763]: I1205 11:50:46.782942 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:50:46 crc kubenswrapper[4763]: E1205 11:50:46.783438 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:50:47 crc kubenswrapper[4763]: I1205 11:50:47.783244 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:50:47 crc kubenswrapper[4763]: I1205 11:50:47.783309 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:50:47 crc kubenswrapper[4763]: E1205 11:50:47.783515 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:50:47 crc kubenswrapper[4763]: I1205 11:50:47.783592 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:50:47 crc kubenswrapper[4763]: E1205 11:50:47.783807 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:50:47 crc kubenswrapper[4763]: E1205 11:50:47.783953 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:50:48 crc kubenswrapper[4763]: I1205 11:50:48.783415 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:50:48 crc kubenswrapper[4763]: E1205 11:50:48.783710 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:50:49 crc kubenswrapper[4763]: I1205 11:50:49.783807 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:50:49 crc kubenswrapper[4763]: I1205 11:50:49.783916 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:50:49 crc kubenswrapper[4763]: I1205 11:50:49.783995 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:50:49 crc kubenswrapper[4763]: E1205 11:50:49.783983 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 11:50:49 crc kubenswrapper[4763]: E1205 11:50:49.784101 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 11:50:49 crc kubenswrapper[4763]: E1205 11:50:49.784185 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 11:50:50 crc kubenswrapper[4763]: I1205 11:50:50.784014 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:50:50 crc kubenswrapper[4763]: E1205 11:50:50.784275 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x45qv" podUID="a135c32b-38e4-43f6-bbb1-d1b8e42156ab" Dec 05 11:50:51 crc kubenswrapper[4763]: I1205 11:50:51.783397 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:50:51 crc kubenswrapper[4763]: I1205 11:50:51.783498 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:50:51 crc kubenswrapper[4763]: I1205 11:50:51.783590 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:50:51 crc kubenswrapper[4763]: I1205 11:50:51.785654 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 05 11:50:51 crc kubenswrapper[4763]: I1205 11:50:51.785668 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 05 11:50:51 crc kubenswrapper[4763]: I1205 11:50:51.786254 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 05 11:50:51 crc kubenswrapper[4763]: I1205 11:50:51.787207 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 05 11:50:52 crc kubenswrapper[4763]: I1205 11:50:52.783147 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:50:52 crc kubenswrapper[4763]: I1205 11:50:52.785159 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 05 11:50:52 crc kubenswrapper[4763]: I1205 11:50:52.785290 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.556227 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.596615 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7h6vz"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.597005 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7h6vz" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.600680 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.600720 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-78sc9"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.601095 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.601098 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.601389 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.601995 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-78sc9" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.606822 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-j5ztd"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.607505 4763 util.go:30] "No sandbox for pod can be found. 
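One sanity check worth noting in the 11:50:44 burst above: podStartE2EDuration="1m49.495996581s" is exactly watchObservedRunningTime minus podCreationTimestamp (11:48:55 to 11:50:44.495996581). A few lines of Go reproduce the arithmetic from the logged values; the timestamps are copied verbatim from the entry, and the layout string is Go's default time.Time formatting.

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Go's default time.Time string format, which matches the log fields.
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

	// podCreationTimestamp and watchObservedRunningTime from the entry.
	created, err := time.Parse(layout, "2025-12-05 11:48:55 +0000 UTC")
	if err != nil {
		panic(err)
	}
	observed, err := time.Parse(layout, "2025-12-05 11:50:44.495996581 +0000 UTC")
	if err != nil {
		panic(err)
	}

	// Prints 1m49.495996581s, matching podStartE2EDuration in the log.
	fmt.Println(observed.Sub(created))
}
```

The zero-valued firstStartedPulling/lastFinishedPulling fields ("0001-01-01 00:00:00") simply mean no image pull was observed for this pod, which is why the pull window does not shorten the reported duration.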
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.608211 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.610296 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.611336 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96fb350e-8c37-4e8d-8233-8c3ecfac9935-serving-cert\") pod \"route-controller-manager-6576b87f9c-7h6vz\" (UID: \"96fb350e-8c37-4e8d-8233-8c3ecfac9935\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7h6vz"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.611424 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4hkr\" (UniqueName: \"kubernetes.io/projected/96fb350e-8c37-4e8d-8233-8c3ecfac9935-kube-api-access-k4hkr\") pod \"route-controller-manager-6576b87f9c-7h6vz\" (UID: \"96fb350e-8c37-4e8d-8233-8c3ecfac9935\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7h6vz"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.611497 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96fb350e-8c37-4e8d-8233-8c3ecfac9935-config\") pod \"route-controller-manager-6576b87f9c-7h6vz\" (UID: \"96fb350e-8c37-4e8d-8233-8c3ecfac9935\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7h6vz"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.611542 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/96fb350e-8c37-4e8d-8233-8c3ecfac9935-client-ca\") pod \"route-controller-manager-6576b87f9c-7h6vz\" (UID: \"96fb350e-8c37-4e8d-8233-8c3ecfac9935\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7h6vz"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.612673 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-c5hd7"]
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.613547 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-c5hd7"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.614024 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sptcn"]
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.614682 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sptcn"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.618695 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-mvwv6"]
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.619365 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-85ch6"]
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.619521 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.619609 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.619900 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-85ch6"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.620325 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mvwv6"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.622359 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.622825 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.623004 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.623201 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.623326 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Dec 05 11:51:00 crc kubenswrapper[4763]: W1205 11:51:00.623437 4763 reflector.go:561] object-"openshift-authentication-operator"/"authentication-operator-config": failed to list *v1.ConfigMap: configmaps "authentication-operator-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication-operator": no relationship found between node 'crc' and this object
Dec 05 11:51:00 crc kubenswrapper[4763]: E1205 11:51:00.623483 4763 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication-operator\"/\"authentication-operator-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"authentication-operator-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Dec 05 11:51:00 crc kubenswrapper[4763]: W1205 11:51:00.623706 4763 reflector.go:561] object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj": failed to list *v1.Secret: secrets "authentication-operator-dockercfg-mz9bj" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication-operator": no relationship found between node 'crc' and this object
Dec 05 11:51:00 crc kubenswrapper[4763]: E1205 11:51:00.623738 4763 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication-operator\"/\"authentication-operator-dockercfg-mz9bj\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"authentication-operator-dockercfg-mz9bj\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Dec 05 11:51:00 crc kubenswrapper[4763]: W1205 11:51:00.623805 4763 reflector.go:561] object-"openshift-authentication-operator"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication-operator": no relationship found between node 'crc' and this object
Dec 05 11:51:00 crc kubenswrapper[4763]: E1205 11:51:00.623822 4763 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication-operator\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Dec 05 11:51:00 crc kubenswrapper[4763]: W1205 11:51:00.623865 4763 reflector.go:561] object-"openshift-authentication-operator"/"trusted-ca-bundle": failed to list *v1.ConfigMap: configmaps "trusted-ca-bundle" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication-operator": no relationship found between node 'crc' and this object
Dec 05 11:51:00 crc kubenswrapper[4763]: E1205 11:51:00.623880 4763 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication-operator\"/\"trusted-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"trusted-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Dec 05 11:51:00 crc kubenswrapper[4763]: W1205 11:51:00.623923 4763 reflector.go:561] object-"openshift-authentication-operator"/"service-ca-bundle": failed to list *v1.ConfigMap: configmaps "service-ca-bundle" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication-operator": no relationship found between node 'crc' and this object
Dec 05 11:51:00 crc kubenswrapper[4763]: E1205 11:51:00.623936 4763 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication-operator\"/\"service-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"service-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.625025 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.625178 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.625274 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.626672 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xh4rn"]
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.627337 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xh4rn"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.628154 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-djzqs"]
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.628795 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-djzqs"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.631857 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-fbmpx"]
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.632668 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wndr8"]
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.633113 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xnp9g"]
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.633655 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xnp9g"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.634312 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-fbmpx"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.634789 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wndr8"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.635842 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-rsm7h"]
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.636383 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-cnvbz"]
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.637252 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cnvbz"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.637419 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rsm7h"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.639420 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fdcnn"]
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.639873 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.639897 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-fdcnn"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.640286 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.640458 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.640608 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.640694 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.640791 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.640855 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.640940 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.640968 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.641052 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.641060 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.641163 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.641186 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.641334 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.641338 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.641373 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.641432 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Dec 05 11:51:00 crc kubenswrapper[4763]: W1205 11:51:00.641507 4763 reflector.go:561] object-"openshift-authentication-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication-operator": no relationship found between node 'crc' and this object
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.641533 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Dec 05 11:51:00 crc kubenswrapper[4763]: E1205 11:51:00.641547 4763 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication-operator\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.643875 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dpf4b"]
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.655181 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kp92r"]
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.656895 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-spmwg"]
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.657584 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.659444 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kp92r"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.678547 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.678693 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.679381 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-hv45j"]
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.679720 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-spmwg"
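The reflector.go:561/reflector.go:158 warning-error pairs above are the API server's node authorizer at work rather than a misconfiguration: a kubelet is only permitted to read a Secret or ConfigMap once a pod bound to its node references it, so the list/watch of the openshift-authentication-operator objects fails with "no relationship found between node 'crc' and this object" until the just-added pod finishes registering. The client-go sketch below shows the kind of List call involved; it is illustrative only (the kubeconfig path is an assumed node-credential location, not taken from this log), and run with node credentials it would be rejected by the same check.

```go
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumed location of the node's kubeconfig; adjust for your host.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/var/lib/kubelet/kubeconfig")
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	// Scoped list of a single ConfigMap, comparable to what a reflector issues.
	_, err = cs.CoreV1().ConfigMaps("openshift-authentication-operator").List(
		context.TODO(),
		metav1.ListOptions{FieldSelector: "metadata.name=authentication-operator-config"},
	)
	// With node credentials this prints a Forbidden error while the node
	// authorizer sees no pod-to-object relationship for node "crc".
	fmt.Println("list result:", err)
}
```

Consistent with that reading, the same objects show up as successful "Caches populated" entries moments later, once the pods are registered against the node.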
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.679743 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.680221 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.680268 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.680354 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.680415 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.680483 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.704260 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-hv45j"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.705072 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.705937 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.706917 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.706937 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.707065 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.707862 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.708246 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.708427 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.708932 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.709913 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.710011 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.710197 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.710301 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.710397 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.710433 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.710502 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.710558 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.710581 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.710665 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.710695 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.710751 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.710772 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.710846 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.710919 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.710999 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.711102 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.711220 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.711646 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.711864 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z2mgd"]
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.712494 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z2mgd"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.713373 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/641223ac-c0c5-43cd-83cf-feaef76d52e6-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-xnp9g\" (UID: \"641223ac-c0c5-43cd-83cf-feaef76d52e6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xnp9g"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.713408 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4ab8944-246e-407f-9ba0-78456103e6f4-config\") pod \"kube-controller-manager-operator-78b949d7b-kp92r\" (UID: \"c4ab8944-246e-407f-9ba0-78456103e6f4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kp92r"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.713428 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/4dc83eeb-0330-445a-a742-7c3537517f8d-machine-approver-tls\") pod \"machine-approver-56656f9798-mvwv6\" (UID: \"4dc83eeb-0330-445a-a742-7c3537517f8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mvwv6"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.713448 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.713467 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/bc6697b7-ff52-448b-9b29-e53eb649646d-audit\") pod \"apiserver-76f77b778f-78sc9\" (UID: \"bc6697b7-ff52-448b-9b29-e53eb649646d\") " pod="openshift-apiserver/apiserver-76f77b778f-78sc9"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.713486 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80f84439-4d54-461d-9522-5bca4858f5d4-serving-cert\") pod \"authentication-operator-69f744f599-c5hd7\" (UID: \"80f84439-4d54-461d-9522-5bca4858f5d4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-c5hd7"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.713503 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e57f38fd-b06b-447e-ad03-2a6fb918470b-console-serving-cert\") pod \"console-f9d7485db-rsm7h\" (UID: \"e57f38fd-b06b-447e-ad03-2a6fb918470b\") " pod="openshift-console/console-f9d7485db-rsm7h"
Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.713519 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ad8fb54-ab2a-423f-90ea-afc17b937e34-config\") pod \"etcd-operator-b45778765-fdcnn\" (UID: \"5ad8fb54-ab2a-423f-90ea-afc17b937e34\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fdcnn"
\"5ad8fb54-ab2a-423f-90ea-afc17b937e34\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fdcnn" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.713535 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bc6697b7-ff52-448b-9b29-e53eb649646d-audit-dir\") pod \"apiserver-76f77b778f-78sc9\" (UID: \"bc6697b7-ff52-448b-9b29-e53eb649646d\") " pod="openshift-apiserver/apiserver-76f77b778f-78sc9" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.713554 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/641223ac-c0c5-43cd-83cf-feaef76d52e6-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-xnp9g\" (UID: \"641223ac-c0c5-43cd-83cf-feaef76d52e6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xnp9g" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.713059 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pbkjb"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.717329 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ww8jh"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.717687 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ww8jh" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.718386 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pbkjb" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.719276 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.722782 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.723818 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.723983 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.724089 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.724203 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.724341 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.724435 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.724540 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.724638 4763 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.726866 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96fb350e-8c37-4e8d-8233-8c3ecfac9935-serving-cert\") pod \"route-controller-manager-6576b87f9c-7h6vz\" (UID: \"96fb350e-8c37-4e8d-8233-8c3ecfac9935\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7h6vz" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.727048 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96fb350e-8c37-4e8d-8233-8c3ecfac9935-serving-cert\") pod \"route-controller-manager-6576b87f9c-7h6vz\" (UID: \"96fb350e-8c37-4e8d-8233-8c3ecfac9935\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7h6vz" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.727122 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb8j9\" (UniqueName: \"kubernetes.io/projected/4dc83eeb-0330-445a-a742-7c3537517f8d-kube-api-access-cb8j9\") pod \"machine-approver-56656f9798-mvwv6\" (UID: \"4dc83eeb-0330-445a-a742-7c3537517f8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mvwv6" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.727145 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5ad8fb54-ab2a-423f-90ea-afc17b937e34-etcd-ca\") pod \"etcd-operator-b45778765-fdcnn\" (UID: \"5ad8fb54-ab2a-423f-90ea-afc17b937e34\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fdcnn" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.727177 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5ad8fb54-ab2a-423f-90ea-afc17b937e34-etcd-client\") pod \"etcd-operator-b45778765-fdcnn\" (UID: \"5ad8fb54-ab2a-423f-90ea-afc17b937e34\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fdcnn" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.727196 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80f84439-4d54-461d-9522-5bca4858f5d4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-c5hd7\" (UID: \"80f84439-4d54-461d-9522-5bca4858f5d4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-c5hd7" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.727214 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad12f838-4a9f-4b3d-9a33-1ca5bd556bb7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-spmwg\" (UID: \"ad12f838-4a9f-4b3d-9a33-1ca5bd556bb7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-spmwg" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.727232 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5ba5666-f89c-4a31-90a2-57654afd4ff8-trusted-ca\") pod \"console-operator-58897d9998-fbmpx\" (UID: \"c5ba5666-f89c-4a31-90a2-57654afd4ff8\") " 
pod="openshift-console-operator/console-operator-58897d9998-fbmpx" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.727353 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/84047603-03db-454f-ad69-7dee76bd4e0a-audit-dir\") pod \"apiserver-7bbb656c7d-djzqs\" (UID: \"84047603-03db-454f-ad69-7dee76bd4e0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-djzqs" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.727405 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw6hp\" (UniqueName: \"kubernetes.io/projected/80f84439-4d54-461d-9522-5bca4858f5d4-kube-api-access-pw6hp\") pod \"authentication-operator-69f744f599-c5hd7\" (UID: \"80f84439-4d54-461d-9522-5bca4858f5d4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-c5hd7" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.727435 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5ba5666-f89c-4a31-90a2-57654afd4ff8-serving-cert\") pod \"console-operator-58897d9998-fbmpx\" (UID: \"c5ba5666-f89c-4a31-90a2-57654afd4ff8\") " pod="openshift-console-operator/console-operator-58897d9998-fbmpx" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.727660 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.727686 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0d7dd240-717f-4422-a4db-b38274db085e-metrics-tls\") pod \"dns-operator-744455d44c-85ch6\" (UID: \"0d7dd240-717f-4422-a4db-b38274db085e\") " pod="openshift-dns-operator/dns-operator-744455d44c-85ch6" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.727778 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/84047603-03db-454f-ad69-7dee76bd4e0a-etcd-client\") pod \"apiserver-7bbb656c7d-djzqs\" (UID: \"84047603-03db-454f-ad69-7dee76bd4e0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-djzqs" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.727802 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bc6697b7-ff52-448b-9b29-e53eb649646d-etcd-serving-ca\") pod \"apiserver-76f77b778f-78sc9\" (UID: \"bc6697b7-ff52-448b-9b29-e53eb649646d\") " pod="openshift-apiserver/apiserver-76f77b778f-78sc9" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.727842 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/84047603-03db-454f-ad69-7dee76bd4e0a-encryption-config\") pod \"apiserver-7bbb656c7d-djzqs\" (UID: \"84047603-03db-454f-ad69-7dee76bd4e0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-djzqs" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 
11:51:00.727866 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5c0a5a7-2852-4c58-8093-0d58c9354ae4-config\") pod \"openshift-apiserver-operator-796bbdcf4f-xh4rn\" (UID: \"f5c0a5a7-2852-4c58-8093-0d58c9354ae4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xh4rn" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.727979 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80f84439-4d54-461d-9522-5bca4858f5d4-config\") pod \"authentication-operator-69f744f599-c5hd7\" (UID: \"80f84439-4d54-461d-9522-5bca4858f5d4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-c5hd7" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.728031 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80f84439-4d54-461d-9522-5bca4858f5d4-service-ca-bundle\") pod \"authentication-operator-69f744f599-c5hd7\" (UID: \"80f84439-4d54-461d-9522-5bca4858f5d4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-c5hd7" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.728063 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/1c4ac72f-1389-403a-8cf1-567e4f6ac225-available-featuregates\") pod \"openshift-config-operator-7777fb866f-cnvbz\" (UID: \"1c4ac72f-1389-403a-8cf1-567e4f6ac225\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cnvbz" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.728094 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/84047603-03db-454f-ad69-7dee76bd4e0a-audit-policies\") pod \"apiserver-7bbb656c7d-djzqs\" (UID: \"84047603-03db-454f-ad69-7dee76bd4e0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-djzqs" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.728109 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84047603-03db-454f-ad69-7dee76bd4e0a-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-djzqs\" (UID: \"84047603-03db-454f-ad69-7dee76bd4e0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-djzqs" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.728130 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-874st\" (UniqueName: \"kubernetes.io/projected/84047603-03db-454f-ad69-7dee76bd4e0a-kube-api-access-874st\") pod \"apiserver-7bbb656c7d-djzqs\" (UID: \"84047603-03db-454f-ad69-7dee76bd4e0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-djzqs" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.728157 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb5xf\" (UniqueName: \"kubernetes.io/projected/f5c0a5a7-2852-4c58-8093-0d58c9354ae4-kube-api-access-gb5xf\") pod \"openshift-apiserver-operator-796bbdcf4f-xh4rn\" (UID: \"f5c0a5a7-2852-4c58-8093-0d58c9354ae4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xh4rn" Dec 05 11:51:00 crc 
kubenswrapper[4763]: I1205 11:51:00.728200 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0c9b5acf-ef6a-4bdd-ae32-582a80d711b5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-j5ztd\" (UID: \"0c9b5acf-ef6a-4bdd-ae32-582a80d711b5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-j5ztd" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.728220 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4dc83eeb-0330-445a-a742-7c3537517f8d-auth-proxy-config\") pod \"machine-approver-56656f9798-mvwv6\" (UID: \"4dc83eeb-0330-445a-a742-7c3537517f8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mvwv6" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.728239 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbqhj\" (UniqueName: \"kubernetes.io/projected/5ad8fb54-ab2a-423f-90ea-afc17b937e34-kube-api-access-tbqhj\") pod \"etcd-operator-b45778765-fdcnn\" (UID: \"5ad8fb54-ab2a-423f-90ea-afc17b937e34\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fdcnn" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.728284 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.728303 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc6697b7-ff52-448b-9b29-e53eb649646d-serving-cert\") pod \"apiserver-76f77b778f-78sc9\" (UID: \"bc6697b7-ff52-448b-9b29-e53eb649646d\") " pod="openshift-apiserver/apiserver-76f77b778f-78sc9" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.728342 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0c9b5acf-ef6a-4bdd-ae32-582a80d711b5-images\") pod \"machine-api-operator-5694c8668f-j5ztd\" (UID: \"0c9b5acf-ef6a-4bdd-ae32-582a80d711b5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-j5ztd" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.728385 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88-config\") pod \"controller-manager-879f6c89f-sptcn\" (UID: \"ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sptcn" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.728426 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-audit-dir\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.728451 4763 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.728470 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc6697b7-ff52-448b-9b29-e53eb649646d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-78sc9\" (UID: \"bc6697b7-ff52-448b-9b29-e53eb649646d\") " pod="openshift-apiserver/apiserver-76f77b778f-78sc9" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.728492 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nsjc\" (UniqueName: \"kubernetes.io/projected/641223ac-c0c5-43cd-83cf-feaef76d52e6-kube-api-access-2nsjc\") pod \"openshift-controller-manager-operator-756b6f6bc6-xnp9g\" (UID: \"641223ac-c0c5-43cd-83cf-feaef76d52e6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xnp9g" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.728716 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/96fb350e-8c37-4e8d-8233-8c3ecfac9935-client-ca\") pod \"route-controller-manager-6576b87f9c-7h6vz\" (UID: \"96fb350e-8c37-4e8d-8233-8c3ecfac9935\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7h6vz" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.728754 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.728798 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trr7c\" (UniqueName: \"kubernetes.io/projected/0c9b5acf-ef6a-4bdd-ae32-582a80d711b5-kube-api-access-trr7c\") pod \"machine-api-operator-5694c8668f-j5ztd\" (UID: \"0c9b5acf-ef6a-4bdd-ae32-582a80d711b5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-j5ztd" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.728838 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e57f38fd-b06b-447e-ad03-2a6fb918470b-oauth-serving-cert\") pod \"console-f9d7485db-rsm7h\" (UID: \"e57f38fd-b06b-447e-ad03-2a6fb918470b\") " pod="openshift-console/console-f9d7485db-rsm7h" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.728858 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-audit-policies\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: 
I1205 11:51:00.728878 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bc6697b7-ff52-448b-9b29-e53eb649646d-node-pullsecrets\") pod \"apiserver-76f77b778f-78sc9\" (UID: \"bc6697b7-ff52-448b-9b29-e53eb649646d\") " pod="openshift-apiserver/apiserver-76f77b778f-78sc9" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.728897 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4ab8944-246e-407f-9ba0-78456103e6f4-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-kp92r\" (UID: \"c4ab8944-246e-407f-9ba0-78456103e6f4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kp92r" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.728913 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-sptcn\" (UID: \"ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sptcn" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.729048 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c4ac72f-1389-403a-8cf1-567e4f6ac225-serving-cert\") pod \"openshift-config-operator-7777fb866f-cnvbz\" (UID: \"1c4ac72f-1389-403a-8cf1-567e4f6ac225\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cnvbz" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.729074 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.729095 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqc8v\" (UniqueName: \"kubernetes.io/projected/1c4ac72f-1389-403a-8cf1-567e4f6ac225-kube-api-access-wqc8v\") pod \"openshift-config-operator-7777fb866f-cnvbz\" (UID: \"1c4ac72f-1389-403a-8cf1-567e4f6ac225\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cnvbz" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.729130 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/bc6697b7-ff52-448b-9b29-e53eb649646d-image-import-ca\") pod \"apiserver-76f77b778f-78sc9\" (UID: \"bc6697b7-ff52-448b-9b29-e53eb649646d\") " pod="openshift-apiserver/apiserver-76f77b778f-78sc9" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.729165 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf2c5\" (UniqueName: \"kubernetes.io/projected/ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88-kube-api-access-bf2c5\") pod \"controller-manager-879f6c89f-sptcn\" (UID: \"ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sptcn" Dec 05 11:51:00 crc 
kubenswrapper[4763]: I1205 11:51:00.729196 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e57f38fd-b06b-447e-ad03-2a6fb918470b-trusted-ca-bundle\") pod \"console-f9d7485db-rsm7h\" (UID: \"e57f38fd-b06b-447e-ad03-2a6fb918470b\") " pod="openshift-console/console-f9d7485db-rsm7h" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.729217 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad12f838-4a9f-4b3d-9a33-1ca5bd556bb7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-spmwg\" (UID: \"ad12f838-4a9f-4b3d-9a33-1ca5bd556bb7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-spmwg" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.729249 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ad8fb54-ab2a-423f-90ea-afc17b937e34-serving-cert\") pod \"etcd-operator-b45778765-fdcnn\" (UID: \"5ad8fb54-ab2a-423f-90ea-afc17b937e34\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fdcnn" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.729297 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.729470 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.729517 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4hkr\" (UniqueName: \"kubernetes.io/projected/96fb350e-8c37-4e8d-8233-8c3ecfac9935-kube-api-access-k4hkr\") pod \"route-controller-manager-6576b87f9c-7h6vz\" (UID: \"96fb350e-8c37-4e8d-8233-8c3ecfac9935\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7h6vz" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.729534 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/96fb350e-8c37-4e8d-8233-8c3ecfac9935-client-ca\") pod \"route-controller-manager-6576b87f9c-7h6vz\" (UID: \"96fb350e-8c37-4e8d-8233-8c3ecfac9935\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7h6vz" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.729536 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e57f38fd-b06b-447e-ad03-2a6fb918470b-service-ca\") pod \"console-f9d7485db-rsm7h\" (UID: \"e57f38fd-b06b-447e-ad03-2a6fb918470b\") " pod="openshift-console/console-f9d7485db-rsm7h" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.729695 4763 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5ba5666-f89c-4a31-90a2-57654afd4ff8-config\") pod \"console-operator-58897d9998-fbmpx\" (UID: \"c5ba5666-f89c-4a31-90a2-57654afd4ff8\") " pod="openshift-console-operator/console-operator-58897d9998-fbmpx" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.729789 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc8vc\" (UniqueName: \"kubernetes.io/projected/bc6697b7-ff52-448b-9b29-e53eb649646d-kube-api-access-vc8vc\") pod \"apiserver-76f77b778f-78sc9\" (UID: \"bc6697b7-ff52-448b-9b29-e53eb649646d\") " pod="openshift-apiserver/apiserver-76f77b778f-78sc9" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.729852 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e57f38fd-b06b-447e-ad03-2a6fb918470b-console-oauth-config\") pod \"console-f9d7485db-rsm7h\" (UID: \"e57f38fd-b06b-447e-ad03-2a6fb918470b\") " pod="openshift-console/console-f9d7485db-rsm7h" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.729931 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84047603-03db-454f-ad69-7dee76bd4e0a-serving-cert\") pod \"apiserver-7bbb656c7d-djzqs\" (UID: \"84047603-03db-454f-ad69-7dee76bd4e0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-djzqs" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.729947 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qqrj\" (UniqueName: \"kubernetes.io/projected/e57f38fd-b06b-447e-ad03-2a6fb918470b-kube-api-access-7qqrj\") pod \"console-f9d7485db-rsm7h\" (UID: \"e57f38fd-b06b-447e-ad03-2a6fb918470b\") " pod="openshift-console/console-f9d7485db-rsm7h" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.729971 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96fb350e-8c37-4e8d-8233-8c3ecfac9935-config\") pod \"route-controller-manager-6576b87f9c-7h6vz\" (UID: \"96fb350e-8c37-4e8d-8233-8c3ecfac9935\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7h6vz" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.730002 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/84047603-03db-454f-ad69-7dee76bd4e0a-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-djzqs\" (UID: \"84047603-03db-454f-ad69-7dee76bd4e0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-djzqs" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.730090 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5ad8fb54-ab2a-423f-90ea-afc17b937e34-etcd-service-ca\") pod \"etcd-operator-b45778765-fdcnn\" (UID: \"5ad8fb54-ab2a-423f-90ea-afc17b937e34\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fdcnn" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.730118 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/bc6697b7-ff52-448b-9b29-e53eb649646d-etcd-client\") pod \"apiserver-76f77b778f-78sc9\" (UID: \"bc6697b7-ff52-448b-9b29-e53eb649646d\") " pod="openshift-apiserver/apiserver-76f77b778f-78sc9" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.730139 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88-client-ca\") pod \"controller-manager-879f6c89f-sptcn\" (UID: \"ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sptcn" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.730248 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5c0a5a7-2852-4c58-8093-0d58c9354ae4-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-xh4rn\" (UID: \"f5c0a5a7-2852-4c58-8093-0d58c9354ae4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xh4rn" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.730277 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg798\" (UniqueName: \"kubernetes.io/projected/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-kube-api-access-jg798\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.730299 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88-serving-cert\") pod \"controller-manager-879f6c89f-sptcn\" (UID: \"ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sptcn" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.730323 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.730349 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc6697b7-ff52-448b-9b29-e53eb649646d-config\") pod \"apiserver-76f77b778f-78sc9\" (UID: \"bc6697b7-ff52-448b-9b29-e53eb649646d\") " pod="openshift-apiserver/apiserver-76f77b778f-78sc9" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.730380 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c4ab8944-246e-407f-9ba0-78456103e6f4-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-kp92r\" (UID: \"c4ab8944-246e-407f-9ba0-78456103e6f4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kp92r" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.730406 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg5b2\" (UniqueName: 
\"kubernetes.io/projected/0d7dd240-717f-4422-a4db-b38274db085e-kube-api-access-bg5b2\") pod \"dns-operator-744455d44c-85ch6\" (UID: \"0d7dd240-717f-4422-a4db-b38274db085e\") " pod="openshift-dns-operator/dns-operator-744455d44c-85ch6" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.730439 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e57f38fd-b06b-447e-ad03-2a6fb918470b-console-config\") pod \"console-f9d7485db-rsm7h\" (UID: \"e57f38fd-b06b-447e-ad03-2a6fb918470b\") " pod="openshift-console/console-f9d7485db-rsm7h" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.730456 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.730480 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad12f838-4a9f-4b3d-9a33-1ca5bd556bb7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-spmwg\" (UID: \"ad12f838-4a9f-4b3d-9a33-1ca5bd556bb7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-spmwg" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.730511 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dc83eeb-0330-445a-a742-7c3537517f8d-config\") pod \"machine-approver-56656f9798-mvwv6\" (UID: \"4dc83eeb-0330-445a-a742-7c3537517f8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mvwv6" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.730530 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.730554 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bc6697b7-ff52-448b-9b29-e53eb649646d-encryption-config\") pod \"apiserver-76f77b778f-78sc9\" (UID: \"bc6697b7-ff52-448b-9b29-e53eb649646d\") " pod="openshift-apiserver/apiserver-76f77b778f-78sc9" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.730580 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw6c7\" (UniqueName: \"kubernetes.io/projected/c5ba5666-f89c-4a31-90a2-57654afd4ff8-kube-api-access-hw6c7\") pod \"console-operator-58897d9998-fbmpx\" (UID: \"c5ba5666-f89c-4a31-90a2-57654afd4ff8\") " pod="openshift-console-operator/console-operator-58897d9998-fbmpx" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.730608 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0c9b5acf-ef6a-4bdd-ae32-582a80d711b5-config\") pod \"machine-api-operator-5694c8668f-j5ztd\" (UID: \"0c9b5acf-ef6a-4bdd-ae32-582a80d711b5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-j5ztd" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.730900 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96fb350e-8c37-4e8d-8233-8c3ecfac9935-config\") pod \"route-controller-manager-6576b87f9c-7h6vz\" (UID: \"96fb350e-8c37-4e8d-8233-8c3ecfac9935\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7h6vz" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.738781 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.739132 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.739442 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.740728 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.741159 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.741428 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.741664 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.741869 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.742127 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.746819 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.747031 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.747304 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.749571 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-fvw7g"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.750322 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zssnz"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.750794 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 05 11:51:00 crc 
kubenswrapper[4763]: I1205 11:51:00.751104 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-46l22"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.751357 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fvw7g" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.751860 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qsr7t"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.752393 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsr7t" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.752515 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zssnz" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.752622 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-46l22" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.752703 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4lwtd"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.753257 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4lwtd" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.753887 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7gb2n"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.761170 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.762673 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-f52fv"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.764581 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-f52fv" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.765448 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-7gb2n" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.770652 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.784992 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.788337 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hjvs8"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.788843 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415585-hp78p"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.789143 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vzzqs"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.789609 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vzzqs" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.789897 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hjvs8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.790043 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415585-hp78p" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.793917 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.793927 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.794442 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.795495 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.796115 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z52fw"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.796893 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z52fw" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.797532 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wmpg"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.798696 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.799725 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wkvgk"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.800549 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-nwf6f"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.800665 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wmpg" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.801373 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wkvgk" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.801651 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-rf8kp"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.801833 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-nwf6f" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.802438 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-rf8kp" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.804366 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xh4rn"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.804368 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.805608 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7h6vz"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.806690 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-c5hd7"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.807694 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-j5ztd"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.809087 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-78sc9"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.811909 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-85ch6"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.812266 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sptcn"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.813454 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-etcd-operator/etcd-operator-b45778765-fdcnn"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.814599 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vjk5l"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.815355 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vjk5l" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.815613 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dpf4b"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.816970 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xnp9g"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.817904 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ww8jh"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.818965 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-djzqs"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.820065 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wndr8"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.821292 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-7dzw7"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.822200 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7dzw7" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.824981 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.825181 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-knjp7"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.826776 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-46l22"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.826915 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-knjp7" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.828005 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qsr7t"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.829055 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pbkjb"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.830330 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4lwtd"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.831255 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dc83eeb-0330-445a-a742-7c3537517f8d-config\") pod \"machine-approver-56656f9798-mvwv6\" (UID: \"4dc83eeb-0330-445a-a742-7c3537517f8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mvwv6" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.831288 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.831313 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bc6697b7-ff52-448b-9b29-e53eb649646d-encryption-config\") pod \"apiserver-76f77b778f-78sc9\" (UID: \"bc6697b7-ff52-448b-9b29-e53eb649646d\") " pod="openshift-apiserver/apiserver-76f77b778f-78sc9" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.831333 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw6c7\" (UniqueName: \"kubernetes.io/projected/c5ba5666-f89c-4a31-90a2-57654afd4ff8-kube-api-access-hw6c7\") pod \"console-operator-58897d9998-fbmpx\" (UID: \"c5ba5666-f89c-4a31-90a2-57654afd4ff8\") " pod="openshift-console-operator/console-operator-58897d9998-fbmpx" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.831352 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c9b5acf-ef6a-4bdd-ae32-582a80d711b5-config\") pod \"machine-api-operator-5694c8668f-j5ztd\" (UID: \"0c9b5acf-ef6a-4bdd-ae32-582a80d711b5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-j5ztd" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.831384 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/641223ac-c0c5-43cd-83cf-feaef76d52e6-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-xnp9g\" (UID: \"641223ac-c0c5-43cd-83cf-feaef76d52e6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xnp9g" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.831405 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4ab8944-246e-407f-9ba0-78456103e6f4-config\") pod \"kube-controller-manager-operator-78b949d7b-kp92r\" (UID: 
\"c4ab8944-246e-407f-9ba0-78456103e6f4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kp92r" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.831422 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/4dc83eeb-0330-445a-a742-7c3537517f8d-machine-approver-tls\") pod \"machine-approver-56656f9798-mvwv6\" (UID: \"4dc83eeb-0330-445a-a742-7c3537517f8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mvwv6" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.831438 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.831456 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/bc6697b7-ff52-448b-9b29-e53eb649646d-audit\") pod \"apiserver-76f77b778f-78sc9\" (UID: \"bc6697b7-ff52-448b-9b29-e53eb649646d\") " pod="openshift-apiserver/apiserver-76f77b778f-78sc9" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.831474 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80f84439-4d54-461d-9522-5bca4858f5d4-serving-cert\") pod \"authentication-operator-69f744f599-c5hd7\" (UID: \"80f84439-4d54-461d-9522-5bca4858f5d4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-c5hd7" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.831492 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bc6697b7-ff52-448b-9b29-e53eb649646d-audit-dir\") pod \"apiserver-76f77b778f-78sc9\" (UID: \"bc6697b7-ff52-448b-9b29-e53eb649646d\") " pod="openshift-apiserver/apiserver-76f77b778f-78sc9" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.831508 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/641223ac-c0c5-43cd-83cf-feaef76d52e6-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-xnp9g\" (UID: \"641223ac-c0c5-43cd-83cf-feaef76d52e6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xnp9g" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.831524 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e57f38fd-b06b-447e-ad03-2a6fb918470b-console-serving-cert\") pod \"console-f9d7485db-rsm7h\" (UID: \"e57f38fd-b06b-447e-ad03-2a6fb918470b\") " pod="openshift-console/console-f9d7485db-rsm7h" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.831541 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ad8fb54-ab2a-423f-90ea-afc17b937e34-config\") pod \"etcd-operator-b45778765-fdcnn\" (UID: \"5ad8fb54-ab2a-423f-90ea-afc17b937e34\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fdcnn" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.831557 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cb8j9\" (UniqueName: \"kubernetes.io/projected/4dc83eeb-0330-445a-a742-7c3537517f8d-kube-api-access-cb8j9\") pod \"machine-approver-56656f9798-mvwv6\" (UID: \"4dc83eeb-0330-445a-a742-7c3537517f8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mvwv6" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.831576 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5ad8fb54-ab2a-423f-90ea-afc17b937e34-etcd-ca\") pod \"etcd-operator-b45778765-fdcnn\" (UID: \"5ad8fb54-ab2a-423f-90ea-afc17b937e34\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fdcnn" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.831594 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5ad8fb54-ab2a-423f-90ea-afc17b937e34-etcd-client\") pod \"etcd-operator-b45778765-fdcnn\" (UID: \"5ad8fb54-ab2a-423f-90ea-afc17b937e34\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fdcnn" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.831610 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80f84439-4d54-461d-9522-5bca4858f5d4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-c5hd7\" (UID: \"80f84439-4d54-461d-9522-5bca4858f5d4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-c5hd7" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.831628 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad12f838-4a9f-4b3d-9a33-1ca5bd556bb7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-spmwg\" (UID: \"ad12f838-4a9f-4b3d-9a33-1ca5bd556bb7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-spmwg" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.831644 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5ba5666-f89c-4a31-90a2-57654afd4ff8-trusted-ca\") pod \"console-operator-58897d9998-fbmpx\" (UID: \"c5ba5666-f89c-4a31-90a2-57654afd4ff8\") " pod="openshift-console-operator/console-operator-58897d9998-fbmpx" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.831670 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw6hp\" (UniqueName: \"kubernetes.io/projected/80f84439-4d54-461d-9522-5bca4858f5d4-kube-api-access-pw6hp\") pod \"authentication-operator-69f744f599-c5hd7\" (UID: \"80f84439-4d54-461d-9522-5bca4858f5d4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-c5hd7" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.831685 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5ba5666-f89c-4a31-90a2-57654afd4ff8-serving-cert\") pod \"console-operator-58897d9998-fbmpx\" (UID: \"c5ba5666-f89c-4a31-90a2-57654afd4ff8\") " pod="openshift-console-operator/console-operator-58897d9998-fbmpx" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.831701 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/84047603-03db-454f-ad69-7dee76bd4e0a-audit-dir\") pod 
\"apiserver-7bbb656c7d-djzqs\" (UID: \"84047603-03db-454f-ad69-7dee76bd4e0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-djzqs" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.831719 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0d7dd240-717f-4422-a4db-b38274db085e-metrics-tls\") pod \"dns-operator-744455d44c-85ch6\" (UID: \"0d7dd240-717f-4422-a4db-b38274db085e\") " pod="openshift-dns-operator/dns-operator-744455d44c-85ch6" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.831735 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.831752 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/84047603-03db-454f-ad69-7dee76bd4e0a-etcd-client\") pod \"apiserver-7bbb656c7d-djzqs\" (UID: \"84047603-03db-454f-ad69-7dee76bd4e0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-djzqs" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.831801 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bc6697b7-ff52-448b-9b29-e53eb649646d-etcd-serving-ca\") pod \"apiserver-76f77b778f-78sc9\" (UID: \"bc6697b7-ff52-448b-9b29-e53eb649646d\") " pod="openshift-apiserver/apiserver-76f77b778f-78sc9" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.831825 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/84047603-03db-454f-ad69-7dee76bd4e0a-encryption-config\") pod \"apiserver-7bbb656c7d-djzqs\" (UID: \"84047603-03db-454f-ad69-7dee76bd4e0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-djzqs" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.831845 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5c0a5a7-2852-4c58-8093-0d58c9354ae4-config\") pod \"openshift-apiserver-operator-796bbdcf4f-xh4rn\" (UID: \"f5c0a5a7-2852-4c58-8093-0d58c9354ae4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xh4rn" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.831863 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80f84439-4d54-461d-9522-5bca4858f5d4-config\") pod \"authentication-operator-69f744f599-c5hd7\" (UID: \"80f84439-4d54-461d-9522-5bca4858f5d4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-c5hd7" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.831878 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80f84439-4d54-461d-9522-5bca4858f5d4-service-ca-bundle\") pod \"authentication-operator-69f744f599-c5hd7\" (UID: \"80f84439-4d54-461d-9522-5bca4858f5d4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-c5hd7" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.831897 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/1c4ac72f-1389-403a-8cf1-567e4f6ac225-available-featuregates\") pod \"openshift-config-operator-7777fb866f-cnvbz\" (UID: \"1c4ac72f-1389-403a-8cf1-567e4f6ac225\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cnvbz" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.831977 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/84047603-03db-454f-ad69-7dee76bd4e0a-audit-policies\") pod \"apiserver-7bbb656c7d-djzqs\" (UID: \"84047603-03db-454f-ad69-7dee76bd4e0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-djzqs" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.831999 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84047603-03db-454f-ad69-7dee76bd4e0a-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-djzqs\" (UID: \"84047603-03db-454f-ad69-7dee76bd4e0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-djzqs" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.832242 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-874st\" (UniqueName: \"kubernetes.io/projected/84047603-03db-454f-ad69-7dee76bd4e0a-kube-api-access-874st\") pod \"apiserver-7bbb656c7d-djzqs\" (UID: \"84047603-03db-454f-ad69-7dee76bd4e0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-djzqs" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.832276 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb5xf\" (UniqueName: \"kubernetes.io/projected/f5c0a5a7-2852-4c58-8093-0d58c9354ae4-kube-api-access-gb5xf\") pod \"openshift-apiserver-operator-796bbdcf4f-xh4rn\" (UID: \"f5c0a5a7-2852-4c58-8093-0d58c9354ae4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xh4rn" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.832299 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0c9b5acf-ef6a-4bdd-ae32-582a80d711b5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-j5ztd\" (UID: \"0c9b5acf-ef6a-4bdd-ae32-582a80d711b5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-j5ztd" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.832373 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kp92r"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.832409 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4dc83eeb-0330-445a-a742-7c3537517f8d-auth-proxy-config\") pod \"machine-approver-56656f9798-mvwv6\" (UID: \"4dc83eeb-0330-445a-a742-7c3537517f8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mvwv6" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.832450 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbqhj\" (UniqueName: \"kubernetes.io/projected/5ad8fb54-ab2a-423f-90ea-afc17b937e34-kube-api-access-tbqhj\") pod \"etcd-operator-b45778765-fdcnn\" (UID: \"5ad8fb54-ab2a-423f-90ea-afc17b937e34\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fdcnn" Dec 05 11:51:00 
crc kubenswrapper[4763]: I1205 11:51:00.832467 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.832485 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc6697b7-ff52-448b-9b29-e53eb649646d-serving-cert\") pod \"apiserver-76f77b778f-78sc9\" (UID: \"bc6697b7-ff52-448b-9b29-e53eb649646d\") " pod="openshift-apiserver/apiserver-76f77b778f-78sc9" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.832518 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0c9b5acf-ef6a-4bdd-ae32-582a80d711b5-images\") pod \"machine-api-operator-5694c8668f-j5ztd\" (UID: \"0c9b5acf-ef6a-4bdd-ae32-582a80d711b5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-j5ztd" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.832537 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88-config\") pod \"controller-manager-879f6c89f-sptcn\" (UID: \"ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sptcn" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.832555 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-audit-dir\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.832574 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.832592 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc6697b7-ff52-448b-9b29-e53eb649646d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-78sc9\" (UID: \"bc6697b7-ff52-448b-9b29-e53eb649646d\") " pod="openshift-apiserver/apiserver-76f77b778f-78sc9" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.832612 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nsjc\" (UniqueName: \"kubernetes.io/projected/641223ac-c0c5-43cd-83cf-feaef76d52e6-kube-api-access-2nsjc\") pod \"openshift-controller-manager-operator-756b6f6bc6-xnp9g\" (UID: \"641223ac-c0c5-43cd-83cf-feaef76d52e6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xnp9g" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.832632 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.832651 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trr7c\" (UniqueName: \"kubernetes.io/projected/0c9b5acf-ef6a-4bdd-ae32-582a80d711b5-kube-api-access-trr7c\") pod \"machine-api-operator-5694c8668f-j5ztd\" (UID: \"0c9b5acf-ef6a-4bdd-ae32-582a80d711b5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-j5ztd" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.832670 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e57f38fd-b06b-447e-ad03-2a6fb918470b-oauth-serving-cert\") pod \"console-f9d7485db-rsm7h\" (UID: \"e57f38fd-b06b-447e-ad03-2a6fb918470b\") " pod="openshift-console/console-f9d7485db-rsm7h" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.832693 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-audit-policies\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.832711 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bc6697b7-ff52-448b-9b29-e53eb649646d-node-pullsecrets\") pod \"apiserver-76f77b778f-78sc9\" (UID: \"bc6697b7-ff52-448b-9b29-e53eb649646d\") " pod="openshift-apiserver/apiserver-76f77b778f-78sc9" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.832741 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4ab8944-246e-407f-9ba0-78456103e6f4-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-kp92r\" (UID: \"c4ab8944-246e-407f-9ba0-78456103e6f4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kp92r" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.832779 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-sptcn\" (UID: \"ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sptcn" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.832810 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c4ac72f-1389-403a-8cf1-567e4f6ac225-serving-cert\") pod \"openshift-config-operator-7777fb866f-cnvbz\" (UID: \"1c4ac72f-1389-403a-8cf1-567e4f6ac225\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cnvbz" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.832828 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqc8v\" (UniqueName: \"kubernetes.io/projected/1c4ac72f-1389-403a-8cf1-567e4f6ac225-kube-api-access-wqc8v\") pod 
\"openshift-config-operator-7777fb866f-cnvbz\" (UID: \"1c4ac72f-1389-403a-8cf1-567e4f6ac225\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cnvbz" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.832847 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.832866 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/bc6697b7-ff52-448b-9b29-e53eb649646d-image-import-ca\") pod \"apiserver-76f77b778f-78sc9\" (UID: \"bc6697b7-ff52-448b-9b29-e53eb649646d\") " pod="openshift-apiserver/apiserver-76f77b778f-78sc9" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.832885 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bf2c5\" (UniqueName: \"kubernetes.io/projected/ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88-kube-api-access-bf2c5\") pod \"controller-manager-879f6c89f-sptcn\" (UID: \"ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sptcn" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.832902 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e57f38fd-b06b-447e-ad03-2a6fb918470b-trusted-ca-bundle\") pod \"console-f9d7485db-rsm7h\" (UID: \"e57f38fd-b06b-447e-ad03-2a6fb918470b\") " pod="openshift-console/console-f9d7485db-rsm7h" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.832917 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad12f838-4a9f-4b3d-9a33-1ca5bd556bb7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-spmwg\" (UID: \"ad12f838-4a9f-4b3d-9a33-1ca5bd556bb7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-spmwg" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.832935 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ad8fb54-ab2a-423f-90ea-afc17b937e34-serving-cert\") pod \"etcd-operator-b45778765-fdcnn\" (UID: \"5ad8fb54-ab2a-423f-90ea-afc17b937e34\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fdcnn" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.832955 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.832972 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-cnvbz"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.832972 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.833021 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e57f38fd-b06b-447e-ad03-2a6fb918470b-service-ca\") pod \"console-f9d7485db-rsm7h\" (UID: \"e57f38fd-b06b-447e-ad03-2a6fb918470b\") " pod="openshift-console/console-f9d7485db-rsm7h" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.833040 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5ba5666-f89c-4a31-90a2-57654afd4ff8-config\") pod \"console-operator-58897d9998-fbmpx\" (UID: \"c5ba5666-f89c-4a31-90a2-57654afd4ff8\") " pod="openshift-console-operator/console-operator-58897d9998-fbmpx" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.833070 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc8vc\" (UniqueName: \"kubernetes.io/projected/bc6697b7-ff52-448b-9b29-e53eb649646d-kube-api-access-vc8vc\") pod \"apiserver-76f77b778f-78sc9\" (UID: \"bc6697b7-ff52-448b-9b29-e53eb649646d\") " pod="openshift-apiserver/apiserver-76f77b778f-78sc9" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.833091 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e57f38fd-b06b-447e-ad03-2a6fb918470b-console-oauth-config\") pod \"console-f9d7485db-rsm7h\" (UID: \"e57f38fd-b06b-447e-ad03-2a6fb918470b\") " pod="openshift-console/console-f9d7485db-rsm7h" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.833108 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qqrj\" (UniqueName: \"kubernetes.io/projected/e57f38fd-b06b-447e-ad03-2a6fb918470b-kube-api-access-7qqrj\") pod \"console-f9d7485db-rsm7h\" (UID: \"e57f38fd-b06b-447e-ad03-2a6fb918470b\") " pod="openshift-console/console-f9d7485db-rsm7h" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.833125 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84047603-03db-454f-ad69-7dee76bd4e0a-serving-cert\") pod \"apiserver-7bbb656c7d-djzqs\" (UID: \"84047603-03db-454f-ad69-7dee76bd4e0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-djzqs" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.833141 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/84047603-03db-454f-ad69-7dee76bd4e0a-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-djzqs\" (UID: \"84047603-03db-454f-ad69-7dee76bd4e0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-djzqs" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.833162 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5ad8fb54-ab2a-423f-90ea-afc17b937e34-etcd-service-ca\") pod \"etcd-operator-b45778765-fdcnn\" (UID: \"5ad8fb54-ab2a-423f-90ea-afc17b937e34\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fdcnn" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.833178 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bc6697b7-ff52-448b-9b29-e53eb649646d-etcd-client\") pod \"apiserver-76f77b778f-78sc9\" (UID: \"bc6697b7-ff52-448b-9b29-e53eb649646d\") " pod="openshift-apiserver/apiserver-76f77b778f-78sc9" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.833196 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88-client-ca\") pod \"controller-manager-879f6c89f-sptcn\" (UID: \"ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sptcn" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.833215 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg798\" (UniqueName: \"kubernetes.io/projected/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-kube-api-access-jg798\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.833232 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88-serving-cert\") pod \"controller-manager-879f6c89f-sptcn\" (UID: \"ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sptcn" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.833251 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5c0a5a7-2852-4c58-8093-0d58c9354ae4-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-xh4rn\" (UID: \"f5c0a5a7-2852-4c58-8093-0d58c9354ae4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xh4rn" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.833269 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.833287 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc6697b7-ff52-448b-9b29-e53eb649646d-config\") pod \"apiserver-76f77b778f-78sc9\" (UID: \"bc6697b7-ff52-448b-9b29-e53eb649646d\") " pod="openshift-apiserver/apiserver-76f77b778f-78sc9" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.833347 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c4ab8944-246e-407f-9ba0-78456103e6f4-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-kp92r\" (UID: \"c4ab8944-246e-407f-9ba0-78456103e6f4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kp92r" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.833365 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg5b2\" (UniqueName: \"kubernetes.io/projected/0d7dd240-717f-4422-a4db-b38274db085e-kube-api-access-bg5b2\") pod 
\"dns-operator-744455d44c-85ch6\" (UID: \"0d7dd240-717f-4422-a4db-b38274db085e\") " pod="openshift-dns-operator/dns-operator-744455d44c-85ch6" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.833382 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e57f38fd-b06b-447e-ad03-2a6fb918470b-console-config\") pod \"console-f9d7485db-rsm7h\" (UID: \"e57f38fd-b06b-447e-ad03-2a6fb918470b\") " pod="openshift-console/console-f9d7485db-rsm7h" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.833400 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.833417 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad12f838-4a9f-4b3d-9a33-1ca5bd556bb7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-spmwg\" (UID: \"ad12f838-4a9f-4b3d-9a33-1ca5bd556bb7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-spmwg" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.833856 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.834294 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dc83eeb-0330-445a-a742-7c3537517f8d-config\") pod \"machine-approver-56656f9798-mvwv6\" (UID: \"4dc83eeb-0330-445a-a742-7c3537517f8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mvwv6" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.834724 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c9b5acf-ef6a-4bdd-ae32-582a80d711b5-config\") pod \"machine-api-operator-5694c8668f-j5ztd\" (UID: \"0c9b5acf-ef6a-4bdd-ae32-582a80d711b5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-j5ztd" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.834955 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z2mgd"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.835895 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/641223ac-c0c5-43cd-83cf-feaef76d52e6-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-xnp9g\" (UID: \"641223ac-c0c5-43cd-83cf-feaef76d52e6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xnp9g" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.835974 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e57f38fd-b06b-447e-ad03-2a6fb918470b-service-ca\") pod \"console-f9d7485db-rsm7h\" (UID: 
\"e57f38fd-b06b-447e-ad03-2a6fb918470b\") " pod="openshift-console/console-f9d7485db-rsm7h" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.836305 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bc6697b7-ff52-448b-9b29-e53eb649646d-audit-dir\") pod \"apiserver-76f77b778f-78sc9\" (UID: \"bc6697b7-ff52-448b-9b29-e53eb649646d\") " pod="openshift-apiserver/apiserver-76f77b778f-78sc9" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.839321 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ad8fb54-ab2a-423f-90ea-afc17b937e34-config\") pod \"etcd-operator-b45778765-fdcnn\" (UID: \"5ad8fb54-ab2a-423f-90ea-afc17b937e34\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fdcnn" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.839346 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5ba5666-f89c-4a31-90a2-57654afd4ff8-config\") pod \"console-operator-58897d9998-fbmpx\" (UID: \"c5ba5666-f89c-4a31-90a2-57654afd4ff8\") " pod="openshift-console-operator/console-operator-58897d9998-fbmpx" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.839395 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/4dc83eeb-0330-445a-a742-7c3537517f8d-machine-approver-tls\") pod \"machine-approver-56656f9798-mvwv6\" (UID: \"4dc83eeb-0330-445a-a742-7c3537517f8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mvwv6" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.839966 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bc6697b7-ff52-448b-9b29-e53eb649646d-etcd-serving-ca\") pod \"apiserver-76f77b778f-78sc9\" (UID: \"bc6697b7-ff52-448b-9b29-e53eb649646d\") " pod="openshift-apiserver/apiserver-76f77b778f-78sc9" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.840039 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/bc6697b7-ff52-448b-9b29-e53eb649646d-audit\") pod \"apiserver-76f77b778f-78sc9\" (UID: \"bc6697b7-ff52-448b-9b29-e53eb649646d\") " pod="openshift-apiserver/apiserver-76f77b778f-78sc9" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.840326 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.840398 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.840599 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/bc6697b7-ff52-448b-9b29-e53eb649646d-node-pullsecrets\") pod \"apiserver-76f77b778f-78sc9\" (UID: \"bc6697b7-ff52-448b-9b29-e53eb649646d\") " pod="openshift-apiserver/apiserver-76f77b778f-78sc9" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.841484 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/84047603-03db-454f-ad69-7dee76bd4e0a-encryption-config\") pod \"apiserver-7bbb656c7d-djzqs\" (UID: \"84047603-03db-454f-ad69-7dee76bd4e0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-djzqs" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.841688 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5c0a5a7-2852-4c58-8093-0d58c9354ae4-config\") pod \"openshift-apiserver-operator-796bbdcf4f-xh4rn\" (UID: \"f5c0a5a7-2852-4c58-8093-0d58c9354ae4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xh4rn" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.842633 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/1c4ac72f-1389-403a-8cf1-567e4f6ac225-available-featuregates\") pod \"openshift-config-operator-7777fb866f-cnvbz\" (UID: \"1c4ac72f-1389-403a-8cf1-567e4f6ac225\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cnvbz" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.843376 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/84047603-03db-454f-ad69-7dee76bd4e0a-audit-policies\") pod \"apiserver-7bbb656c7d-djzqs\" (UID: \"84047603-03db-454f-ad69-7dee76bd4e0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-djzqs" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.843429 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/84047603-03db-454f-ad69-7dee76bd4e0a-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-djzqs\" (UID: \"84047603-03db-454f-ad69-7dee76bd4e0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-djzqs" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.843973 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc6697b7-ff52-448b-9b29-e53eb649646d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-78sc9\" (UID: \"bc6697b7-ff52-448b-9b29-e53eb649646d\") " pod="openshift-apiserver/apiserver-76f77b778f-78sc9" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.844028 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84047603-03db-454f-ad69-7dee76bd4e0a-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-djzqs\" (UID: \"84047603-03db-454f-ad69-7dee76bd4e0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-djzqs" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.844358 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0c9b5acf-ef6a-4bdd-ae32-582a80d711b5-images\") pod \"machine-api-operator-5694c8668f-j5ztd\" (UID: \"0c9b5acf-ef6a-4bdd-ae32-582a80d711b5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-j5ztd" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.844814 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e57f38fd-b06b-447e-ad03-2a6fb918470b-console-oauth-config\") pod \"console-f9d7485db-rsm7h\" (UID: \"e57f38fd-b06b-447e-ad03-2a6fb918470b\") " pod="openshift-console/console-f9d7485db-rsm7h" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.844883 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4dc83eeb-0330-445a-a742-7c3537517f8d-auth-proxy-config\") pod \"machine-approver-56656f9798-mvwv6\" (UID: \"4dc83eeb-0330-445a-a742-7c3537517f8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mvwv6" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.845378 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/641223ac-c0c5-43cd-83cf-feaef76d52e6-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-xnp9g\" (UID: \"641223ac-c0c5-43cd-83cf-feaef76d52e6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xnp9g" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.845725 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e57f38fd-b06b-447e-ad03-2a6fb918470b-oauth-serving-cert\") pod \"console-f9d7485db-rsm7h\" (UID: \"e57f38fd-b06b-447e-ad03-2a6fb918470b\") " pod="openshift-console/console-f9d7485db-rsm7h" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.845921 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88-config\") pod \"controller-manager-879f6c89f-sptcn\" (UID: \"ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sptcn" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.846351 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84047603-03db-454f-ad69-7dee76bd4e0a-serving-cert\") pod \"apiserver-7bbb656c7d-djzqs\" (UID: \"84047603-03db-454f-ad69-7dee76bd4e0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-djzqs" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.846959 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bc6697b7-ff52-448b-9b29-e53eb649646d-encryption-config\") pod \"apiserver-76f77b778f-78sc9\" (UID: \"bc6697b7-ff52-448b-9b29-e53eb649646d\") " pod="openshift-apiserver/apiserver-76f77b778f-78sc9" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.847233 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-audit-dir\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.847321 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5ad8fb54-ab2a-423f-90ea-afc17b937e34-etcd-ca\") pod \"etcd-operator-b45778765-fdcnn\" (UID: \"5ad8fb54-ab2a-423f-90ea-afc17b937e34\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fdcnn" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 
11:51:00.847838 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/84047603-03db-454f-ad69-7dee76bd4e0a-audit-dir\") pod \"apiserver-7bbb656c7d-djzqs\" (UID: \"84047603-03db-454f-ad69-7dee76bd4e0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-djzqs" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.848648 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-audit-policies\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.849115 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/bc6697b7-ff52-448b-9b29-e53eb649646d-image-import-ca\") pod \"apiserver-76f77b778f-78sc9\" (UID: \"bc6697b7-ff52-448b-9b29-e53eb649646d\") " pod="openshift-apiserver/apiserver-76f77b778f-78sc9" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.849239 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e57f38fd-b06b-447e-ad03-2a6fb918470b-console-serving-cert\") pod \"console-f9d7485db-rsm7h\" (UID: \"e57f38fd-b06b-447e-ad03-2a6fb918470b\") " pod="openshift-console/console-f9d7485db-rsm7h" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.849504 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.849648 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.850951 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.851105 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc6697b7-ff52-448b-9b29-e53eb649646d-serving-cert\") pod \"apiserver-76f77b778f-78sc9\" (UID: \"bc6697b7-ff52-448b-9b29-e53eb649646d\") " pod="openshift-apiserver/apiserver-76f77b778f-78sc9" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.851483 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0d7dd240-717f-4422-a4db-b38274db085e-metrics-tls\") pod \"dns-operator-744455d44c-85ch6\" (UID: 
\"0d7dd240-717f-4422-a4db-b38274db085e\") " pod="openshift-dns-operator/dns-operator-744455d44c-85ch6" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.851618 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88-serving-cert\") pod \"controller-manager-879f6c89f-sptcn\" (UID: \"ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sptcn" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.851665 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.853086 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5ad8fb54-ab2a-423f-90ea-afc17b937e34-etcd-service-ca\") pod \"etcd-operator-b45778765-fdcnn\" (UID: \"5ad8fb54-ab2a-423f-90ea-afc17b937e34\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fdcnn" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.853793 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc6697b7-ff52-448b-9b29-e53eb649646d-config\") pod \"apiserver-76f77b778f-78sc9\" (UID: \"bc6697b7-ff52-448b-9b29-e53eb649646d\") " pod="openshift-apiserver/apiserver-76f77b778f-78sc9" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.854052 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e57f38fd-b06b-447e-ad03-2a6fb918470b-console-config\") pod \"console-f9d7485db-rsm7h\" (UID: \"e57f38fd-b06b-447e-ad03-2a6fb918470b\") " pod="openshift-console/console-f9d7485db-rsm7h" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.854089 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-sptcn\" (UID: \"ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sptcn" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.854480 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5ad8fb54-ab2a-423f-90ea-afc17b937e34-etcd-client\") pod \"etcd-operator-b45778765-fdcnn\" (UID: \"5ad8fb54-ab2a-423f-90ea-afc17b937e34\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fdcnn" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.854879 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5ba5666-f89c-4a31-90a2-57654afd4ff8-serving-cert\") pod \"console-operator-58897d9998-fbmpx\" (UID: \"c5ba5666-f89c-4a31-90a2-57654afd4ff8\") " pod="openshift-console-operator/console-operator-58897d9998-fbmpx" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.854934 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.855633 4763 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5ba5666-f89c-4a31-90a2-57654afd4ff8-trusted-ca\") pod \"console-operator-58897d9998-fbmpx\" (UID: \"c5ba5666-f89c-4a31-90a2-57654afd4ff8\") " pod="openshift-console-operator/console-operator-58897d9998-fbmpx" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.857160 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ad8fb54-ab2a-423f-90ea-afc17b937e34-serving-cert\") pod \"etcd-operator-b45778765-fdcnn\" (UID: \"5ad8fb54-ab2a-423f-90ea-afc17b937e34\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fdcnn" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.857174 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88-client-ca\") pod \"controller-manager-879f6c89f-sptcn\" (UID: \"ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sptcn" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.857235 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-fvw7g"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.857504 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.857570 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/84047603-03db-454f-ad69-7dee76bd4e0a-etcd-client\") pod \"apiserver-7bbb656c7d-djzqs\" (UID: \"84047603-03db-454f-ad69-7dee76bd4e0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-djzqs" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.857906 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0c9b5acf-ef6a-4bdd-ae32-582a80d711b5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-j5ztd\" (UID: \"0c9b5acf-ef6a-4bdd-ae32-582a80d711b5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-j5ztd" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.857924 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bc6697b7-ff52-448b-9b29-e53eb649646d-etcd-client\") pod \"apiserver-76f77b778f-78sc9\" (UID: \"bc6697b7-ff52-448b-9b29-e53eb649646d\") " pod="openshift-apiserver/apiserver-76f77b778f-78sc9" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.858111 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.858415 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/e57f38fd-b06b-447e-ad03-2a6fb918470b-trusted-ca-bundle\") pod \"console-f9d7485db-rsm7h\" (UID: \"e57f38fd-b06b-447e-ad03-2a6fb918470b\") " pod="openshift-console/console-f9d7485db-rsm7h" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.859797 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.859922 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.860207 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-spmwg"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.860803 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5c0a5a7-2852-4c58-8093-0d58c9354ae4-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-xh4rn\" (UID: \"f5c0a5a7-2852-4c58-8093-0d58c9354ae4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xh4rn" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.861961 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.864089 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hjvs8"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.864752 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c4ac72f-1389-403a-8cf1-567e4f6ac225-serving-cert\") pod \"openshift-config-operator-7777fb866f-cnvbz\" (UID: \"1c4ac72f-1389-403a-8cf1-567e4f6ac225\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cnvbz" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.866030 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-fbmpx"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.867501 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-rsm7h"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.869135 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zssnz"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.870483 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-hv45j"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.871540 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5949w"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.873349 4763 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-machine-config-operator/machine-config-server-656fj"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.873492 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-5949w" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.874668 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vjk5l"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.874887 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-656fj" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.874977 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-nwf6f"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.876015 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7dzw7"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.878204 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wmpg"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.880028 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.880342 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7gb2n"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.882508 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z52fw"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.889642 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wkvgk"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.891143 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-rf8kp"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.892574 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vzzqs"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.894019 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415585-hp78p"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.895403 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-knjp7"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.896619 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5949w"] Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.900237 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.919985 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.931119 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c4ab8944-246e-407f-9ba0-78456103e6f4-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-kp92r\" (UID: \"c4ab8944-246e-407f-9ba0-78456103e6f4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kp92r" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.941647 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.946886 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4ab8944-246e-407f-9ba0-78456103e6f4-config\") pod \"kube-controller-manager-operator-78b949d7b-kp92r\" (UID: \"c4ab8944-246e-407f-9ba0-78456103e6f4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kp92r" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.959720 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 05 11:51:00 crc kubenswrapper[4763]: I1205 11:51:00.999280 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.025205 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.040044 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.045142 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad12f838-4a9f-4b3d-9a33-1ca5bd556bb7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-spmwg\" (UID: \"ad12f838-4a9f-4b3d-9a33-1ca5bd556bb7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-spmwg" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.059011 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.064243 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad12f838-4a9f-4b3d-9a33-1ca5bd556bb7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-spmwg\" (UID: \"ad12f838-4a9f-4b3d-9a33-1ca5bd556bb7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-spmwg" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.079025 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.099328 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.119021 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.139561 4763 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.158553 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.179442 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.199445 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.219296 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.239832 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.259430 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.279383 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.317318 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4hkr\" (UniqueName: \"kubernetes.io/projected/96fb350e-8c37-4e8d-8233-8c3ecfac9935-kube-api-access-k4hkr\") pod \"route-controller-manager-6576b87f9c-7h6vz\" (UID: \"96fb350e-8c37-4e8d-8233-8c3ecfac9935\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7h6vz" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.339194 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.360134 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.379692 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.404576 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.419328 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.440021 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.460057 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.479368 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.499847 4763 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.513615 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7h6vz" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.519140 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.539896 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.560327 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.580415 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.606351 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.619321 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.640220 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.659589 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.679606 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.688161 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7h6vz"] Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.699227 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.719018 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.739861 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.759879 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.778044 4763 request.go:700] Waited for 1.011657408s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress/secrets?fieldSelector=metadata.name%3Drouter-certs-default&limit=500&resourceVersion=0 Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.779754 4763 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-certs-default" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.798786 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.819902 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 05 11:51:01 crc kubenswrapper[4763]: E1205 11:51:01.839022 4763 secret.go:188] Couldn't get secret openshift-authentication-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 05 11:51:01 crc kubenswrapper[4763]: E1205 11:51:01.839176 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80f84439-4d54-461d-9522-5bca4858f5d4-serving-cert podName:80f84439-4d54-461d-9522-5bca4858f5d4 nodeName:}" failed. No retries permitted until 2025-12-05 11:51:02.339136769 +0000 UTC m=+146.831851522 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/80f84439-4d54-461d-9522-5bca4858f5d4-serving-cert") pod "authentication-operator-69f744f599-c5hd7" (UID: "80f84439-4d54-461d-9522-5bca4858f5d4") : failed to sync secret cache: timed out waiting for the condition Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.840520 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 05 11:51:01 crc kubenswrapper[4763]: E1205 11:51:01.843207 4763 configmap.go:193] Couldn't get configMap openshift-authentication-operator/authentication-operator-config: failed to sync configmap cache: timed out waiting for the condition Dec 05 11:51:01 crc kubenswrapper[4763]: E1205 11:51:01.843213 4763 configmap.go:193] Couldn't get configMap openshift-authentication-operator/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Dec 05 11:51:01 crc kubenswrapper[4763]: E1205 11:51:01.843361 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/80f84439-4d54-461d-9522-5bca4858f5d4-config podName:80f84439-4d54-461d-9522-5bca4858f5d4 nodeName:}" failed. No retries permitted until 2025-12-05 11:51:02.343318248 +0000 UTC m=+146.836032971 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/80f84439-4d54-461d-9522-5bca4858f5d4-config") pod "authentication-operator-69f744f599-c5hd7" (UID: "80f84439-4d54-461d-9522-5bca4858f5d4") : failed to sync configmap cache: timed out waiting for the condition Dec 05 11:51:01 crc kubenswrapper[4763]: E1205 11:51:01.843412 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/80f84439-4d54-461d-9522-5bca4858f5d4-service-ca-bundle podName:80f84439-4d54-461d-9522-5bca4858f5d4 nodeName:}" failed. No retries permitted until 2025-12-05 11:51:02.343383879 +0000 UTC m=+146.836098632 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/80f84439-4d54-461d-9522-5bca4858f5d4-service-ca-bundle") pod "authentication-operator-69f744f599-c5hd7" (UID: "80f84439-4d54-461d-9522-5bca4858f5d4") : failed to sync configmap cache: timed out waiting for the condition Dec 05 11:51:01 crc kubenswrapper[4763]: E1205 11:51:01.853038 4763 configmap.go:193] Couldn't get configMap openshift-authentication-operator/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Dec 05 11:51:01 crc kubenswrapper[4763]: E1205 11:51:01.853190 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/80f84439-4d54-461d-9522-5bca4858f5d4-trusted-ca-bundle podName:80f84439-4d54-461d-9522-5bca4858f5d4 nodeName:}" failed. No retries permitted until 2025-12-05 11:51:02.353145007 +0000 UTC m=+146.845859830 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/80f84439-4d54-461d-9522-5bca4858f5d4-trusted-ca-bundle") pod "authentication-operator-69f744f599-c5hd7" (UID: "80f84439-4d54-461d-9522-5bca4858f5d4") : failed to sync configmap cache: timed out waiting for the condition Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.859893 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.880342 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.900284 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.919455 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.940436 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.960745 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.979078 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 05 11:51:01 crc kubenswrapper[4763]: I1205 11:51:01.999269 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.019797 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.043963 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.060188 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.079085 4763 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.098891 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.119070 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.139988 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.159204 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.179748 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.199606 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.219459 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.240016 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.259418 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.279838 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.299361 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.319822 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.339527 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.356877 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80f84439-4d54-461d-9522-5bca4858f5d4-serving-cert\") pod \"authentication-operator-69f744f599-c5hd7\" (UID: \"80f84439-4d54-461d-9522-5bca4858f5d4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-c5hd7" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.356948 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80f84439-4d54-461d-9522-5bca4858f5d4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-c5hd7\" (UID: \"80f84439-4d54-461d-9522-5bca4858f5d4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-c5hd7" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.357030 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/80f84439-4d54-461d-9522-5bca4858f5d4-config\") pod \"authentication-operator-69f744f599-c5hd7\" (UID: \"80f84439-4d54-461d-9522-5bca4858f5d4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-c5hd7" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.357068 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80f84439-4d54-461d-9522-5bca4858f5d4-service-ca-bundle\") pod \"authentication-operator-69f744f599-c5hd7\" (UID: \"80f84439-4d54-461d-9522-5bca4858f5d4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-c5hd7" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.359477 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.378660 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.400082 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.419643 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.439224 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.459748 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.479146 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.497221 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7h6vz" event={"ID":"96fb350e-8c37-4e8d-8233-8c3ecfac9935","Type":"ContainerStarted","Data":"011a3f65814b81db0569e60aa4b8cc8471e8a166f23aba7dd26795af9aa8bc32"} Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.497268 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7h6vz" event={"ID":"96fb350e-8c37-4e8d-8233-8c3ecfac9935","Type":"ContainerStarted","Data":"1dd15b883f8ea4c7d315883fb5c3fea9cc2bef47b70d343d2e332e7d90b6452f"} Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.497593 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7h6vz" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.499201 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.514272 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7h6vz" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.518617 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.539570 4763 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.576274 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb8j9\" (UniqueName: \"kubernetes.io/projected/4dc83eeb-0330-445a-a742-7c3537517f8d-kube-api-access-cb8j9\") pod \"machine-approver-56656f9798-mvwv6\" (UID: \"4dc83eeb-0330-445a-a742-7c3537517f8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mvwv6" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.593574 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw6c7\" (UniqueName: \"kubernetes.io/projected/c5ba5666-f89c-4a31-90a2-57654afd4ff8-kube-api-access-hw6c7\") pod \"console-operator-58897d9998-fbmpx\" (UID: \"c5ba5666-f89c-4a31-90a2-57654afd4ff8\") " pod="openshift-console-operator/console-operator-58897d9998-fbmpx" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.612840 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbqhj\" (UniqueName: \"kubernetes.io/projected/5ad8fb54-ab2a-423f-90ea-afc17b937e34-kube-api-access-tbqhj\") pod \"etcd-operator-b45778765-fdcnn\" (UID: \"5ad8fb54-ab2a-423f-90ea-afc17b937e34\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fdcnn" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.628271 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-fbmpx" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.640102 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-874st\" (UniqueName: \"kubernetes.io/projected/84047603-03db-454f-ad69-7dee76bd4e0a-kube-api-access-874st\") pod \"apiserver-7bbb656c7d-djzqs\" (UID: \"84047603-03db-454f-ad69-7dee76bd4e0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-djzqs" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.662630 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg798\" (UniqueName: \"kubernetes.io/projected/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-kube-api-access-jg798\") pod \"oauth-openshift-558db77b4-wndr8\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.677364 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc8vc\" (UniqueName: \"kubernetes.io/projected/bc6697b7-ff52-448b-9b29-e53eb649646d-kube-api-access-vc8vc\") pod \"apiserver-76f77b778f-78sc9\" (UID: \"bc6697b7-ff52-448b-9b29-e53eb649646d\") " pod="openshift-apiserver/apiserver-76f77b778f-78sc9" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.692284 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb5xf\" (UniqueName: \"kubernetes.io/projected/f5c0a5a7-2852-4c58-8093-0d58c9354ae4-kube-api-access-gb5xf\") pod \"openshift-apiserver-operator-796bbdcf4f-xh4rn\" (UID: \"f5c0a5a7-2852-4c58-8093-0d58c9354ae4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xh4rn" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.711526 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-fdcnn" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.715216 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c4ab8944-246e-407f-9ba0-78456103e6f4-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-kp92r\" (UID: \"c4ab8944-246e-407f-9ba0-78456103e6f4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kp92r" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.727622 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kp92r" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.732635 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nsjc\" (UniqueName: \"kubernetes.io/projected/641223ac-c0c5-43cd-83cf-feaef76d52e6-kube-api-access-2nsjc\") pod \"openshift-controller-manager-operator-756b6f6bc6-xnp9g\" (UID: \"641223ac-c0c5-43cd-83cf-feaef76d52e6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xnp9g" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.751233 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-78sc9" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.763276 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trr7c\" (UniqueName: \"kubernetes.io/projected/0c9b5acf-ef6a-4bdd-ae32-582a80d711b5-kube-api-access-trr7c\") pod \"machine-api-operator-5694c8668f-j5ztd\" (UID: \"0c9b5acf-ef6a-4bdd-ae32-582a80d711b5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-j5ztd" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.778074 4763 request.go:700] Waited for 1.930867683s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/serviceaccounts/openshift-kube-scheduler-operator/token Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.797827 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad12f838-4a9f-4b3d-9a33-1ca5bd556bb7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-spmwg\" (UID: \"ad12f838-4a9f-4b3d-9a33-1ca5bd556bb7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-spmwg" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.817221 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf2c5\" (UniqueName: \"kubernetes.io/projected/ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88-kube-api-access-bf2c5\") pod \"controller-manager-879f6c89f-sptcn\" (UID: \"ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sptcn" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.832650 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-fbmpx"] Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.840161 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qqrj\" (UniqueName: \"kubernetes.io/projected/e57f38fd-b06b-447e-ad03-2a6fb918470b-kube-api-access-7qqrj\") pod \"console-f9d7485db-rsm7h\" (UID: 
\"e57f38fd-b06b-447e-ad03-2a6fb918470b\") " pod="openshift-console/console-f9d7485db-rsm7h" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.855479 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg5b2\" (UniqueName: \"kubernetes.io/projected/0d7dd240-717f-4422-a4db-b38274db085e-kube-api-access-bg5b2\") pod \"dns-operator-744455d44c-85ch6\" (UID: \"0d7dd240-717f-4422-a4db-b38274db085e\") " pod="openshift-dns-operator/dns-operator-744455d44c-85ch6" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.872272 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mvwv6" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.875353 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqc8v\" (UniqueName: \"kubernetes.io/projected/1c4ac72f-1389-403a-8cf1-567e4f6ac225-kube-api-access-wqc8v\") pod \"openshift-config-operator-7777fb866f-cnvbz\" (UID: \"1c4ac72f-1389-403a-8cf1-567e4f6ac225\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cnvbz" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.879615 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.896346 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xh4rn" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.898949 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.907474 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-djzqs" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.912489 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fdcnn"] Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.915054 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xnp9g" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.921788 4763 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.940209 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.946166 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.961820 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.971011 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kp92r"] Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.981417 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 05 11:51:02 crc kubenswrapper[4763]: W1205 11:51:02.988852 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ad8fb54_ab2a_423f_90ea_afc17b937e34.slice/crio-b6b29bb0854fad0698d6c1d5a129780e25025686aabc381b309c6f13c1f74304 WatchSource:0}: Error finding container b6b29bb0854fad0698d6c1d5a129780e25025686aabc381b309c6f13c1f74304: Status 404 returned error can't find the container with id b6b29bb0854fad0698d6c1d5a129780e25025686aabc381b309c6f13c1f74304 Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.995026 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cnvbz" Dec 05 11:51:02 crc kubenswrapper[4763]: I1205 11:51:02.998871 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-78sc9"] Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.003070 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rsm7h" Dec 05 11:51:03 crc kubenswrapper[4763]: W1205 11:51:03.012394 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc6697b7_ff52_448b_9b29_e53eb649646d.slice/crio-42172108c00db24d8e69e41a55d63503b3a74922adb7663ec29f873143603851 WatchSource:0}: Error finding container 42172108c00db24d8e69e41a55d63503b3a74922adb7663ec29f873143603851: Status 404 returned error can't find the container with id 42172108c00db24d8e69e41a55d63503b3a74922adb7663ec29f873143603851 Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.040165 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.049278 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-spmwg" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.058142 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-j5ztd" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.060405 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.065994 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aa31a254-af8d-4f9f-b22d-1844d7d60382-registry-certificates\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.066041 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aa31a254-af8d-4f9f-b22d-1844d7d60382-registry-tls\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.066084 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p4jp\" (UniqueName: \"kubernetes.io/projected/aa31a254-af8d-4f9f-b22d-1844d7d60382-kube-api-access-6p4jp\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.066131 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.066151 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aa31a254-af8d-4f9f-b22d-1844d7d60382-ca-trust-extracted\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.066169 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aa31a254-af8d-4f9f-b22d-1844d7d60382-installation-pull-secrets\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.066210 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aa31a254-af8d-4f9f-b22d-1844d7d60382-bound-sa-token\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.066231 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/aa31a254-af8d-4f9f-b22d-1844d7d60382-trusted-ca\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:03 crc kubenswrapper[4763]: E1205 11:51:03.066528 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 11:51:03.56651695 +0000 UTC m=+148.059231673 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.068192 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80f84439-4d54-461d-9522-5bca4858f5d4-service-ca-bundle\") pod \"authentication-operator-69f744f599-c5hd7\" (UID: \"80f84439-4d54-461d-9522-5bca4858f5d4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-c5hd7" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.079813 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.092029 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80f84439-4d54-461d-9522-5bca4858f5d4-serving-cert\") pod \"authentication-operator-69f744f599-c5hd7\" (UID: \"80f84439-4d54-461d-9522-5bca4858f5d4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-c5hd7" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.104673 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xnp9g"] Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.109042 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sptcn" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.109402 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.112340 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80f84439-4d54-461d-9522-5bca4858f5d4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-c5hd7\" (UID: \"80f84439-4d54-461d-9522-5bca4858f5d4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-c5hd7" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.121263 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.128927 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80f84439-4d54-461d-9522-5bca4858f5d4-config\") pod \"authentication-operator-69f744f599-c5hd7\" (UID: \"80f84439-4d54-461d-9522-5bca4858f5d4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-c5hd7" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.141469 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.145651 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-85ch6" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.164450 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw6hp\" (UniqueName: \"kubernetes.io/projected/80f84439-4d54-461d-9522-5bca4858f5d4-kube-api-access-pw6hp\") pod \"authentication-operator-69f744f599-c5hd7\" (UID: \"80f84439-4d54-461d-9522-5bca4858f5d4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-c5hd7" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.167884 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.168062 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cea9ce07-369d-4ea5-a2ae-c77eeeaef7da-metrics-certs\") pod \"router-default-5444994796-f52fv\" (UID: \"cea9ce07-369d-4ea5-a2ae-c77eeeaef7da\") " pod="openshift-ingress/router-default-5444994796-f52fv" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.168089 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fda8546a-e13c-4450-9faa-a0e0fcacbfa1-config-volume\") pod \"collect-profiles-29415585-hp78p\" (UID: \"fda8546a-e13c-4450-9faa-a0e0fcacbfa1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415585-hp78p" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.168184 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/65140763-5309-4eb6-a4e7-090b57b27744-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-ww8jh\" (UID: \"65140763-5309-4eb6-a4e7-090b57b27744\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ww8jh" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.168211 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt24n\" (UniqueName: \"kubernetes.io/projected/24bf87f0-b1a2-47c8-9800-862e15c3f4cf-kube-api-access-xt24n\") pod \"machine-config-controller-84d6567774-46l22\" (UID: \"24bf87f0-b1a2-47c8-9800-862e15c3f4cf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-46l22" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.168231 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/349a6e10-88df-46fa-b81d-439574540d28-tmpfs\") pod \"packageserver-d55dfcdfc-wkvgk\" (UID: \"349a6e10-88df-46fa-b81d-439574540d28\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wkvgk" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.168263 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fbca11a-128a-4ea3-b5b9-3b9f596b887d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zssnz\" (UID: \"5fbca11a-128a-4ea3-b5b9-3b9f596b887d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zssnz" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.168282 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/65140763-5309-4eb6-a4e7-090b57b27744-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-ww8jh\" (UID: \"65140763-5309-4eb6-a4e7-090b57b27744\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ww8jh" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.168303 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5411b17-490f-40ba-be1d-1ca72c18cdb4-metrics-tls\") pod \"ingress-operator-5b745b69d9-fvw7g\" (UID: \"f5411b17-490f-40ba-be1d-1ca72c18cdb4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fvw7g" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.168321 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8fdad3ac-2317-4b90-a9ba-cd28cf492c96-signing-key\") pod \"service-ca-9c57cc56f-nwf6f\" (UID: \"8fdad3ac-2317-4b90-a9ba-cd28cf492c96\") " pod="openshift-service-ca/service-ca-9c57cc56f-nwf6f" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.168371 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/24bf87f0-b1a2-47c8-9800-862e15c3f4cf-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-46l22\" (UID: \"24bf87f0-b1a2-47c8-9800-862e15c3f4cf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-46l22" Dec 05 11:51:03 crc 
kubenswrapper[4763]: I1205 11:51:03.168401 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnvw9\" (UniqueName: \"kubernetes.io/projected/585f1b1b-1d55-4c5d-be08-5af770eec641-kube-api-access-jnvw9\") pod \"marketplace-operator-79b997595-hjvs8\" (UID: \"585f1b1b-1d55-4c5d-be08-5af770eec641\") " pod="openshift-marketplace/marketplace-operator-79b997595-hjvs8" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.168422 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5dkq\" (UniqueName: \"kubernetes.io/projected/f5411b17-490f-40ba-be1d-1ca72c18cdb4-kube-api-access-w5dkq\") pod \"ingress-operator-5b745b69d9-fvw7g\" (UID: \"f5411b17-490f-40ba-be1d-1ca72c18cdb4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fvw7g" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.168443 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/eee26d7c-caba-4663-a3f0-924184123ae2-socket-dir\") pod \"csi-hostpathplugin-5949w\" (UID: \"eee26d7c-caba-4663-a3f0-924184123ae2\") " pod="hostpath-provisioner/csi-hostpathplugin-5949w" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.168486 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/eee26d7c-caba-4663-a3f0-924184123ae2-registration-dir\") pod \"csi-hostpathplugin-5949w\" (UID: \"eee26d7c-caba-4663-a3f0-924184123ae2\") " pod="hostpath-provisioner/csi-hostpathplugin-5949w" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.168553 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvdd9\" (UniqueName: \"kubernetes.io/projected/5fbca11a-128a-4ea3-b5b9-3b9f596b887d-kube-api-access-jvdd9\") pod \"kube-storage-version-migrator-operator-b67b599dd-zssnz\" (UID: \"5fbca11a-128a-4ea3-b5b9-3b9f596b887d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zssnz" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.168573 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/24bf87f0-b1a2-47c8-9800-862e15c3f4cf-proxy-tls\") pod \"machine-config-controller-84d6567774-46l22\" (UID: \"24bf87f0-b1a2-47c8-9800-862e15c3f4cf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-46l22" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.168631 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ca4b304-6857-4cde-a8a6-9300cdb60cba-config-volume\") pod \"dns-default-knjp7\" (UID: \"1ca4b304-6857-4cde-a8a6-9300cdb60cba\") " pod="openshift-dns/dns-default-knjp7" Dec 05 11:51:03 crc kubenswrapper[4763]: E1205 11:51:03.168696 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:03.668674262 +0000 UTC m=+148.161388985 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.168751 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cc131485-2c98-405f-87de-1669634ff801-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qsr7t\" (UID: \"cc131485-2c98-405f-87de-1669634ff801\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsr7t" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.168792 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsffm\" (UniqueName: \"kubernetes.io/projected/cc131485-2c98-405f-87de-1669634ff801-kube-api-access-vsffm\") pod \"machine-config-operator-74547568cd-qsr7t\" (UID: \"cc131485-2c98-405f-87de-1669634ff801\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsr7t" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.168812 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs4tq\" (UniqueName: \"kubernetes.io/projected/10e01a12-35b2-4abc-93b4-3d3ac7ce61ed-kube-api-access-zs4tq\") pod \"package-server-manager-789f6589d5-vzzqs\" (UID: \"10e01a12-35b2-4abc-93b4-3d3ac7ce61ed\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vzzqs" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.168886 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.168907 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aa31a254-af8d-4f9f-b22d-1844d7d60382-ca-trust-extracted\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.168931 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/cea9ce07-369d-4ea5-a2ae-c77eeeaef7da-stats-auth\") pod \"router-default-5444994796-f52fv\" (UID: \"cea9ce07-369d-4ea5-a2ae-c77eeeaef7da\") " pod="openshift-ingress/router-default-5444994796-f52fv" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.168947 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp9vr\" (UniqueName: \"kubernetes.io/projected/1ca4b304-6857-4cde-a8a6-9300cdb60cba-kube-api-access-gp9vr\") pod \"dns-default-knjp7\" (UID: \"1ca4b304-6857-4cde-a8a6-9300cdb60cba\") " pod="openshift-dns/dns-default-knjp7" Dec 05 11:51:03 
crc kubenswrapper[4763]: I1205 11:51:03.168978 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5411b17-490f-40ba-be1d-1ca72c18cdb4-trusted-ca\") pod \"ingress-operator-5b745b69d9-fvw7g\" (UID: \"f5411b17-490f-40ba-be1d-1ca72c18cdb4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fvw7g" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.168993 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71bd551d-eb3c-4c33-89e0-fb957dfa31dc-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-z2mgd\" (UID: \"71bd551d-eb3c-4c33-89e0-fb957dfa31dc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z2mgd" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.169032 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fda8546a-e13c-4450-9faa-a0e0fcacbfa1-secret-volume\") pod \"collect-profiles-29415585-hp78p\" (UID: \"fda8546a-e13c-4450-9faa-a0e0fcacbfa1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415585-hp78p" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.169049 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpvjd\" (UniqueName: \"kubernetes.io/projected/eee26d7c-caba-4663-a3f0-924184123ae2-kube-api-access-dpvjd\") pod \"csi-hostpathplugin-5949w\" (UID: \"eee26d7c-caba-4663-a3f0-924184123ae2\") " pod="hostpath-provisioner/csi-hostpathplugin-5949w" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.169068 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctdlb\" (UniqueName: \"kubernetes.io/projected/65140763-5309-4eb6-a4e7-090b57b27744-kube-api-access-ctdlb\") pod \"cluster-image-registry-operator-dc59b4c8b-ww8jh\" (UID: \"65140763-5309-4eb6-a4e7-090b57b27744\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ww8jh" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.169085 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/67e97ddb-bc83-42bb-abd1-139ef343a4b3-node-bootstrap-token\") pod \"machine-config-server-656fj\" (UID: \"67e97ddb-bc83-42bb-abd1-139ef343a4b3\") " pod="openshift-machine-config-operator/machine-config-server-656fj" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.169130 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/cea9ce07-369d-4ea5-a2ae-c77eeeaef7da-default-certificate\") pod \"router-default-5444994796-f52fv\" (UID: \"cea9ce07-369d-4ea5-a2ae-c77eeeaef7da\") " pod="openshift-ingress/router-default-5444994796-f52fv" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.169162 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcctd\" (UniqueName: \"kubernetes.io/projected/67e97ddb-bc83-42bb-abd1-139ef343a4b3-kube-api-access-kcctd\") pod \"machine-config-server-656fj\" (UID: \"67e97ddb-bc83-42bb-abd1-139ef343a4b3\") " pod="openshift-machine-config-operator/machine-config-server-656fj" Dec 05 11:51:03 crc 
kubenswrapper[4763]: I1205 11:51:03.169178 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1ca4b304-6857-4cde-a8a6-9300cdb60cba-metrics-tls\") pod \"dns-default-knjp7\" (UID: \"1ca4b304-6857-4cde-a8a6-9300cdb60cba\") " pod="openshift-dns/dns-default-knjp7" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.169210 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aa31a254-af8d-4f9f-b22d-1844d7d60382-trusted-ca\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.169251 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/eee26d7c-caba-4663-a3f0-924184123ae2-plugins-dir\") pod \"csi-hostpathplugin-5949w\" (UID: \"eee26d7c-caba-4663-a3f0-924184123ae2\") " pod="hostpath-provisioner/csi-hostpathplugin-5949w" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.169268 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cc131485-2c98-405f-87de-1669634ff801-images\") pod \"machine-config-operator-74547568cd-qsr7t\" (UID: \"cc131485-2c98-405f-87de-1669634ff801\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsr7t" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.169284 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71bd551d-eb3c-4c33-89e0-fb957dfa31dc-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-z2mgd\" (UID: \"71bd551d-eb3c-4c33-89e0-fb957dfa31dc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z2mgd" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.169301 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f37a2281-17b2-486d-aae8-a58378afbfad-srv-cert\") pod \"catalog-operator-68c6474976-vjk5l\" (UID: \"f37a2281-17b2-486d-aae8-a58378afbfad\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vjk5l" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.169319 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aa31a254-af8d-4f9f-b22d-1844d7d60382-registry-certificates\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.169340 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47b0fd6e-8a1d-47ce-b581-4405a98c73d0-serving-cert\") pod \"service-ca-operator-777779d784-rf8kp\" (UID: \"47b0fd6e-8a1d-47ce-b581-4405a98c73d0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rf8kp" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.169357 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/cc131485-2c98-405f-87de-1669634ff801-proxy-tls\") pod \"machine-config-operator-74547568cd-qsr7t\" (UID: \"cc131485-2c98-405f-87de-1669634ff801\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsr7t" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.169409 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/585f1b1b-1d55-4c5d-be08-5af770eec641-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hjvs8\" (UID: \"585f1b1b-1d55-4c5d-be08-5af770eec641\") " pod="openshift-marketplace/marketplace-operator-79b997595-hjvs8" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.169443 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj7dq\" (UniqueName: \"kubernetes.io/projected/349a6e10-88df-46fa-b81d-439574540d28-kube-api-access-sj7dq\") pod \"packageserver-d55dfcdfc-wkvgk\" (UID: \"349a6e10-88df-46fa-b81d-439574540d28\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wkvgk" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.169479 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/eee26d7c-caba-4663-a3f0-924184123ae2-mountpoint-dir\") pod \"csi-hostpathplugin-5949w\" (UID: \"eee26d7c-caba-4663-a3f0-924184123ae2\") " pod="hostpath-provisioner/csi-hostpathplugin-5949w" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.169496 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k94f\" (UniqueName: \"kubernetes.io/projected/fa5d0f88-d0b4-4a0b-9574-81095f7ea8c0-kube-api-access-7k94f\") pod \"olm-operator-6b444d44fb-z52fw\" (UID: \"fa5d0f88-d0b4-4a0b-9574-81095f7ea8c0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z52fw" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.169523 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5a298178-bf52-4673-a49e-5867e4bc8267-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7gb2n\" (UID: \"5a298178-bf52-4673-a49e-5867e4bc8267\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7gb2n" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.169562 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6mz8\" (UniqueName: \"kubernetes.io/projected/be6eac77-1391-4e8a-80b6-50b345cbe132-kube-api-access-x6mz8\") pod \"cluster-samples-operator-665b6dd947-pbkjb\" (UID: \"be6eac77-1391-4e8a-80b6-50b345cbe132\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pbkjb" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.169596 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aa31a254-af8d-4f9f-b22d-1844d7d60382-registry-tls\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.169613 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8pbm\" (UniqueName: 
\"kubernetes.io/projected/4f0faea4-5f01-435b-99f5-2c8bd064c6f7-kube-api-access-c8pbm\") pod \"migrator-59844c95c7-4lwtd\" (UID: \"4f0faea4-5f01-435b-99f5-2c8bd064c6f7\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4lwtd" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.169661 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/349a6e10-88df-46fa-b81d-439574540d28-webhook-cert\") pod \"packageserver-d55dfcdfc-wkvgk\" (UID: \"349a6e10-88df-46fa-b81d-439574540d28\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wkvgk" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.169676 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47b0fd6e-8a1d-47ce-b581-4405a98c73d0-config\") pod \"service-ca-operator-777779d784-rf8kp\" (UID: \"47b0fd6e-8a1d-47ce-b581-4405a98c73d0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rf8kp" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.169694 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f08e9226-6ec5-4854-9780-0b5e2d8a7ded-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-7wmpg\" (UID: \"f08e9226-6ec5-4854-9780-0b5e2d8a7ded\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wmpg" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.169724 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/65140763-5309-4eb6-a4e7-090b57b27744-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-ww8jh\" (UID: \"65140763-5309-4eb6-a4e7-090b57b27744\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ww8jh" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.169742 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/10e01a12-35b2-4abc-93b4-3d3ac7ce61ed-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-vzzqs\" (UID: \"10e01a12-35b2-4abc-93b4-3d3ac7ce61ed\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vzzqs" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.169783 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f5411b17-490f-40ba-be1d-1ca72c18cdb4-bound-sa-token\") pod \"ingress-operator-5b745b69d9-fvw7g\" (UID: \"f5411b17-490f-40ba-be1d-1ca72c18cdb4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fvw7g" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.169802 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p4jp\" (UniqueName: \"kubernetes.io/projected/aa31a254-af8d-4f9f-b22d-1844d7d60382-kube-api-access-6p4jp\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.169847 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78139543-ef47-4ea0-bb35-9fe2bc3f99a4-cert\") pod \"ingress-canary-7dzw7\" (UID: \"78139543-ef47-4ea0-bb35-9fe2bc3f99a4\") " pod="openshift-ingress-canary/ingress-canary-7dzw7" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.169866 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/be6eac77-1391-4e8a-80b6-50b345cbe132-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-pbkjb\" (UID: \"be6eac77-1391-4e8a-80b6-50b345cbe132\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pbkjb" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.169886 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71bd551d-eb3c-4c33-89e0-fb957dfa31dc-config\") pod \"kube-apiserver-operator-766d6c64bb-z2mgd\" (UID: \"71bd551d-eb3c-4c33-89e0-fb957dfa31dc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z2mgd" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.169944 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f37a2281-17b2-486d-aae8-a58378afbfad-profile-collector-cert\") pod \"catalog-operator-68c6474976-vjk5l\" (UID: \"f37a2281-17b2-486d-aae8-a58378afbfad\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vjk5l" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.169983 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twdv4\" (UniqueName: \"kubernetes.io/projected/47b0fd6e-8a1d-47ce-b581-4405a98c73d0-kube-api-access-twdv4\") pod \"service-ca-operator-777779d784-rf8kp\" (UID: \"47b0fd6e-8a1d-47ce-b581-4405a98c73d0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rf8kp" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.170009 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7mzc\" (UniqueName: \"kubernetes.io/projected/78139543-ef47-4ea0-bb35-9fe2bc3f99a4-kube-api-access-r7mzc\") pod \"ingress-canary-7dzw7\" (UID: \"78139543-ef47-4ea0-bb35-9fe2bc3f99a4\") " pod="openshift-ingress-canary/ingress-canary-7dzw7" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.170137 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aa31a254-af8d-4f9f-b22d-1844d7d60382-installation-pull-secrets\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.170166 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5fbca11a-128a-4ea3-b5b9-3b9f596b887d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zssnz\" (UID: \"5fbca11a-128a-4ea3-b5b9-3b9f596b887d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zssnz" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.170193 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/349a6e10-88df-46fa-b81d-439574540d28-apiservice-cert\") pod \"packageserver-d55dfcdfc-wkvgk\" (UID: \"349a6e10-88df-46fa-b81d-439574540d28\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wkvgk" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.170236 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jdtq\" (UniqueName: \"kubernetes.io/projected/8fdad3ac-2317-4b90-a9ba-cd28cf492c96-kube-api-access-6jdtq\") pod \"service-ca-9c57cc56f-nwf6f\" (UID: \"8fdad3ac-2317-4b90-a9ba-cd28cf492c96\") " pod="openshift-service-ca/service-ca-9c57cc56f-nwf6f" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.170331 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aa31a254-af8d-4f9f-b22d-1844d7d60382-bound-sa-token\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.170357 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fa5d0f88-d0b4-4a0b-9574-81095f7ea8c0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-z52fw\" (UID: \"fa5d0f88-d0b4-4a0b-9574-81095f7ea8c0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z52fw" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.170434 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9sq9\" (UniqueName: \"kubernetes.io/projected/f08e9226-6ec5-4854-9780-0b5e2d8a7ded-kube-api-access-r9sq9\") pod \"control-plane-machine-set-operator-78cbb6b69f-7wmpg\" (UID: \"f08e9226-6ec5-4854-9780-0b5e2d8a7ded\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wmpg" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.170463 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2sn4\" (UniqueName: \"kubernetes.io/projected/fda8546a-e13c-4450-9faa-a0e0fcacbfa1-kube-api-access-x2sn4\") pod \"collect-profiles-29415585-hp78p\" (UID: \"fda8546a-e13c-4450-9faa-a0e0fcacbfa1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415585-hp78p" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.170486 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8fdad3ac-2317-4b90-a9ba-cd28cf492c96-signing-cabundle\") pod \"service-ca-9c57cc56f-nwf6f\" (UID: \"8fdad3ac-2317-4b90-a9ba-cd28cf492c96\") " pod="openshift-service-ca/service-ca-9c57cc56f-nwf6f" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.170509 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npxc7\" (UniqueName: \"kubernetes.io/projected/f37a2281-17b2-486d-aae8-a58378afbfad-kube-api-access-npxc7\") pod \"catalog-operator-68c6474976-vjk5l\" (UID: \"f37a2281-17b2-486d-aae8-a58378afbfad\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vjk5l" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.170533 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnrs5\" (UniqueName: \"kubernetes.io/projected/cea9ce07-369d-4ea5-a2ae-c77eeeaef7da-kube-api-access-lnrs5\") pod \"router-default-5444994796-f52fv\" (UID: \"cea9ce07-369d-4ea5-a2ae-c77eeeaef7da\") " pod="openshift-ingress/router-default-5444994796-f52fv" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.170570 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/585f1b1b-1d55-4c5d-be08-5af770eec641-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hjvs8\" (UID: \"585f1b1b-1d55-4c5d-be08-5af770eec641\") " pod="openshift-marketplace/marketplace-operator-79b997595-hjvs8" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.170597 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cea9ce07-369d-4ea5-a2ae-c77eeeaef7da-service-ca-bundle\") pod \"router-default-5444994796-f52fv\" (UID: \"cea9ce07-369d-4ea5-a2ae-c77eeeaef7da\") " pod="openshift-ingress/router-default-5444994796-f52fv" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.170622 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/eee26d7c-caba-4663-a3f0-924184123ae2-csi-data-dir\") pod \"csi-hostpathplugin-5949w\" (UID: \"eee26d7c-caba-4663-a3f0-924184123ae2\") " pod="hostpath-provisioner/csi-hostpathplugin-5949w" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.170644 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fa5d0f88-d0b4-4a0b-9574-81095f7ea8c0-srv-cert\") pod \"olm-operator-6b444d44fb-z52fw\" (UID: \"fa5d0f88-d0b4-4a0b-9574-81095f7ea8c0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z52fw" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.170683 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jm59\" (UniqueName: \"kubernetes.io/projected/5a298178-bf52-4673-a49e-5867e4bc8267-kube-api-access-7jm59\") pod \"multus-admission-controller-857f4d67dd-7gb2n\" (UID: \"5a298178-bf52-4673-a49e-5867e4bc8267\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7gb2n" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.170723 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/67e97ddb-bc83-42bb-abd1-139ef343a4b3-certs\") pod \"machine-config-server-656fj\" (UID: \"67e97ddb-bc83-42bb-abd1-139ef343a4b3\") " pod="openshift-machine-config-operator/machine-config-server-656fj" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.170846 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w4vn\" (UniqueName: \"kubernetes.io/projected/a5c4e5af-4c88-4770-b4a7-3eeded875431-kube-api-access-8w4vn\") pod \"downloads-7954f5f757-hv45j\" (UID: \"a5c4e5af-4c88-4770-b4a7-3eeded875431\") " pod="openshift-console/downloads-7954f5f757-hv45j" Dec 05 11:51:03 crc kubenswrapper[4763]: E1205 11:51:03.171111 4763 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 11:51:03.671091509 +0000 UTC m=+148.163806232 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.171942 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aa31a254-af8d-4f9f-b22d-1844d7d60382-trusted-ca\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:03 crc kubenswrapper[4763]: W1205 11:51:03.171957 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod641223ac_c0c5_43cd_83cf_feaef76d52e6.slice/crio-daf54716488e0ce768ae2e91ef9d6e310ba9c061514e7bdd0f087297b17012e7 WatchSource:0}: Error finding container daf54716488e0ce768ae2e91ef9d6e310ba9c061514e7bdd0f087297b17012e7: Status 404 returned error can't find the container with id daf54716488e0ce768ae2e91ef9d6e310ba9c061514e7bdd0f087297b17012e7 Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.175587 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aa31a254-af8d-4f9f-b22d-1844d7d60382-registry-certificates\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.181019 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aa31a254-af8d-4f9f-b22d-1844d7d60382-ca-trust-extracted\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.182228 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aa31a254-af8d-4f9f-b22d-1844d7d60382-installation-pull-secrets\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.188280 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aa31a254-af8d-4f9f-b22d-1844d7d60382-registry-tls\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.226101 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p4jp\" (UniqueName: \"kubernetes.io/projected/aa31a254-af8d-4f9f-b22d-1844d7d60382-kube-api-access-6p4jp\") pod 
\"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.232878 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aa31a254-af8d-4f9f-b22d-1844d7d60382-bound-sa-token\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.290398 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 11:51:03 crc kubenswrapper[4763]: E1205 11:51:03.291636 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:03.791614069 +0000 UTC m=+148.284328802 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.294831 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/349a6e10-88df-46fa-b81d-439574540d28-webhook-cert\") pod \"packageserver-d55dfcdfc-wkvgk\" (UID: \"349a6e10-88df-46fa-b81d-439574540d28\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wkvgk" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.295008 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47b0fd6e-8a1d-47ce-b581-4405a98c73d0-config\") pod \"service-ca-operator-777779d784-rf8kp\" (UID: \"47b0fd6e-8a1d-47ce-b581-4405a98c73d0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rf8kp" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.295105 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f08e9226-6ec5-4854-9780-0b5e2d8a7ded-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-7wmpg\" (UID: \"f08e9226-6ec5-4854-9780-0b5e2d8a7ded\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wmpg" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.295219 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/10e01a12-35b2-4abc-93b4-3d3ac7ce61ed-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-vzzqs\" (UID: \"10e01a12-35b2-4abc-93b4-3d3ac7ce61ed\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vzzqs" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.295311 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f5411b17-490f-40ba-be1d-1ca72c18cdb4-bound-sa-token\") pod \"ingress-operator-5b745b69d9-fvw7g\" (UID: \"f5411b17-490f-40ba-be1d-1ca72c18cdb4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fvw7g" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.295419 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/65140763-5309-4eb6-a4e7-090b57b27744-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-ww8jh\" (UID: \"65140763-5309-4eb6-a4e7-090b57b27744\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ww8jh" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.295535 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78139543-ef47-4ea0-bb35-9fe2bc3f99a4-cert\") pod \"ingress-canary-7dzw7\" (UID: \"78139543-ef47-4ea0-bb35-9fe2bc3f99a4\") " pod="openshift-ingress-canary/ingress-canary-7dzw7" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.295631 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71bd551d-eb3c-4c33-89e0-fb957dfa31dc-config\") pod \"kube-apiserver-operator-766d6c64bb-z2mgd\" (UID: \"71bd551d-eb3c-4c33-89e0-fb957dfa31dc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z2mgd" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.295729 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/be6eac77-1391-4e8a-80b6-50b345cbe132-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-pbkjb\" (UID: \"be6eac77-1391-4e8a-80b6-50b345cbe132\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pbkjb" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.295980 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f37a2281-17b2-486d-aae8-a58378afbfad-profile-collector-cert\") pod \"catalog-operator-68c6474976-vjk5l\" (UID: \"f37a2281-17b2-486d-aae8-a58378afbfad\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vjk5l" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.296100 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7mzc\" (UniqueName: \"kubernetes.io/projected/78139543-ef47-4ea0-bb35-9fe2bc3f99a4-kube-api-access-r7mzc\") pod \"ingress-canary-7dzw7\" (UID: \"78139543-ef47-4ea0-bb35-9fe2bc3f99a4\") " pod="openshift-ingress-canary/ingress-canary-7dzw7" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.296201 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twdv4\" (UniqueName: \"kubernetes.io/projected/47b0fd6e-8a1d-47ce-b581-4405a98c73d0-kube-api-access-twdv4\") pod \"service-ca-operator-777779d784-rf8kp\" (UID: \"47b0fd6e-8a1d-47ce-b581-4405a98c73d0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rf8kp" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.296299 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5fbca11a-128a-4ea3-b5b9-3b9f596b887d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zssnz\" (UID: \"5fbca11a-128a-4ea3-b5b9-3b9f596b887d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zssnz" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.296383 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/349a6e10-88df-46fa-b81d-439574540d28-apiservice-cert\") pod \"packageserver-d55dfcdfc-wkvgk\" (UID: \"349a6e10-88df-46fa-b81d-439574540d28\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wkvgk" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.298094 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jdtq\" (UniqueName: \"kubernetes.io/projected/8fdad3ac-2317-4b90-a9ba-cd28cf492c96-kube-api-access-6jdtq\") pod \"service-ca-9c57cc56f-nwf6f\" (UID: \"8fdad3ac-2317-4b90-a9ba-cd28cf492c96\") " pod="openshift-service-ca/service-ca-9c57cc56f-nwf6f" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.300638 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fa5d0f88-d0b4-4a0b-9574-81095f7ea8c0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-z52fw\" (UID: \"fa5d0f88-d0b4-4a0b-9574-81095f7ea8c0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z52fw" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.300791 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9sq9\" (UniqueName: \"kubernetes.io/projected/f08e9226-6ec5-4854-9780-0b5e2d8a7ded-kube-api-access-r9sq9\") pod \"control-plane-machine-set-operator-78cbb6b69f-7wmpg\" (UID: \"f08e9226-6ec5-4854-9780-0b5e2d8a7ded\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wmpg" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.300885 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8fdad3ac-2317-4b90-a9ba-cd28cf492c96-signing-cabundle\") pod \"service-ca-9c57cc56f-nwf6f\" (UID: \"8fdad3ac-2317-4b90-a9ba-cd28cf492c96\") " pod="openshift-service-ca/service-ca-9c57cc56f-nwf6f" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.301068 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npxc7\" (UniqueName: \"kubernetes.io/projected/f37a2281-17b2-486d-aae8-a58378afbfad-kube-api-access-npxc7\") pod \"catalog-operator-68c6474976-vjk5l\" (UID: \"f37a2281-17b2-486d-aae8-a58378afbfad\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vjk5l" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.301170 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnrs5\" (UniqueName: \"kubernetes.io/projected/cea9ce07-369d-4ea5-a2ae-c77eeeaef7da-kube-api-access-lnrs5\") pod \"router-default-5444994796-f52fv\" (UID: \"cea9ce07-369d-4ea5-a2ae-c77eeeaef7da\") " pod="openshift-ingress/router-default-5444994796-f52fv" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.301279 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2sn4\" (UniqueName: 
\"kubernetes.io/projected/fda8546a-e13c-4450-9faa-a0e0fcacbfa1-kube-api-access-x2sn4\") pod \"collect-profiles-29415585-hp78p\" (UID: \"fda8546a-e13c-4450-9faa-a0e0fcacbfa1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415585-hp78p" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.301380 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/585f1b1b-1d55-4c5d-be08-5af770eec641-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hjvs8\" (UID: \"585f1b1b-1d55-4c5d-be08-5af770eec641\") " pod="openshift-marketplace/marketplace-operator-79b997595-hjvs8" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.301473 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cea9ce07-369d-4ea5-a2ae-c77eeeaef7da-service-ca-bundle\") pod \"router-default-5444994796-f52fv\" (UID: \"cea9ce07-369d-4ea5-a2ae-c77eeeaef7da\") " pod="openshift-ingress/router-default-5444994796-f52fv" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.301586 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/eee26d7c-caba-4663-a3f0-924184123ae2-csi-data-dir\") pod \"csi-hostpathplugin-5949w\" (UID: \"eee26d7c-caba-4663-a3f0-924184123ae2\") " pod="hostpath-provisioner/csi-hostpathplugin-5949w" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.301679 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fa5d0f88-d0b4-4a0b-9574-81095f7ea8c0-srv-cert\") pod \"olm-operator-6b444d44fb-z52fw\" (UID: \"fa5d0f88-d0b4-4a0b-9574-81095f7ea8c0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z52fw" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.301783 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jm59\" (UniqueName: \"kubernetes.io/projected/5a298178-bf52-4673-a49e-5867e4bc8267-kube-api-access-7jm59\") pod \"multus-admission-controller-857f4d67dd-7gb2n\" (UID: \"5a298178-bf52-4673-a49e-5867e4bc8267\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7gb2n" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.301872 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/67e97ddb-bc83-42bb-abd1-139ef343a4b3-certs\") pod \"machine-config-server-656fj\" (UID: \"67e97ddb-bc83-42bb-abd1-139ef343a4b3\") " pod="openshift-machine-config-operator/machine-config-server-656fj" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.301956 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w4vn\" (UniqueName: \"kubernetes.io/projected/a5c4e5af-4c88-4770-b4a7-3eeded875431-kube-api-access-8w4vn\") pod \"downloads-7954f5f757-hv45j\" (UID: \"a5c4e5af-4c88-4770-b4a7-3eeded875431\") " pod="openshift-console/downloads-7954f5f757-hv45j" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.302093 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cea9ce07-369d-4ea5-a2ae-c77eeeaef7da-metrics-certs\") pod \"router-default-5444994796-f52fv\" (UID: \"cea9ce07-369d-4ea5-a2ae-c77eeeaef7da\") " pod="openshift-ingress/router-default-5444994796-f52fv" Dec 05 11:51:03 crc 
kubenswrapper[4763]: I1205 11:51:03.302185 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8fdad3ac-2317-4b90-a9ba-cd28cf492c96-signing-cabundle\") pod \"service-ca-9c57cc56f-nwf6f\" (UID: \"8fdad3ac-2317-4b90-a9ba-cd28cf492c96\") " pod="openshift-service-ca/service-ca-9c57cc56f-nwf6f" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.302199 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fda8546a-e13c-4450-9faa-a0e0fcacbfa1-config-volume\") pod \"collect-profiles-29415585-hp78p\" (UID: \"fda8546a-e13c-4450-9faa-a0e0fcacbfa1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415585-hp78p" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.302290 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/65140763-5309-4eb6-a4e7-090b57b27744-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-ww8jh\" (UID: \"65140763-5309-4eb6-a4e7-090b57b27744\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ww8jh" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.302326 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt24n\" (UniqueName: \"kubernetes.io/projected/24bf87f0-b1a2-47c8-9800-862e15c3f4cf-kube-api-access-xt24n\") pod \"machine-config-controller-84d6567774-46l22\" (UID: \"24bf87f0-b1a2-47c8-9800-862e15c3f4cf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-46l22" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.302357 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fbca11a-128a-4ea3-b5b9-3b9f596b887d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zssnz\" (UID: \"5fbca11a-128a-4ea3-b5b9-3b9f596b887d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zssnz" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.302380 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/349a6e10-88df-46fa-b81d-439574540d28-tmpfs\") pod \"packageserver-d55dfcdfc-wkvgk\" (UID: \"349a6e10-88df-46fa-b81d-439574540d28\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wkvgk" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.302408 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5411b17-490f-40ba-be1d-1ca72c18cdb4-metrics-tls\") pod \"ingress-operator-5b745b69d9-fvw7g\" (UID: \"f5411b17-490f-40ba-be1d-1ca72c18cdb4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fvw7g" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.302430 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8fdad3ac-2317-4b90-a9ba-cd28cf492c96-signing-key\") pod \"service-ca-9c57cc56f-nwf6f\" (UID: \"8fdad3ac-2317-4b90-a9ba-cd28cf492c96\") " pod="openshift-service-ca/service-ca-9c57cc56f-nwf6f" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.302452 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/65140763-5309-4eb6-a4e7-090b57b27744-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-ww8jh\" (UID: \"65140763-5309-4eb6-a4e7-090b57b27744\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ww8jh" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.302501 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnvw9\" (UniqueName: \"kubernetes.io/projected/585f1b1b-1d55-4c5d-be08-5af770eec641-kube-api-access-jnvw9\") pod \"marketplace-operator-79b997595-hjvs8\" (UID: \"585f1b1b-1d55-4c5d-be08-5af770eec641\") " pod="openshift-marketplace/marketplace-operator-79b997595-hjvs8" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.302524 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/24bf87f0-b1a2-47c8-9800-862e15c3f4cf-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-46l22\" (UID: \"24bf87f0-b1a2-47c8-9800-862e15c3f4cf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-46l22" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.302548 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5dkq\" (UniqueName: \"kubernetes.io/projected/f5411b17-490f-40ba-be1d-1ca72c18cdb4-kube-api-access-w5dkq\") pod \"ingress-operator-5b745b69d9-fvw7g\" (UID: \"f5411b17-490f-40ba-be1d-1ca72c18cdb4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fvw7g" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.302571 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/eee26d7c-caba-4663-a3f0-924184123ae2-socket-dir\") pod \"csi-hostpathplugin-5949w\" (UID: \"eee26d7c-caba-4663-a3f0-924184123ae2\") " pod="hostpath-provisioner/csi-hostpathplugin-5949w" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.302602 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/eee26d7c-caba-4663-a3f0-924184123ae2-registration-dir\") pod \"csi-hostpathplugin-5949w\" (UID: \"eee26d7c-caba-4663-a3f0-924184123ae2\") " pod="hostpath-provisioner/csi-hostpathplugin-5949w" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.302629 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvdd9\" (UniqueName: \"kubernetes.io/projected/5fbca11a-128a-4ea3-b5b9-3b9f596b887d-kube-api-access-jvdd9\") pod \"kube-storage-version-migrator-operator-b67b599dd-zssnz\" (UID: \"5fbca11a-128a-4ea3-b5b9-3b9f596b887d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zssnz" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.302651 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/24bf87f0-b1a2-47c8-9800-862e15c3f4cf-proxy-tls\") pod \"machine-config-controller-84d6567774-46l22\" (UID: \"24bf87f0-b1a2-47c8-9800-862e15c3f4cf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-46l22" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.302682 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ca4b304-6857-4cde-a8a6-9300cdb60cba-config-volume\") pod 
\"dns-default-knjp7\" (UID: \"1ca4b304-6857-4cde-a8a6-9300cdb60cba\") " pod="openshift-dns/dns-default-knjp7" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.302719 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cc131485-2c98-405f-87de-1669634ff801-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qsr7t\" (UID: \"cc131485-2c98-405f-87de-1669634ff801\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsr7t" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.302743 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsffm\" (UniqueName: \"kubernetes.io/projected/cc131485-2c98-405f-87de-1669634ff801-kube-api-access-vsffm\") pod \"machine-config-operator-74547568cd-qsr7t\" (UID: \"cc131485-2c98-405f-87de-1669634ff801\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsr7t" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.302785 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs4tq\" (UniqueName: \"kubernetes.io/projected/10e01a12-35b2-4abc-93b4-3d3ac7ce61ed-kube-api-access-zs4tq\") pod \"package-server-manager-789f6589d5-vzzqs\" (UID: \"10e01a12-35b2-4abc-93b4-3d3ac7ce61ed\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vzzqs" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.302837 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.302869 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/cea9ce07-369d-4ea5-a2ae-c77eeeaef7da-stats-auth\") pod \"router-default-5444994796-f52fv\" (UID: \"cea9ce07-369d-4ea5-a2ae-c77eeeaef7da\") " pod="openshift-ingress/router-default-5444994796-f52fv" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.302894 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gp9vr\" (UniqueName: \"kubernetes.io/projected/1ca4b304-6857-4cde-a8a6-9300cdb60cba-kube-api-access-gp9vr\") pod \"dns-default-knjp7\" (UID: \"1ca4b304-6857-4cde-a8a6-9300cdb60cba\") " pod="openshift-dns/dns-default-knjp7" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.302923 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5411b17-490f-40ba-be1d-1ca72c18cdb4-trusted-ca\") pod \"ingress-operator-5b745b69d9-fvw7g\" (UID: \"f5411b17-490f-40ba-be1d-1ca72c18cdb4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fvw7g" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.303364 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/349a6e10-88df-46fa-b81d-439574540d28-webhook-cert\") pod \"packageserver-d55dfcdfc-wkvgk\" (UID: \"349a6e10-88df-46fa-b81d-439574540d28\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wkvgk" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 
11:51:03.305040 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/65140763-5309-4eb6-a4e7-090b57b27744-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-ww8jh\" (UID: \"65140763-5309-4eb6-a4e7-090b57b27744\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ww8jh" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.305872 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47b0fd6e-8a1d-47ce-b581-4405a98c73d0-config\") pod \"service-ca-operator-777779d784-rf8kp\" (UID: \"47b0fd6e-8a1d-47ce-b581-4405a98c73d0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rf8kp" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.306111 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/eee26d7c-caba-4663-a3f0-924184123ae2-csi-data-dir\") pod \"csi-hostpathplugin-5949w\" (UID: \"eee26d7c-caba-4663-a3f0-924184123ae2\") " pod="hostpath-provisioner/csi-hostpathplugin-5949w" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.299020 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71bd551d-eb3c-4c33-89e0-fb957dfa31dc-config\") pod \"kube-apiserver-operator-766d6c64bb-z2mgd\" (UID: \"71bd551d-eb3c-4c33-89e0-fb957dfa31dc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z2mgd" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.307281 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/be6eac77-1391-4e8a-80b6-50b345cbe132-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-pbkjb\" (UID: \"be6eac77-1391-4e8a-80b6-50b345cbe132\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pbkjb" Dec 05 11:51:03 crc kubenswrapper[4763]: E1205 11:51:03.309104 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 11:51:03.8090845 +0000 UTC m=+148.301799223 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.310050 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/eee26d7c-caba-4663-a3f0-924184123ae2-socket-dir\") pod \"csi-hostpathplugin-5949w\" (UID: \"eee26d7c-caba-4663-a3f0-924184123ae2\") " pod="hostpath-provisioner/csi-hostpathplugin-5949w" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.310805 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/eee26d7c-caba-4663-a3f0-924184123ae2-registration-dir\") pod \"csi-hostpathplugin-5949w\" (UID: \"eee26d7c-caba-4663-a3f0-924184123ae2\") " pod="hostpath-provisioner/csi-hostpathplugin-5949w" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.311448 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cc131485-2c98-405f-87de-1669634ff801-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qsr7t\" (UID: \"cc131485-2c98-405f-87de-1669634ff801\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsr7t" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.312295 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ca4b304-6857-4cde-a8a6-9300cdb60cba-config-volume\") pod \"dns-default-knjp7\" (UID: \"1ca4b304-6857-4cde-a8a6-9300cdb60cba\") " pod="openshift-dns/dns-default-knjp7" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.313547 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71bd551d-eb3c-4c33-89e0-fb957dfa31dc-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-z2mgd\" (UID: \"71bd551d-eb3c-4c33-89e0-fb957dfa31dc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z2mgd" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.314032 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cea9ce07-369d-4ea5-a2ae-c77eeeaef7da-service-ca-bundle\") pod \"router-default-5444994796-f52fv\" (UID: \"cea9ce07-369d-4ea5-a2ae-c77eeeaef7da\") " pod="openshift-ingress/router-default-5444994796-f52fv" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.314375 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpvjd\" (UniqueName: \"kubernetes.io/projected/eee26d7c-caba-4663-a3f0-924184123ae2-kube-api-access-dpvjd\") pod \"csi-hostpathplugin-5949w\" (UID: \"eee26d7c-caba-4663-a3f0-924184123ae2\") " pod="hostpath-provisioner/csi-hostpathplugin-5949w" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.314412 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctdlb\" (UniqueName: \"kubernetes.io/projected/65140763-5309-4eb6-a4e7-090b57b27744-kube-api-access-ctdlb\") pod 
\"cluster-image-registry-operator-dc59b4c8b-ww8jh\" (UID: \"65140763-5309-4eb6-a4e7-090b57b27744\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ww8jh" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.314438 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/67e97ddb-bc83-42bb-abd1-139ef343a4b3-node-bootstrap-token\") pod \"machine-config-server-656fj\" (UID: \"67e97ddb-bc83-42bb-abd1-139ef343a4b3\") " pod="openshift-machine-config-operator/machine-config-server-656fj" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.314438 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fda8546a-e13c-4450-9faa-a0e0fcacbfa1-config-volume\") pod \"collect-profiles-29415585-hp78p\" (UID: \"fda8546a-e13c-4450-9faa-a0e0fcacbfa1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415585-hp78p" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.314464 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fda8546a-e13c-4450-9faa-a0e0fcacbfa1-secret-volume\") pod \"collect-profiles-29415585-hp78p\" (UID: \"fda8546a-e13c-4450-9faa-a0e0fcacbfa1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415585-hp78p" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.314503 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/cea9ce07-369d-4ea5-a2ae-c77eeeaef7da-default-certificate\") pod \"router-default-5444994796-f52fv\" (UID: \"cea9ce07-369d-4ea5-a2ae-c77eeeaef7da\") " pod="openshift-ingress/router-default-5444994796-f52fv" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.314532 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcctd\" (UniqueName: \"kubernetes.io/projected/67e97ddb-bc83-42bb-abd1-139ef343a4b3-kube-api-access-kcctd\") pod \"machine-config-server-656fj\" (UID: \"67e97ddb-bc83-42bb-abd1-139ef343a4b3\") " pod="openshift-machine-config-operator/machine-config-server-656fj" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.314560 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1ca4b304-6857-4cde-a8a6-9300cdb60cba-metrics-tls\") pod \"dns-default-knjp7\" (UID: \"1ca4b304-6857-4cde-a8a6-9300cdb60cba\") " pod="openshift-dns/dns-default-knjp7" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.314589 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/eee26d7c-caba-4663-a3f0-924184123ae2-plugins-dir\") pod \"csi-hostpathplugin-5949w\" (UID: \"eee26d7c-caba-4663-a3f0-924184123ae2\") " pod="hostpath-provisioner/csi-hostpathplugin-5949w" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.314611 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cc131485-2c98-405f-87de-1669634ff801-images\") pod \"machine-config-operator-74547568cd-qsr7t\" (UID: \"cc131485-2c98-405f-87de-1669634ff801\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsr7t" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.314639 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f37a2281-17b2-486d-aae8-a58378afbfad-srv-cert\") pod \"catalog-operator-68c6474976-vjk5l\" (UID: \"f37a2281-17b2-486d-aae8-a58378afbfad\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vjk5l" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.314660 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71bd551d-eb3c-4c33-89e0-fb957dfa31dc-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-z2mgd\" (UID: \"71bd551d-eb3c-4c33-89e0-fb957dfa31dc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z2mgd" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.314703 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47b0fd6e-8a1d-47ce-b581-4405a98c73d0-serving-cert\") pod \"service-ca-operator-777779d784-rf8kp\" (UID: \"47b0fd6e-8a1d-47ce-b581-4405a98c73d0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rf8kp" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.314781 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cc131485-2c98-405f-87de-1669634ff801-proxy-tls\") pod \"machine-config-operator-74547568cd-qsr7t\" (UID: \"cc131485-2c98-405f-87de-1669634ff801\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsr7t" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.314817 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/585f1b1b-1d55-4c5d-be08-5af770eec641-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hjvs8\" (UID: \"585f1b1b-1d55-4c5d-be08-5af770eec641\") " pod="openshift-marketplace/marketplace-operator-79b997595-hjvs8" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.314838 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5fbca11a-128a-4ea3-b5b9-3b9f596b887d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zssnz\" (UID: \"5fbca11a-128a-4ea3-b5b9-3b9f596b887d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zssnz" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.314845 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj7dq\" (UniqueName: \"kubernetes.io/projected/349a6e10-88df-46fa-b81d-439574540d28-kube-api-access-sj7dq\") pod \"packageserver-d55dfcdfc-wkvgk\" (UID: \"349a6e10-88df-46fa-b81d-439574540d28\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wkvgk" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.314945 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/eee26d7c-caba-4663-a3f0-924184123ae2-mountpoint-dir\") pod \"csi-hostpathplugin-5949w\" (UID: \"eee26d7c-caba-4663-a3f0-924184123ae2\") " pod="hostpath-provisioner/csi-hostpathplugin-5949w" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.314978 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k94f\" (UniqueName: 
\"kubernetes.io/projected/fa5d0f88-d0b4-4a0b-9574-81095f7ea8c0-kube-api-access-7k94f\") pod \"olm-operator-6b444d44fb-z52fw\" (UID: \"fa5d0f88-d0b4-4a0b-9574-81095f7ea8c0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z52fw" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.315023 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5a298178-bf52-4673-a49e-5867e4bc8267-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7gb2n\" (UID: \"5a298178-bf52-4673-a49e-5867e4bc8267\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7gb2n" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.315048 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6mz8\" (UniqueName: \"kubernetes.io/projected/be6eac77-1391-4e8a-80b6-50b345cbe132-kube-api-access-x6mz8\") pod \"cluster-samples-operator-665b6dd947-pbkjb\" (UID: \"be6eac77-1391-4e8a-80b6-50b345cbe132\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pbkjb" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.315081 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8pbm\" (UniqueName: \"kubernetes.io/projected/4f0faea4-5f01-435b-99f5-2c8bd064c6f7-kube-api-access-c8pbm\") pod \"migrator-59844c95c7-4lwtd\" (UID: \"4f0faea4-5f01-435b-99f5-2c8bd064c6f7\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4lwtd" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.315321 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/eee26d7c-caba-4663-a3f0-924184123ae2-mountpoint-dir\") pod \"csi-hostpathplugin-5949w\" (UID: \"eee26d7c-caba-4663-a3f0-924184123ae2\") " pod="hostpath-provisioner/csi-hostpathplugin-5949w" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.316442 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cc131485-2c98-405f-87de-1669634ff801-images\") pod \"machine-config-operator-74547568cd-qsr7t\" (UID: \"cc131485-2c98-405f-87de-1669634ff801\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsr7t" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.317894 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fa5d0f88-d0b4-4a0b-9574-81095f7ea8c0-srv-cert\") pod \"olm-operator-6b444d44fb-z52fw\" (UID: \"fa5d0f88-d0b4-4a0b-9574-81095f7ea8c0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z52fw" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.318279 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fa5d0f88-d0b4-4a0b-9574-81095f7ea8c0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-z52fw\" (UID: \"fa5d0f88-d0b4-4a0b-9574-81095f7ea8c0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z52fw" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.318342 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/eee26d7c-caba-4663-a3f0-924184123ae2-plugins-dir\") pod \"csi-hostpathplugin-5949w\" (UID: \"eee26d7c-caba-4663-a3f0-924184123ae2\") " pod="hostpath-provisioner/csi-hostpathplugin-5949w" Dec 05 
11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.318705 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/349a6e10-88df-46fa-b81d-439574540d28-tmpfs\") pod \"packageserver-d55dfcdfc-wkvgk\" (UID: \"349a6e10-88df-46fa-b81d-439574540d28\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wkvgk" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.319161 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/24bf87f0-b1a2-47c8-9800-862e15c3f4cf-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-46l22\" (UID: \"24bf87f0-b1a2-47c8-9800-862e15c3f4cf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-46l22" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.324703 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f08e9226-6ec5-4854-9780-0b5e2d8a7ded-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-7wmpg\" (UID: \"f08e9226-6ec5-4854-9780-0b5e2d8a7ded\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wmpg" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.325067 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5411b17-490f-40ba-be1d-1ca72c18cdb4-trusted-ca\") pod \"ingress-operator-5b745b69d9-fvw7g\" (UID: \"f5411b17-490f-40ba-be1d-1ca72c18cdb4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fvw7g" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.327304 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cc131485-2c98-405f-87de-1669634ff801-proxy-tls\") pod \"machine-config-operator-74547568cd-qsr7t\" (UID: \"cc131485-2c98-405f-87de-1669634ff801\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsr7t" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.327875 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fbca11a-128a-4ea3-b5b9-3b9f596b887d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zssnz\" (UID: \"5fbca11a-128a-4ea3-b5b9-3b9f596b887d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zssnz" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.328552 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78139543-ef47-4ea0-bb35-9fe2bc3f99a4-cert\") pod \"ingress-canary-7dzw7\" (UID: \"78139543-ef47-4ea0-bb35-9fe2bc3f99a4\") " pod="openshift-ingress-canary/ingress-canary-7dzw7" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.328750 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/585f1b1b-1d55-4c5d-be08-5af770eec641-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hjvs8\" (UID: \"585f1b1b-1d55-4c5d-be08-5af770eec641\") " pod="openshift-marketplace/marketplace-operator-79b997595-hjvs8" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.329319 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/67e97ddb-bc83-42bb-abd1-139ef343a4b3-node-bootstrap-token\") pod \"machine-config-server-656fj\" (UID: \"67e97ddb-bc83-42bb-abd1-139ef343a4b3\") " pod="openshift-machine-config-operator/machine-config-server-656fj" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.329500 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8fdad3ac-2317-4b90-a9ba-cd28cf492c96-signing-key\") pod \"service-ca-9c57cc56f-nwf6f\" (UID: \"8fdad3ac-2317-4b90-a9ba-cd28cf492c96\") " pod="openshift-service-ca/service-ca-9c57cc56f-nwf6f" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.329585 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/cea9ce07-369d-4ea5-a2ae-c77eeeaef7da-stats-auth\") pod \"router-default-5444994796-f52fv\" (UID: \"cea9ce07-369d-4ea5-a2ae-c77eeeaef7da\") " pod="openshift-ingress/router-default-5444994796-f52fv" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.330120 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/24bf87f0-b1a2-47c8-9800-862e15c3f4cf-proxy-tls\") pod \"machine-config-controller-84d6567774-46l22\" (UID: \"24bf87f0-b1a2-47c8-9800-862e15c3f4cf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-46l22" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.330579 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71bd551d-eb3c-4c33-89e0-fb957dfa31dc-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-z2mgd\" (UID: \"71bd551d-eb3c-4c33-89e0-fb957dfa31dc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z2mgd" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.330754 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/10e01a12-35b2-4abc-93b4-3d3ac7ce61ed-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-vzzqs\" (UID: \"10e01a12-35b2-4abc-93b4-3d3ac7ce61ed\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vzzqs" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.331090 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47b0fd6e-8a1d-47ce-b581-4405a98c73d0-serving-cert\") pod \"service-ca-operator-777779d784-rf8kp\" (UID: \"47b0fd6e-8a1d-47ce-b581-4405a98c73d0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rf8kp" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.331218 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/585f1b1b-1d55-4c5d-be08-5af770eec641-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hjvs8\" (UID: \"585f1b1b-1d55-4c5d-be08-5af770eec641\") " pod="openshift-marketplace/marketplace-operator-79b997595-hjvs8" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.331930 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f37a2281-17b2-486d-aae8-a58378afbfad-profile-collector-cert\") pod \"catalog-operator-68c6474976-vjk5l\" (UID: \"f37a2281-17b2-486d-aae8-a58378afbfad\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vjk5l" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.332246 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cea9ce07-369d-4ea5-a2ae-c77eeeaef7da-metrics-certs\") pod \"router-default-5444994796-f52fv\" (UID: \"cea9ce07-369d-4ea5-a2ae-c77eeeaef7da\") " pod="openshift-ingress/router-default-5444994796-f52fv" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.332612 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/67e97ddb-bc83-42bb-abd1-139ef343a4b3-certs\") pod \"machine-config-server-656fj\" (UID: \"67e97ddb-bc83-42bb-abd1-139ef343a4b3\") " pod="openshift-machine-config-operator/machine-config-server-656fj" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.333082 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fda8546a-e13c-4450-9faa-a0e0fcacbfa1-secret-volume\") pod \"collect-profiles-29415585-hp78p\" (UID: \"fda8546a-e13c-4450-9faa-a0e0fcacbfa1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415585-hp78p" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.334534 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/349a6e10-88df-46fa-b81d-439574540d28-apiservice-cert\") pod \"packageserver-d55dfcdfc-wkvgk\" (UID: \"349a6e10-88df-46fa-b81d-439574540d28\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wkvgk" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.341895 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5411b17-490f-40ba-be1d-1ca72c18cdb4-metrics-tls\") pod \"ingress-operator-5b745b69d9-fvw7g\" (UID: \"f5411b17-490f-40ba-be1d-1ca72c18cdb4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fvw7g" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.342484 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5a298178-bf52-4673-a49e-5867e4bc8267-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7gb2n\" (UID: \"5a298178-bf52-4673-a49e-5867e4bc8267\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7gb2n" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.346449 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/cea9ce07-369d-4ea5-a2ae-c77eeeaef7da-default-certificate\") pod \"router-default-5444994796-f52fv\" (UID: \"cea9ce07-369d-4ea5-a2ae-c77eeeaef7da\") " pod="openshift-ingress/router-default-5444994796-f52fv" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.348428 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/65140763-5309-4eb6-a4e7-090b57b27744-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-ww8jh\" (UID: \"65140763-5309-4eb6-a4e7-090b57b27744\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ww8jh" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.349032 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/1ca4b304-6857-4cde-a8a6-9300cdb60cba-metrics-tls\") pod \"dns-default-knjp7\" (UID: \"1ca4b304-6857-4cde-a8a6-9300cdb60cba\") " pod="openshift-dns/dns-default-knjp7" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.359815 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f37a2281-17b2-486d-aae8-a58378afbfad-srv-cert\") pod \"catalog-operator-68c6474976-vjk5l\" (UID: \"f37a2281-17b2-486d-aae8-a58378afbfad\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vjk5l" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.364874 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f5411b17-490f-40ba-be1d-1ca72c18cdb4-bound-sa-token\") pod \"ingress-operator-5b745b69d9-fvw7g\" (UID: \"f5411b17-490f-40ba-be1d-1ca72c18cdb4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fvw7g" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.370152 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-c5hd7" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.372377 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jdtq\" (UniqueName: \"kubernetes.io/projected/8fdad3ac-2317-4b90-a9ba-cd28cf492c96-kube-api-access-6jdtq\") pod \"service-ca-9c57cc56f-nwf6f\" (UID: \"8fdad3ac-2317-4b90-a9ba-cd28cf492c96\") " pod="openshift-service-ca/service-ca-9c57cc56f-nwf6f" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.377508 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5dkq\" (UniqueName: \"kubernetes.io/projected/f5411b17-490f-40ba-be1d-1ca72c18cdb4-kube-api-access-w5dkq\") pod \"ingress-operator-5b745b69d9-fvw7g\" (UID: \"f5411b17-490f-40ba-be1d-1ca72c18cdb4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fvw7g" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.386830 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fvw7g" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.408996 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twdv4\" (UniqueName: \"kubernetes.io/projected/47b0fd6e-8a1d-47ce-b581-4405a98c73d0-kube-api-access-twdv4\") pod \"service-ca-operator-777779d784-rf8kp\" (UID: \"47b0fd6e-8a1d-47ce-b581-4405a98c73d0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rf8kp" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.416282 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 11:51:03 crc kubenswrapper[4763]: E1205 11:51:03.416792 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:03.916775041 +0000 UTC m=+148.409489764 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.422690 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-djzqs"] Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.437580 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7mzc\" (UniqueName: \"kubernetes.io/projected/78139543-ef47-4ea0-bb35-9fe2bc3f99a4-kube-api-access-r7mzc\") pod \"ingress-canary-7dzw7\" (UID: \"78139543-ef47-4ea0-bb35-9fe2bc3f99a4\") " pod="openshift-ingress-canary/ingress-canary-7dzw7" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.442706 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npxc7\" (UniqueName: \"kubernetes.io/projected/f37a2281-17b2-486d-aae8-a58378afbfad-kube-api-access-npxc7\") pod \"catalog-operator-68c6474976-vjk5l\" (UID: \"f37a2281-17b2-486d-aae8-a58378afbfad\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vjk5l" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.456951 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnrs5\" (UniqueName: \"kubernetes.io/projected/cea9ce07-369d-4ea5-a2ae-c77eeeaef7da-kube-api-access-lnrs5\") pod \"router-default-5444994796-f52fv\" (UID: \"cea9ce07-369d-4ea5-a2ae-c77eeeaef7da\") " pod="openshift-ingress/router-default-5444994796-f52fv" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.475795 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2sn4\" (UniqueName: \"kubernetes.io/projected/fda8546a-e13c-4450-9faa-a0e0fcacbfa1-kube-api-access-x2sn4\") pod \"collect-profiles-29415585-hp78p\" (UID: \"fda8546a-e13c-4450-9faa-a0e0fcacbfa1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415585-hp78p" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.499859 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9sq9\" (UniqueName: \"kubernetes.io/projected/f08e9226-6ec5-4854-9780-0b5e2d8a7ded-kube-api-access-r9sq9\") pod \"control-plane-machine-set-operator-78cbb6b69f-7wmpg\" (UID: \"f08e9226-6ec5-4854-9780-0b5e2d8a7ded\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wmpg" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.500127 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-nwf6f" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.507049 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-rf8kp" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.513945 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vjk5l" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.517423 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:03 crc kubenswrapper[4763]: E1205 11:51:03.517973 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 11:51:04.017957826 +0000 UTC m=+148.510672549 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.523195 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7dzw7" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.527339 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-djzqs" event={"ID":"84047603-03db-454f-ad69-7dee76bd4e0a","Type":"ContainerStarted","Data":"4900c5f45236ad29c61d8a12856cab90f7dc93e17c23b70de6cb66e105c5b833"} Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.529107 4763 generic.go:334] "Generic (PLEG): container finished" podID="bc6697b7-ff52-448b-9b29-e53eb649646d" containerID="b748d3dcfba1902e8cacf1f1d80e629ff4d2b2c8f1f9e7063f68f34242bd786b" exitCode=0 Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.529168 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-78sc9" event={"ID":"bc6697b7-ff52-448b-9b29-e53eb649646d","Type":"ContainerDied","Data":"b748d3dcfba1902e8cacf1f1d80e629ff4d2b2c8f1f9e7063f68f34242bd786b"} Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.529189 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-78sc9" event={"ID":"bc6697b7-ff52-448b-9b29-e53eb649646d","Type":"ContainerStarted","Data":"42172108c00db24d8e69e41a55d63503b3a74922adb7663ec29f873143603851"} Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.531455 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsffm\" (UniqueName: \"kubernetes.io/projected/cc131485-2c98-405f-87de-1669634ff801-kube-api-access-vsffm\") pod \"machine-config-operator-74547568cd-qsr7t\" (UID: \"cc131485-2c98-405f-87de-1669634ff801\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsr7t" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.537910 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs4tq\" (UniqueName: \"kubernetes.io/projected/10e01a12-35b2-4abc-93b4-3d3ac7ce61ed-kube-api-access-zs4tq\") pod \"package-server-manager-789f6589d5-vzzqs\" (UID: 
\"10e01a12-35b2-4abc-93b4-3d3ac7ce61ed\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vzzqs" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.562714 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvdd9\" (UniqueName: \"kubernetes.io/projected/5fbca11a-128a-4ea3-b5b9-3b9f596b887d-kube-api-access-jvdd9\") pod \"kube-storage-version-migrator-operator-b67b599dd-zssnz\" (UID: \"5fbca11a-128a-4ea3-b5b9-3b9f596b887d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zssnz" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.566852 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xnp9g" event={"ID":"641223ac-c0c5-43cd-83cf-feaef76d52e6","Type":"ContainerStarted","Data":"b9ef6d3f1312f4ca9179c9932a0a532acae5c0a1ff31a7e7137a3ed84d495db0"} Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.566895 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xnp9g" event={"ID":"641223ac-c0c5-43cd-83cf-feaef76d52e6","Type":"ContainerStarted","Data":"daf54716488e0ce768ae2e91ef9d6e310ba9c061514e7bdd0f087297b17012e7"} Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.579691 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mvwv6" event={"ID":"4dc83eeb-0330-445a-a742-7c3537517f8d","Type":"ContainerStarted","Data":"c61ec47c66a530cebee52a37ed1f971341e4d8a5c3242a52911b82f212e0d413"} Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.579750 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mvwv6" event={"ID":"4dc83eeb-0330-445a-a742-7c3537517f8d","Type":"ContainerStarted","Data":"df070eaf3b9341f8c7024d9360f06ec91c14e0e843da560fb3ad04111fcdbb4b"} Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.581907 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-cnvbz"] Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.588163 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-rsm7h"] Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.588451 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-fdcnn" event={"ID":"5ad8fb54-ab2a-423f-90ea-afc17b937e34","Type":"ContainerStarted","Data":"06af30975a601a73a930b17d774c475377c5f80e789d847ad4cb9aaeea2f5e3f"} Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.588484 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-fdcnn" event={"ID":"5ad8fb54-ab2a-423f-90ea-afc17b937e34","Type":"ContainerStarted","Data":"b6b29bb0854fad0698d6c1d5a129780e25025686aabc381b309c6f13c1f74304"} Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.590045 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xh4rn"] Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.594938 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-fbmpx" 
event={"ID":"c5ba5666-f89c-4a31-90a2-57654afd4ff8","Type":"ContainerStarted","Data":"3b08ccb67c4a22bda81be4f832264d4f1197ad8143e70af7ad545b41e3b796ed"} Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.595181 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-fbmpx" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.595193 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-fbmpx" event={"ID":"c5ba5666-f89c-4a31-90a2-57654afd4ff8","Type":"ContainerStarted","Data":"7676b512e079ec6d16c2d82f47daf9432cabd6374d4f8cbe2bc722c4a4f673f9"} Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.595839 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w4vn\" (UniqueName: \"kubernetes.io/projected/a5c4e5af-4c88-4770-b4a7-3eeded875431-kube-api-access-8w4vn\") pod \"downloads-7954f5f757-hv45j\" (UID: \"a5c4e5af-4c88-4770-b4a7-3eeded875431\") " pod="openshift-console/downloads-7954f5f757-hv45j" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.598337 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kp92r" event={"ID":"c4ab8944-246e-407f-9ba0-78456103e6f4","Type":"ContainerStarted","Data":"eb18928cb5e80abd3268a449b4ddd563a5f7d811d30c7d34c96cc331ddf12960"} Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.598375 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kp92r" event={"ID":"c4ab8944-246e-407f-9ba0-78456103e6f4","Type":"ContainerStarted","Data":"41c1ba724b4d8b9da7092acb5d16f232e858687276239633447ce800f1dbefed"} Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.598619 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wndr8"] Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.604002 4763 patch_prober.go:28] interesting pod/console-operator-58897d9998-fbmpx container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/readyz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.604056 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-fbmpx" podUID="c5ba5666-f89c-4a31-90a2-57654afd4ff8" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/readyz\": dial tcp 10.217.0.23:8443: connect: connection refused" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.618138 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 11:51:03 crc kubenswrapper[4763]: E1205 11:51:03.618382 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:04.118351425 +0000 UTC m=+148.611066158 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.619238 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.621919 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj7dq\" (UniqueName: \"kubernetes.io/projected/349a6e10-88df-46fa-b81d-439574540d28-kube-api-access-sj7dq\") pod \"packageserver-d55dfcdfc-wkvgk\" (UID: \"349a6e10-88df-46fa-b81d-439574540d28\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wkvgk" Dec 05 11:51:03 crc kubenswrapper[4763]: E1205 11:51:03.625551 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 11:51:04.125512255 +0000 UTC m=+148.618226978 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.642748 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/65140763-5309-4eb6-a4e7-090b57b27744-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-ww8jh\" (UID: \"65140763-5309-4eb6-a4e7-090b57b27744\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ww8jh" Dec 05 11:51:03 crc kubenswrapper[4763]: W1205 11:51:03.651866 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb600b871_2ca7_4ca9_ab49_82a77bf73b6a.slice/crio-67f5e0700c02ae864ed6b1e010ffa965306e1c989dd799d5c198cb667327ff85 WatchSource:0}: Error finding container 67f5e0700c02ae864ed6b1e010ffa965306e1c989dd799d5c198cb667327ff85: Status 404 returned error can't find the container with id 67f5e0700c02ae864ed6b1e010ffa965306e1c989dd799d5c198cb667327ff85 Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.655010 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-hv45j" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.662316 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8pbm\" (UniqueName: \"kubernetes.io/projected/4f0faea4-5f01-435b-99f5-2c8bd064c6f7-kube-api-access-c8pbm\") pod \"migrator-59844c95c7-4lwtd\" (UID: \"4f0faea4-5f01-435b-99f5-2c8bd064c6f7\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4lwtd" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.677796 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnvw9\" (UniqueName: \"kubernetes.io/projected/585f1b1b-1d55-4c5d-be08-5af770eec641-kube-api-access-jnvw9\") pod \"marketplace-operator-79b997595-hjvs8\" (UID: \"585f1b1b-1d55-4c5d-be08-5af770eec641\") " pod="openshift-marketplace/marketplace-operator-79b997595-hjvs8" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.707092 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zssnz" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.707649 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsr7t" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.709086 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-c5hd7"] Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.713647 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k94f\" (UniqueName: \"kubernetes.io/projected/fa5d0f88-d0b4-4a0b-9574-81095f7ea8c0-kube-api-access-7k94f\") pod \"olm-operator-6b444d44fb-z52fw\" (UID: \"fa5d0f88-d0b4-4a0b-9574-81095f7ea8c0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z52fw" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.728567 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4lwtd" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.732292 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-fvw7g"] Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.733038 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.733235 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.733371 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.734484 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sptcn"] Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.735870 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:51:03 crc kubenswrapper[4763]: E1205 11:51:03.735968 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:04.235947044 +0000 UTC m=+148.728661767 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.738469 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-f52fv" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.742040 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-j5ztd"] Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.745476 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt24n\" (UniqueName: \"kubernetes.io/projected/24bf87f0-b1a2-47c8-9800-862e15c3f4cf-kube-api-access-xt24n\") pod \"machine-config-controller-84d6567774-46l22\" (UID: \"24bf87f0-b1a2-47c8-9800-862e15c3f4cf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-46l22" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.748139 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.754066 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vzzqs" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.760197 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hjvs8" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.765688 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpvjd\" (UniqueName: \"kubernetes.io/projected/eee26d7c-caba-4663-a3f0-924184123ae2-kube-api-access-dpvjd\") pod \"csi-hostpathplugin-5949w\" (UID: \"eee26d7c-caba-4663-a3f0-924184123ae2\") " pod="hostpath-provisioner/csi-hostpathplugin-5949w" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.768928 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp9vr\" (UniqueName: \"kubernetes.io/projected/1ca4b304-6857-4cde-a8a6-9300cdb60cba-kube-api-access-gp9vr\") pod \"dns-default-knjp7\" (UID: \"1ca4b304-6857-4cde-a8a6-9300cdb60cba\") " pod="openshift-dns/dns-default-knjp7" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.769175 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415585-hp78p" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.780333 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z52fw" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.786060 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wmpg" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.796340 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wkvgk" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.796485 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctdlb\" (UniqueName: \"kubernetes.io/projected/65140763-5309-4eb6-a4e7-090b57b27744-kube-api-access-ctdlb\") pod \"cluster-image-registry-operator-dc59b4c8b-ww8jh\" (UID: \"65140763-5309-4eb6-a4e7-090b57b27744\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ww8jh" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.810703 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.811045 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71bd551d-eb3c-4c33-89e0-fb957dfa31dc-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-z2mgd\" (UID: \"71bd551d-eb3c-4c33-89e0-fb957dfa31dc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z2mgd" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.816025 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6mz8\" (UniqueName: \"kubernetes.io/projected/be6eac77-1391-4e8a-80b6-50b345cbe132-kube-api-access-x6mz8\") pod \"cluster-samples-operator-665b6dd947-pbkjb\" (UID: \"be6eac77-1391-4e8a-80b6-50b345cbe132\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pbkjb" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.816492 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-spmwg"] Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.816525 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-85ch6"] Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.816540 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-rf8kp"] Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.835876 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jm59\" (UniqueName: \"kubernetes.io/projected/5a298178-bf52-4673-a49e-5867e4bc8267-kube-api-access-7jm59\") pod \"multus-admission-controller-857f4d67dd-7gb2n\" (UID: \"5a298178-bf52-4673-a49e-5867e4bc8267\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7gb2n" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.837409 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-knjp7" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.838354 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.838393 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.838448 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:03 crc kubenswrapper[4763]: E1205 11:51:03.838712 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 11:51:04.33870025 +0000 UTC m=+148.831414973 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.848985 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-5949w" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.851156 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.853123 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.853248 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcctd\" (UniqueName: \"kubernetes.io/projected/67e97ddb-bc83-42bb-abd1-139ef343a4b3-kube-api-access-kcctd\") pod \"machine-config-server-656fj\" (UID: \"67e97ddb-bc83-42bb-abd1-139ef343a4b3\") " pod="openshift-machine-config-operator/machine-config-server-656fj" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.855186 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-656fj" Dec 05 11:51:03 crc kubenswrapper[4763]: W1205 11:51:03.869336 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad12f838_4a9f_4b3d_9a33_1ca5bd556bb7.slice/crio-84bc5a253a3094ce5aa775cd5343468d9b5b525e003a234be52308c9c17c087c WatchSource:0}: Error finding container 84bc5a253a3094ce5aa775cd5343468d9b5b525e003a234be52308c9c17c087c: Status 404 returned error can't find the container with id 84bc5a253a3094ce5aa775cd5343468d9b5b525e003a234be52308c9c17c087c Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.939897 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.942375 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7dzw7"] Dec 05 11:51:03 crc kubenswrapper[4763]: E1205 11:51:03.942706 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:04.442681365 +0000 UTC m=+148.935396088 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.970323 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z2mgd" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.970963 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pbkjb" Dec 05 11:51:03 crc kubenswrapper[4763]: I1205 11:51:03.977673 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ww8jh" Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.013269 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-46l22" Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.044512 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-7gb2n" Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.045127 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:04 crc kubenswrapper[4763]: E1205 11:51:04.045381 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 11:51:04.54537165 +0000 UTC m=+149.038086373 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.058987 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-nwf6f"] Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.098355 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.104584 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.153455 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 11:51:04 crc kubenswrapper[4763]: E1205 11:51:04.153605 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:04.653576054 +0000 UTC m=+149.146290777 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.153982 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:04 crc kubenswrapper[4763]: E1205 11:51:04.154272 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 11:51:04.654260539 +0000 UTC m=+149.146975262 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.155498 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-hv45j"] Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.206162 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vjk5l"] Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.247150 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qsr7t"] Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.255339 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 11:51:04 crc kubenswrapper[4763]: E1205 11:51:04.255483 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:04.755460824 +0000 UTC m=+149.248175547 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.255609 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:04 crc kubenswrapper[4763]: E1205 11:51:04.255976 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 11:51:04.755964357 +0000 UTC m=+149.248679080 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.358614 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 11:51:04 crc kubenswrapper[4763]: E1205 11:51:04.359061 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:04.859039764 +0000 UTC m=+149.351754487 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.379748 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-knjp7"] Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.462491 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:04 crc kubenswrapper[4763]: E1205 11:51:04.470998 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 11:51:04.970978894 +0000 UTC m=+149.463693617 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.564478 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 11:51:04 crc kubenswrapper[4763]: E1205 11:51:04.564646 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:05.064629517 +0000 UTC m=+149.557344240 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.564972 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:04 crc kubenswrapper[4763]: E1205 11:51:04.565536 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 11:51:05.065523123 +0000 UTC m=+149.558237846 (durationBeforeRetry 500ms). 
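What the repeating E-level entries show: every one of these failures is the same root cause on a fixed 500ms backoff. The image-registry PVC pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 is backed by the CSI driver kubevirt.io.hostpath-provisioner, which has not yet registered with this kubelet, so MountDevice for the incoming image-registry-697d97f7c8-dpf4b pod and TearDownAt for the outgoing pod 8f668bae-612b-4b75-9490-919e737c6a3b both fail at the driver lookup, and nestedpendingoperations requeues each operation for 500ms later. A minimal Go sketch of that lookup-then-fixed-backoff pattern; the type and function names are illustrative assumptions, not the kubelet's real code:

package main

import (
	"fmt"
	"sync"
	"time"
)

// driverRegistry stands in for the kubelet's in-memory list of CSI drivers
// that have completed plugin registration. Name and shape are illustrative
// assumptions for this sketch, not the kubelet's actual types.
type driverRegistry struct {
	mu      sync.RWMutex
	drivers map[string]string // driver name -> plugin socket path
}

func (r *driverRegistry) lookup(name string) (string, error) {
	r.mu.RLock()
	defer r.mu.RUnlock()
	if sock, ok := r.drivers[name]; ok {
		return sock, nil
	}
	// Same failure mode as the log: the volume references a driver that
	// has not registered with the kubelet yet.
	return "", fmt.Errorf("driver name %s not found in the list of registered CSI drivers", name)
}

// nextRetry mirrors the fixed 500ms backoff visible in the log: a failed
// volume operation may not be retried before now + durationBeforeRetry.
func nextRetry(now time.Time) time.Time {
	const durationBeforeRetry = 500 * time.Millisecond
	return now.Add(durationBeforeRetry)
}

func main() {
	reg := &driverRegistry{drivers: map[string]string{}} // nothing registered yet
	if _, err := reg.lookup("kubevirt.io.hostpath-provisioner"); err != nil {
		fmt.Printf("MountDevice failed: %v; no retries permitted until %s\n",
			err, nextRetry(time.Now()).Format(time.RFC3339Nano))
	}
}

The behavior worth noting: the kubelet treats the missing driver as transient, so the loop is noisy but self-healing; nothing succeeds until the driver registers, and nothing is marked permanently failed either.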
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.588568 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z2mgd"] Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.648706 4763 generic.go:334] "Generic (PLEG): container finished" podID="84047603-03db-454f-ad69-7dee76bd4e0a" containerID="14e1547decba06bedc4f09d2f572b2304f086960757a8a2c325b4dd99e496a5d" exitCode=0 Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.649012 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-djzqs" event={"ID":"84047603-03db-454f-ad69-7dee76bd4e0a","Type":"ContainerDied","Data":"14e1547decba06bedc4f09d2f572b2304f086960757a8a2c325b4dd99e496a5d"} Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.655873 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-hv45j" event={"ID":"a5c4e5af-4c88-4770-b4a7-3eeded875431","Type":"ContainerStarted","Data":"fb14d82065a42a58613ffdd072887b1474e329fdec75a5f15fa813a2c71383d4"} Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.661165 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-rf8kp" event={"ID":"47b0fd6e-8a1d-47ce-b581-4405a98c73d0","Type":"ContainerStarted","Data":"6634a975876d82e01382256e107e8210f4fff566539982898b45c3727181b63d"} Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.665340 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7dzw7" event={"ID":"78139543-ef47-4ea0-bb35-9fe2bc3f99a4","Type":"ContainerStarted","Data":"1eb820c35d279c2241a95a9113e9065348199977a8165f670684271d11934aca"} Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.667637 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 11:51:04 crc kubenswrapper[4763]: E1205 11:51:04.667798 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:05.167754766 +0000 UTC m=+149.660469489 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.667885 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-spmwg" event={"ID":"ad12f838-4a9f-4b3d-9a33-1ca5bd556bb7","Type":"ContainerStarted","Data":"84bc5a253a3094ce5aa775cd5343468d9b5b525e003a234be52308c9c17c087c"} Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.669403 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" event={"ID":"b600b871-2ca7-4ca9-ab49-82a77bf73b6a","Type":"ContainerStarted","Data":"67f5e0700c02ae864ed6b1e010ffa965306e1c989dd799d5c198cb667327ff85"} Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.669803 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:04 crc kubenswrapper[4763]: E1205 11:51:04.670043 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 11:51:05.169924291 +0000 UTC m=+149.662639014 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.674192 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sptcn" event={"ID":"ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88","Type":"ContainerStarted","Data":"be5914023c699fd9ff536e4aa70785ecf257da92fd1ed9f218fa70f7f6631010"} Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.686982 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fvw7g" event={"ID":"f5411b17-490f-40ba-be1d-1ca72c18cdb4","Type":"ContainerStarted","Data":"26f7ae7e42a5ecd487cb82f193375d77cb5f9507c9a5db215a2665408d79a38d"} Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.687039 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fvw7g" event={"ID":"f5411b17-490f-40ba-be1d-1ca72c18cdb4","Type":"ContainerStarted","Data":"e87282e40a8e8856db090ac601bea5b305ddada556c15fcaa07d03c2dce5a48f"} Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.691347 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-c5hd7" event={"ID":"80f84439-4d54-461d-9522-5bca4858f5d4","Type":"ContainerStarted","Data":"dc9fefc12da605697f4d3613e9fea12fc01c4b9b19e1ec74f360dccf48a4fd67"} Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.716142 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-fbmpx" podStartSLOduration=129.716126332 podStartE2EDuration="2m9.716126332s" podCreationTimestamp="2025-12-05 11:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:04.714738703 +0000 UTC m=+149.207453426" watchObservedRunningTime="2025-12-05 11:51:04.716126332 +0000 UTC m=+149.208841055" Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.756428 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xh4rn" event={"ID":"f5c0a5a7-2852-4c58-8093-0d58c9354ae4","Type":"ContainerStarted","Data":"b8724227ac59d109aab352feed4f0f642a4d31fbaab3d412724f5e564d294f6d"} Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.756476 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xh4rn" event={"ID":"f5c0a5a7-2852-4c58-8093-0d58c9354ae4","Type":"ContainerStarted","Data":"511d6cb123e33416c795b4bf512d608e7d77443daaacc2278cfbdf6a56e21b60"} Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.757960 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"6f1be65e82b5a5e4b0cd4e72c850834a8b5b7b4acaddac986169a2955ba951d7"} Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.759626 4763 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-j5ztd" event={"ID":"0c9b5acf-ef6a-4bdd-ae32-582a80d711b5","Type":"ContainerStarted","Data":"0e01f3b51522dc37dba9384d0813c83470807e1690ab81f9b592d78bf7dd4bf8"} Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.772106 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 11:51:04 crc kubenswrapper[4763]: E1205 11:51:04.773032 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:05.273017439 +0000 UTC m=+149.765732162 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.779832 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsr7t" event={"ID":"cc131485-2c98-405f-87de-1669634ff801","Type":"ContainerStarted","Data":"6ddff4bca60004831981035eb68a094fd6732368f7b68354dee6aa1048474261"} Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.816463 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mvwv6" event={"ID":"4dc83eeb-0330-445a-a742-7c3537517f8d","Type":"ContainerStarted","Data":"451fd3522575482a4185b37b38fd07b22e4fe609336f04de3ae79a3b1ea93ee0"} Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.820735 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vjk5l" event={"ID":"f37a2281-17b2-486d-aae8-a58378afbfad","Type":"ContainerStarted","Data":"f77d3f7f65e35d0333f9b674bc12e35a71ca60f711da287bcdbca4bc915adc73"} Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.825974 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-85ch6" event={"ID":"0d7dd240-717f-4422-a4db-b38274db085e","Type":"ContainerStarted","Data":"65fcd7a27bed4705bd4af68813308b75d7fdb5d5b1c994cf68b44d9e6046a59d"} Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.859261 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415585-hp78p"] Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.869151 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rsm7h" event={"ID":"e57f38fd-b06b-447e-ad03-2a6fb918470b","Type":"ContainerStarted","Data":"0147495d916efe7d40133c81dcfaecc7e3c7b0539352d48cdd57e56e5297295d"} Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.869268 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rsm7h" 
event={"ID":"e57f38fd-b06b-447e-ad03-2a6fb918470b","Type":"ContainerStarted","Data":"5e3b37db8b165100d38c2503568bc57f155b85acdff68380372b697a6207b768"} Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.870560 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z52fw"] Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.873284 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:04 crc kubenswrapper[4763]: E1205 11:51:04.873555 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 11:51:05.373542039 +0000 UTC m=+149.866256772 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.881548 4763 generic.go:334] "Generic (PLEG): container finished" podID="1c4ac72f-1389-403a-8cf1-567e4f6ac225" containerID="a98847c7f3bee1cd3c7836292ce7cd18963e0979b0573cd74c3042b0be5c6af7" exitCode=0 Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.881717 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cnvbz" event={"ID":"1c4ac72f-1389-403a-8cf1-567e4f6ac225","Type":"ContainerDied","Data":"a98847c7f3bee1cd3c7836292ce7cd18963e0979b0573cd74c3042b0be5c6af7"} Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.881747 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cnvbz" event={"ID":"1c4ac72f-1389-403a-8cf1-567e4f6ac225","Type":"ContainerStarted","Data":"3ddbc67ad6a6113c52401351902044432019d99fa33a83d86e673bf296af410e"} Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.893499 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5949w"] Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.899397 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-nwf6f" event={"ID":"8fdad3ac-2317-4b90-a9ba-cd28cf492c96","Type":"ContainerStarted","Data":"59365b05e3056a433d3e3e4428a14a20317b9cea1abebe7c99174e73117da253"} Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.977270 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 11:51:04 crc kubenswrapper[4763]: E1205 11:51:04.979115 4763 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:05.479089425 +0000 UTC m=+149.971804148 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:04 crc kubenswrapper[4763]: I1205 11:51:04.988300 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-78sc9" event={"ID":"bc6697b7-ff52-448b-9b29-e53eb649646d","Type":"ContainerStarted","Data":"bcd1b0cfc0e3051d6cdcd9969e5767549e7a823897fa4a4d27dbcce95cdc3220"} Dec 05 11:51:05 crc kubenswrapper[4763]: I1205 11:51:05.004932 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xnp9g" podStartSLOduration=130.004917784 podStartE2EDuration="2m10.004917784s" podCreationTimestamp="2025-12-05 11:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:05.003867557 +0000 UTC m=+149.496582280" watchObservedRunningTime="2025-12-05 11:51:05.004917784 +0000 UTC m=+149.497632497" Dec 05 11:51:05 crc kubenswrapper[4763]: I1205 11:51:05.006108 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-fbmpx" Dec 05 11:51:05 crc kubenswrapper[4763]: I1205 11:51:05.035145 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-fdcnn" podStartSLOduration=130.035126445 podStartE2EDuration="2m10.035126445s" podCreationTimestamp="2025-12-05 11:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:05.033970227 +0000 UTC m=+149.526684970" watchObservedRunningTime="2025-12-05 11:51:05.035126445 +0000 UTC m=+149.527841168" Dec 05 11:51:05 crc kubenswrapper[4763]: I1205 11:51:05.081570 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:05 crc kubenswrapper[4763]: E1205 11:51:05.083664 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 11:51:05.583649833 +0000 UTC m=+150.076364556 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:05 crc kubenswrapper[4763]: I1205 11:51:05.142737 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hjvs8"] Dec 05 11:51:05 crc kubenswrapper[4763]: I1205 11:51:05.144535 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kp92r" podStartSLOduration=130.144512737 podStartE2EDuration="2m10.144512737s" podCreationTimestamp="2025-12-05 11:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:05.140092616 +0000 UTC m=+149.632807339" watchObservedRunningTime="2025-12-05 11:51:05.144512737 +0000 UTC m=+149.637227470" Dec 05 11:51:05 crc kubenswrapper[4763]: I1205 11:51:05.154860 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4lwtd"] Dec 05 11:51:05 crc kubenswrapper[4763]: I1205 11:51:05.156312 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zssnz"] Dec 05 11:51:05 crc kubenswrapper[4763]: I1205 11:51:05.182747 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 11:51:05 crc kubenswrapper[4763]: E1205 11:51:05.182872 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:05.682853914 +0000 UTC m=+150.175568637 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:05 crc kubenswrapper[4763]: I1205 11:51:05.183094 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:05 crc kubenswrapper[4763]: E1205 11:51:05.183378 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 11:51:05.683371228 +0000 UTC m=+150.176085951 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:05 crc kubenswrapper[4763]: I1205 11:51:05.252887 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vzzqs"] Dec 05 11:51:05 crc kubenswrapper[4763]: I1205 11:51:05.257187 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wmpg"] Dec 05 11:51:05 crc kubenswrapper[4763]: I1205 11:51:05.257233 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ww8jh"] Dec 05 11:51:05 crc kubenswrapper[4763]: I1205 11:51:05.284228 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 11:51:05 crc kubenswrapper[4763]: E1205 11:51:05.284500 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:05.784485292 +0000 UTC m=+150.277200015 (durationBeforeRetry 500ms). 
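The resolution path is already in flight in the surrounding entries: hostpath-provisioner/csi-hostpathplugin-5949w, the pod that ships this CSI driver, is itself only now being started on the node (its kube-api-access volume mounted at 11:51:03, "No sandbox for pod can be found" shortly after, SyncLoop UPDATE at 11:51:04). Once that pod is up and its registrar (typically a node-driver-registrar sidecar) announces kubevirt.io.hostpath-provisioner over the kubelet's plugin-registration socket, the lookup above starts succeeding and the queued mount/unmount operations complete on a later 500ms retry.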
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:05 crc kubenswrapper[4763]: I1205 11:51:05.290339 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wkvgk"] Dec 05 11:51:05 crc kubenswrapper[4763]: I1205 11:51:05.385795 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:05 crc kubenswrapper[4763]: E1205 11:51:05.386146 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 11:51:05.88612746 +0000 UTC m=+150.378842183 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:05 crc kubenswrapper[4763]: I1205 11:51:05.446553 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7h6vz" podStartSLOduration=130.446521191 podStartE2EDuration="2m10.446521191s" podCreationTimestamp="2025-12-05 11:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:05.445354453 +0000 UTC m=+149.938069176" watchObservedRunningTime="2025-12-05 11:51:05.446521191 +0000 UTC m=+149.939235914" Dec 05 11:51:05 crc kubenswrapper[4763]: I1205 11:51:05.486894 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 11:51:05 crc kubenswrapper[4763]: E1205 11:51:05.487183 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:05.987164294 +0000 UTC m=+150.479879017 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:05 crc kubenswrapper[4763]: I1205 11:51:05.509517 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7gb2n"] Dec 05 11:51:05 crc kubenswrapper[4763]: I1205 11:51:05.525042 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-46l22"] Dec 05 11:51:05 crc kubenswrapper[4763]: W1205 11:51:05.548886 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24bf87f0_b1a2_47c8_9800_862e15c3f4cf.slice/crio-a22009a121adbc130e189214b166c58ef0558252d8602778b304d1c0a50e8a25 WatchSource:0}: Error finding container a22009a121adbc130e189214b166c58ef0558252d8602778b304d1c0a50e8a25: Status 404 returned error can't find the container with id a22009a121adbc130e189214b166c58ef0558252d8602778b304d1c0a50e8a25 Dec 05 11:51:05 crc kubenswrapper[4763]: W1205 11:51:05.560515 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-85107988e7bc2bf594142a70d6a8ad33bc012afbf06bc5f5fb629e0b336d064c WatchSource:0}: Error finding container 85107988e7bc2bf594142a70d6a8ad33bc012afbf06bc5f5fb629e0b336d064c: Status 404 returned error can't find the container with id 85107988e7bc2bf594142a70d6a8ad33bc012afbf06bc5f5fb629e0b336d064c Dec 05 11:51:05 crc kubenswrapper[4763]: I1205 11:51:05.566614 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pbkjb"] Dec 05 11:51:05 crc kubenswrapper[4763]: I1205 11:51:05.587694 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:05 crc kubenswrapper[4763]: E1205 11:51:05.588043 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 11:51:06.088030627 +0000 UTC m=+150.580745350 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:05 crc kubenswrapper[4763]: W1205 11:51:05.667218 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-8d3bb99a160d96a7356c31e3ac671493762cf383eef18a718a98f6818d7608e1 WatchSource:0}: Error finding container 8d3bb99a160d96a7356c31e3ac671493762cf383eef18a718a98f6818d7608e1: Status 404 returned error can't find the container with id 8d3bb99a160d96a7356c31e3ac671493762cf383eef18a718a98f6818d7608e1 Dec 05 11:51:05 crc kubenswrapper[4763]: I1205 11:51:05.687166 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xh4rn" podStartSLOduration=131.687139128 podStartE2EDuration="2m11.687139128s" podCreationTimestamp="2025-12-05 11:48:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:05.684452159 +0000 UTC m=+150.177166902" watchObservedRunningTime="2025-12-05 11:51:05.687139128 +0000 UTC m=+150.179853851" Dec 05 11:51:05 crc kubenswrapper[4763]: I1205 11:51:05.688912 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 11:51:05 crc kubenswrapper[4763]: E1205 11:51:05.689284 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:06.189267012 +0000 UTC m=+150.681981735 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:05 crc kubenswrapper[4763]: I1205 11:51:05.797995 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:05 crc kubenswrapper[4763]: E1205 11:51:05.798564 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-05 11:51:06.298548134 +0000 UTC m=+150.791262857 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:05 crc kubenswrapper[4763]: I1205 11:51:05.798926 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-rsm7h" podStartSLOduration=130.798910156 podStartE2EDuration="2m10.798910156s" podCreationTimestamp="2025-12-05 11:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:05.738888358 +0000 UTC m=+150.231603081" watchObservedRunningTime="2025-12-05 11:51:05.798910156 +0000 UTC m=+150.291624879" Dec 05 11:51:05 crc kubenswrapper[4763]: I1205 11:51:05.898809 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 11:51:05 crc kubenswrapper[4763]: E1205 11:51:05.899143 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:06.399128555 +0000 UTC m=+150.891843278 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:05 crc kubenswrapper[4763]: I1205 11:51:05.911441 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mvwv6" podStartSLOduration=131.9114275 podStartE2EDuration="2m11.9114275s" podCreationTimestamp="2025-12-05 11:48:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:05.909452657 +0000 UTC m=+150.402167380" watchObservedRunningTime="2025-12-05 11:51:05.9114275 +0000 UTC m=+150.404142223" Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.000314 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:06 crc kubenswrapper[4763]: E1205 11:51:06.000631 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 11:51:06.500620112 +0000 UTC m=+150.993334845 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.102378 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 11:51:06 crc kubenswrapper[4763]: E1205 11:51:06.102911 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:06.602878484 +0000 UTC m=+151.095593207 (durationBeforeRetry 500ms). 
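On the pod_startup_latency_tracker entries: a podStartE2EDuration of roughly 2m10s does not indicate slow image pulls; firstStartedPulling and lastFinishedPulling are the zero timestamp (0001-01-01 …), i.e. no pull was observed. These pods were created around 11:48:54-55, and this kubelet only observed them running at about 11:51:05 after its own startup, so the E2E figure mostly measures that gap.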
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.155153 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z2mgd" event={"ID":"71bd551d-eb3c-4c33-89e0-fb957dfa31dc","Type":"ContainerStarted","Data":"0f7f3524f958df8187598e3f31e5cf8633f45d059c888cc81f7b83d7643c72f1"} Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.165845 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"8d3bb99a160d96a7356c31e3ac671493762cf383eef18a718a98f6818d7608e1"} Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.179550 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7gb2n" event={"ID":"5a298178-bf52-4673-a49e-5867e4bc8267","Type":"ContainerStarted","Data":"ab3fe9f3f2eda841a2c8a3d93ad3140c405876dfb4e8fee402b9782952edf6f4"} Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.191252 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zssnz" event={"ID":"5fbca11a-128a-4ea3-b5b9-3b9f596b887d","Type":"ContainerStarted","Data":"a069df107b17299f09f15e4473a65023d0b7b05141a395fffbaa6fa03eb898fd"} Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.204550 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-46l22" event={"ID":"24bf87f0-b1a2-47c8-9800-862e15c3f4cf","Type":"ContainerStarted","Data":"a22009a121adbc130e189214b166c58ef0558252d8602778b304d1c0a50e8a25"} Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.205469 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:06 crc kubenswrapper[4763]: E1205 11:51:06.207109 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 11:51:06.70709485 +0000 UTC m=+151.199809573 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.252596 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fvw7g" event={"ID":"f5411b17-490f-40ba-be1d-1ca72c18cdb4","Type":"ContainerStarted","Data":"8b3923d8709a180308b3edf7d0c746d4562123bc741a7bee6ac98ec6615c0513"} Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.264354 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ww8jh" event={"ID":"65140763-5309-4eb6-a4e7-090b57b27744","Type":"ContainerStarted","Data":"76b5e9137db8127d2260a78afa341fe3117fa3f7e36e72d01aabb4c2aef11ce6"} Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.270072 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5949w" event={"ID":"eee26d7c-caba-4663-a3f0-924184123ae2","Type":"ContainerStarted","Data":"88467625bf16dc95e4d1556421f97cbef3b7f88be7b2a817ebb287d8ae854656"} Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.310150 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-hv45j" event={"ID":"a5c4e5af-4c88-4770-b4a7-3eeded875431","Type":"ContainerStarted","Data":"2b516f82d16759152e46393a6a0976f61e4bdd78a44e2459b70e7deea55b981f"} Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.310787 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.310874 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-hv45j" Dec 05 11:51:06 crc kubenswrapper[4763]: E1205 11:51:06.310946 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:06.810930394 +0000 UTC m=+151.303645107 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.312008 4763 patch_prober.go:28] interesting pod/downloads-7954f5f757-hv45j container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.312149 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hv45j" podUID="a5c4e5af-4c88-4770-b4a7-3eeded875431" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.312414 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:06 crc kubenswrapper[4763]: E1205 11:51:06.313275 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 11:51:06.81326487 +0000 UTC m=+151.305979593 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.353737 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z52fw" event={"ID":"fa5d0f88-d0b4-4a0b-9574-81095f7ea8c0","Type":"ContainerStarted","Data":"8f68620667ae114630b0a1f41b7fcbd1091cebb112a588f344509f6662509966"} Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.354024 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z52fw" event={"ID":"fa5d0f88-d0b4-4a0b-9574-81095f7ea8c0","Type":"ContainerStarted","Data":"0aa7b9c2a6840aba384c4a4fcc44b1db81c464e0389695320b7b666fe7008c9c"} Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.354753 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z52fw" Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.369908 4763 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-z52fw container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" start-of-body= Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.369976 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z52fw" podUID="fa5d0f88-d0b4-4a0b-9574-81095f7ea8c0" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.402123 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vzzqs" event={"ID":"10e01a12-35b2-4abc-93b4-3d3ac7ce61ed","Type":"ContainerStarted","Data":"8259cc657072bbc56e2cc8dbc45c8a57b4c5a19424172ecc735c93599cd81389"} Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.414510 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 11:51:06 crc kubenswrapper[4763]: E1205 11:51:06.415867 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:06.915851345 +0000 UTC m=+151.408566068 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.446560 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsr7t" event={"ID":"cc131485-2c98-405f-87de-1669634ff801","Type":"ContainerStarted","Data":"7ccc4136fee95639f6035f49d0382b189c37add7f06227e2371b84483b4c8747"} Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.518784 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:06 crc kubenswrapper[4763]: E1205 11:51:06.521560 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 11:51:07.021533681 +0000 UTC m=+151.514248404 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.522104 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-rf8kp" event={"ID":"47b0fd6e-8a1d-47ce-b581-4405a98c73d0","Type":"ContainerStarted","Data":"d378715dd82dcd4e318c1c7ba0e8fa54e4a8d2c6ddd1debe8813ceaa23727b96"} Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.567115 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-nwf6f" event={"ID":"8fdad3ac-2317-4b90-a9ba-cd28cf492c96","Type":"ContainerStarted","Data":"a00ae4927e2258a95d7560571bdf86eb6811522875c3ba37a7e0825fa494099c"} Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.572249 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fvw7g" podStartSLOduration=131.572229854 podStartE2EDuration="2m11.572229854s" podCreationTimestamp="2025-12-05 11:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:06.546591596 +0000 UTC m=+151.039306339" watchObservedRunningTime="2025-12-05 11:51:06.572229854 +0000 UTC m=+151.064944567" Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.597270 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"e036bb95e42da82f7f23e714e97d415a2a2af4166ec81d3e641e4a2897b95651"} Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.630654 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z52fw" podStartSLOduration=131.630636441 podStartE2EDuration="2m11.630636441s" podCreationTimestamp="2025-12-05 11:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:06.628173924 +0000 UTC m=+151.120888647" watchObservedRunningTime="2025-12-05 11:51:06.630636441 +0000 UTC m=+151.123351164" Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.631394 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-hv45j" podStartSLOduration=131.631388716 podStartE2EDuration="2m11.631388716s" podCreationTimestamp="2025-12-05 11:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:06.596239791 +0000 UTC m=+151.088954514" watchObservedRunningTime="2025-12-05 11:51:06.631388716 +0000 UTC m=+151.124103439" Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.634642 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 11:51:06 crc kubenswrapper[4763]: E1205 11:51:06.637340 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:07.137316378 +0000 UTC m=+151.630031101 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.640928 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vjk5l" event={"ID":"f37a2281-17b2-486d-aae8-a58378afbfad","Type":"ContainerStarted","Data":"a70ba5b54333af8d545e62f57a151e7d0a92d0fb20c8e69a309839fa10001b16"} Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.640992 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vjk5l" Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.666592 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-rf8kp" podStartSLOduration=131.666575741 podStartE2EDuration="2m11.666575741s" podCreationTimestamp="2025-12-05 11:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:06.664850029 +0000 UTC m=+151.157564762" watchObservedRunningTime="2025-12-05 11:51:06.666575741 +0000 UTC m=+151.159290464" Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.689522 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wkvgk" event={"ID":"349a6e10-88df-46fa-b81d-439574540d28","Type":"ContainerStarted","Data":"50ee78367ae6d180c81edefe6040bbb2e4f254ac8503e18dd41a51a85e75d64d"} Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.714217 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vjk5l" Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.737681 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.737962 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415585-hp78p" event={"ID":"fda8546a-e13c-4450-9faa-a0e0fcacbfa1","Type":"ContainerStarted","Data":"a118ed0666cfac5865b8cf13e2353fb2f762383d409f06f2199efc6c24bae8bf"} Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.737990 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415585-hp78p" event={"ID":"fda8546a-e13c-4450-9faa-a0e0fcacbfa1","Type":"ContainerStarted","Data":"f5c31bb1f6a3971be371d88c4e3041e807ef63379876d0c0d5119dba720a9986"} Dec 05 11:51:06 crc kubenswrapper[4763]: E1205 11:51:06.739838 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-05 11:51:07.239827582 +0000 UTC m=+151.732542305 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.792700 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-nwf6f" podStartSLOduration=131.79268123 podStartE2EDuration="2m11.79268123s" podCreationTimestamp="2025-12-05 11:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:06.760612887 +0000 UTC m=+151.253327610" watchObservedRunningTime="2025-12-05 11:51:06.79268123 +0000 UTC m=+151.285395953" Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.793200 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cnvbz" event={"ID":"1c4ac72f-1389-403a-8cf1-567e4f6ac225","Type":"ContainerStarted","Data":"4c6d2440b32a1801340013a5f0acc52365ef5fe164af29ea92b6ab5692f6b8b8"} Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.793830 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cnvbz" Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.809420 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29415585-hp78p" podStartSLOduration=131.809399947 podStartE2EDuration="2m11.809399947s" podCreationTimestamp="2025-12-05 11:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:06.794038019 +0000 UTC m=+151.286752742" watchObservedRunningTime="2025-12-05 11:51:06.809399947 +0000 UTC m=+151.302114670" Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.821977 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wmpg" event={"ID":"f08e9226-6ec5-4854-9780-0b5e2d8a7ded","Type":"ContainerStarted","Data":"dab2547a86020aeca2d0448b2bfd319779c025b3812df0358291f7293bf2e449"} Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.841989 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"85107988e7bc2bf594142a70d6a8ad33bc012afbf06bc5f5fb629e0b336d064c"} Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.842885 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 11:51:06 crc kubenswrapper[4763]: E1205 11:51:06.843404 4763 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:07.343389413 +0000 UTC m=+151.836104136 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.873382 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cnvbz" podStartSLOduration=132.873364462 podStartE2EDuration="2m12.873364462s" podCreationTimestamp="2025-12-05 11:48:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:06.872736538 +0000 UTC m=+151.365451261" watchObservedRunningTime="2025-12-05 11:51:06.873364462 +0000 UTC m=+151.366079185" Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.875384 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vjk5l" podStartSLOduration=131.875376736 podStartE2EDuration="2m11.875376736s" podCreationTimestamp="2025-12-05 11:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:06.842252315 +0000 UTC m=+151.334967038" watchObservedRunningTime="2025-12-05 11:51:06.875376736 +0000 UTC m=+151.368091479" Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.882302 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-djzqs" event={"ID":"84047603-03db-454f-ad69-7dee76bd4e0a","Type":"ContainerStarted","Data":"0dadfc218921b4ba9ed07e50103303b697bec3978a97c93c1827bd0df752b990"} Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.934099 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" event={"ID":"b600b871-2ca7-4ca9-ab49-82a77bf73b6a","Type":"ContainerStarted","Data":"57a1fda41b6aca3107599caac8502a683367d23210617ad68d83342fc7eddc4c"} Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.935092 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.945878 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:06 crc kubenswrapper[4763]: E1205 11:51:06.948290 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-05 11:51:07.448278794 +0000 UTC m=+151.940993517 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.967352 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-djzqs" podStartSLOduration=131.967337767 podStartE2EDuration="2m11.967337767s" podCreationTimestamp="2025-12-05 11:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:06.965629635 +0000 UTC m=+151.458344358" watchObservedRunningTime="2025-12-05 11:51:06.967337767 +0000 UTC m=+151.460052480" Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.972324 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-f52fv" event={"ID":"cea9ce07-369d-4ea5-a2ae-c77eeeaef7da","Type":"ContainerStarted","Data":"bf863c853f5bfefed63c1e5c591e25a30d9922588b729c0a045a22e4574cda3d"} Dec 05 11:51:06 crc kubenswrapper[4763]: I1205 11:51:06.972547 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-f52fv" event={"ID":"cea9ce07-369d-4ea5-a2ae-c77eeeaef7da","Type":"ContainerStarted","Data":"2df30a4270b075d2a2bde969a4703cb69eb7cbfda5b0395067f3904c05990113"} Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.023069 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hjvs8" event={"ID":"585f1b1b-1d55-4c5d-be08-5af770eec641","Type":"ContainerStarted","Data":"49dc0a2e0d7d3417c0b2d05af702c20fb07512e38a5c10b3435cb10526866af3"} Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.024037 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-hjvs8" Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.025366 4763 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hjvs8 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.025398 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-hjvs8" podUID="585f1b1b-1d55-4c5d-be08-5af770eec641" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.050787 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 11:51:07 crc kubenswrapper[4763]: E1205 11:51:07.051663 4763 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:07.551647524 +0000 UTC m=+152.044362247 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.057751 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7dzw7" event={"ID":"78139543-ef47-4ea0-bb35-9fe2bc3f99a4","Type":"ContainerStarted","Data":"4248470a969e3093cf7c8bbe468376b6dc583379d8a2ebffdacc91ec8d3e125a"} Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.112322 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-knjp7" event={"ID":"1ca4b304-6857-4cde-a8a6-9300cdb60cba","Type":"ContainerStarted","Data":"ddb320a87f6493a5174385636ca065c8a0f062d6779e6110ed44b7149cb04916"} Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.112392 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-knjp7" event={"ID":"1ca4b304-6857-4cde-a8a6-9300cdb60cba","Type":"ContainerStarted","Data":"c8a68a775893ea73bcbe01a53df4da0a2f1f9f84196e2a444aeb917edb182ce7"} Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.114063 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" podStartSLOduration=133.114038059 podStartE2EDuration="2m13.114038059s" podCreationTimestamp="2025-12-05 11:48:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:07.051285552 +0000 UTC m=+151.544000275" watchObservedRunningTime="2025-12-05 11:51:07.114038059 +0000 UTC m=+151.606752792" Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.152749 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:07 crc kubenswrapper[4763]: E1205 11:51:07.153108 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 11:51:07.653095621 +0000 UTC m=+152.145810344 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.164998 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4lwtd" event={"ID":"4f0faea4-5f01-435b-99f5-2c8bd064c6f7","Type":"ContainerStarted","Data":"bddf752bcc35dc823c880b51b511562f2f80ef26a9b8f8336c0ece6f7572a395"} Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.183865 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sptcn" event={"ID":"ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88","Type":"ContainerStarted","Data":"528270db66d5934466aecbc226025f50d5c67f644e02ab402eb88c67b09d7130"} Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.184839 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-sptcn" Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.187568 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-f52fv" podStartSLOduration=132.187556321 podStartE2EDuration="2m12.187556321s" podCreationTimestamp="2025-12-05 11:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:07.126727387 +0000 UTC m=+151.619442110" watchObservedRunningTime="2025-12-05 11:51:07.187556321 +0000 UTC m=+151.680271044" Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.204170 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-sptcn" Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.250343 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pbkjb" event={"ID":"be6eac77-1391-4e8a-80b6-50b345cbe132","Type":"ContainerStarted","Data":"b169848a313f30c94b004d93a5deac342a3e1efa167e849967e8e4288154fe11"} Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.254996 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 11:51:07 crc kubenswrapper[4763]: E1205 11:51:07.256267 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:07.75624662 +0000 UTC m=+152.248961353 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.286549 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-j5ztd" event={"ID":"0c9b5acf-ef6a-4bdd-ae32-582a80d711b5","Type":"ContainerStarted","Data":"20d1c7d84dfd49e66793b1ba21e905cd2b4a037c235cce13843acf8812543c7e"} Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.286598 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-j5ztd" event={"ID":"0c9b5acf-ef6a-4bdd-ae32-582a80d711b5","Type":"ContainerStarted","Data":"93345a15d763461a0f1426206049b716ab9c3ce7da3988a363e71d70fe78fba3"} Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.324705 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-656fj" event={"ID":"67e97ddb-bc83-42bb-abd1-139ef343a4b3","Type":"ContainerStarted","Data":"ba4ef4c06140e3c694d66111944198828a4bf871fc42a012d9a486b17021445a"} Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.324745 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-656fj" event={"ID":"67e97ddb-bc83-42bb-abd1-139ef343a4b3","Type":"ContainerStarted","Data":"4713e9a05b5efd2b53276c9b2a2d2ea75eda44a67267a865deb81636656373eb"} Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.336637 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-spmwg" event={"ID":"ad12f838-4a9f-4b3d-9a33-1ca5bd556bb7","Type":"ContainerStarted","Data":"69d3f15568072244bba3ee38ece4cd8c3b069013a08eb09ac41241913b945c08"} Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.337671 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-hjvs8" podStartSLOduration=132.337657057 podStartE2EDuration="2m12.337657057s" podCreationTimestamp="2025-12-05 11:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:07.191278937 +0000 UTC m=+151.683993660" watchObservedRunningTime="2025-12-05 11:51:07.337657057 +0000 UTC m=+151.830371780" Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.339951 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-sptcn" podStartSLOduration=132.339943973 podStartE2EDuration="2m12.339943973s" podCreationTimestamp="2025-12-05 11:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:07.334147772 +0000 UTC m=+151.826862495" watchObservedRunningTime="2025-12-05 11:51:07.339943973 +0000 UTC m=+151.832658696" Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.358359 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:07 crc kubenswrapper[4763]: E1205 11:51:07.358649 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 11:51:07.858637343 +0000 UTC m=+152.351352066 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.365085 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-85ch6" event={"ID":"0d7dd240-717f-4422-a4db-b38274db085e","Type":"ContainerStarted","Data":"772a8c6552c0388b064358d482788d1ed00177cdbfdfc6f401d235d6f52cfa2e"} Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.395410 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-78sc9" event={"ID":"bc6697b7-ff52-448b-9b29-e53eb649646d","Type":"ContainerStarted","Data":"390b51f8cfae5b627ba8864e3064e0595f68dfc20cac88a3b4834078f9a883b7"} Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.397579 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-7dzw7" podStartSLOduration=7.397562004 podStartE2EDuration="7.397562004s" podCreationTimestamp="2025-12-05 11:51:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:07.395381069 +0000 UTC m=+151.888095792" watchObservedRunningTime="2025-12-05 11:51:07.397562004 +0000 UTC m=+151.890276737" Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.397983 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-c5hd7" event={"ID":"80f84439-4d54-461d-9522-5bca4858f5d4","Type":"ContainerStarted","Data":"6ec64f79e184ba3ce46c1197ab038c89c20680d7a734f1bcdc5086e6b59d023c"} Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.461515 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 11:51:07 crc kubenswrapper[4763]: E1205 11:51:07.463345 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:07.963326472 +0000 UTC m=+152.456041205 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.537818 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-656fj" podStartSLOduration=7.537800431 podStartE2EDuration="7.537800431s" podCreationTimestamp="2025-12-05 11:51:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:07.535144343 +0000 UTC m=+152.027859066" watchObservedRunningTime="2025-12-05 11:51:07.537800431 +0000 UTC m=+152.030515164" Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.545269 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.545331 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.563641 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:07 crc kubenswrapper[4763]: E1205 11:51:07.564025 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 11:51:08.064011754 +0000 UTC m=+152.556726477 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.665946 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 11:51:07 crc kubenswrapper[4763]: E1205 11:51:07.666147 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:08.166120555 +0000 UTC m=+152.658835278 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.666568 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:07 crc kubenswrapper[4763]: E1205 11:51:07.667195 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 11:51:08.167179103 +0000 UTC m=+152.659893826 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.681374 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-j5ztd" podStartSLOduration=132.681357531 podStartE2EDuration="2m12.681357531s" podCreationTimestamp="2025-12-05 11:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:07.59509945 +0000 UTC m=+152.087814203" watchObservedRunningTime="2025-12-05 11:51:07.681357531 +0000 UTC m=+152.174072244" Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.748436 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-78sc9" podStartSLOduration=133.748418139 podStartE2EDuration="2m13.748418139s" podCreationTimestamp="2025-12-05 11:48:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:07.746645406 +0000 UTC m=+152.239360139" watchObservedRunningTime="2025-12-05 11:51:07.748418139 +0000 UTC m=+152.241132872" Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.749362 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-spmwg" podStartSLOduration=132.749355885 podStartE2EDuration="2m12.749355885s" podCreationTimestamp="2025-12-05 11:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:07.683351935 +0000 UTC m=+152.176066658" watchObservedRunningTime="2025-12-05 11:51:07.749355885 +0000 UTC m=+152.242070608" Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.750250 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-f52fv" Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.750280 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-78sc9" Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.751380 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-78sc9" Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.761647 4763 patch_prober.go:28] interesting pod/apiserver-76f77b778f-78sc9 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 11:51:07 crc kubenswrapper[4763]: [+]log ok Dec 05 11:51:07 crc kubenswrapper[4763]: [+]etcd ok Dec 05 11:51:07 crc kubenswrapper[4763]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 11:51:07 crc kubenswrapper[4763]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 11:51:07 crc kubenswrapper[4763]: [+]poststarthook/max-in-flight-filter ok Dec 05 11:51:07 crc kubenswrapper[4763]: 
[+]poststarthook/storage-object-count-tracker-hook ok
Dec 05 11:51:07 crc kubenswrapper[4763]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Dec 05 11:51:07 crc kubenswrapper[4763]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Dec 05 11:51:07 crc kubenswrapper[4763]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld
Dec 05 11:51:07 crc kubenswrapper[4763]: [+]poststarthook/project.openshift.io-projectcache ok
Dec 05 11:51:07 crc kubenswrapper[4763]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Dec 05 11:51:07 crc kubenswrapper[4763]: [+]poststarthook/openshift.io-startinformers ok
Dec 05 11:51:07 crc kubenswrapper[4763]: [+]poststarthook/openshift.io-restmapperupdater ok
Dec 05 11:51:07 crc kubenswrapper[4763]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Dec 05 11:51:07 crc kubenswrapper[4763]: livez check failed
Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.761692 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-78sc9" podUID="bc6697b7-ff52-448b-9b29-e53eb649646d" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.764828 4763 patch_prober.go:28] interesting pod/router-default-5444994796-f52fv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 05 11:51:07 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld
Dec 05 11:51:07 crc kubenswrapper[4763]: [+]process-running ok
Dec 05 11:51:07 crc kubenswrapper[4763]: healthz check failed
Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.764865 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f52fv" podUID="cea9ce07-369d-4ea5-a2ae-c77eeeaef7da" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.773073 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 11:51:07 crc kubenswrapper[4763]: E1205 11:51:07.773470 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:08.273451783 +0000 UTC m=+152.766166506 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.877382 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b"
Dec 05 11:51:07 crc kubenswrapper[4763]: E1205 11:51:07.877749 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 11:51:08.377733819 +0000 UTC m=+152.870448542 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.909182 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-djzqs"
Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.909494 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-djzqs"
Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.949935 4763 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-wndr8 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.13:6443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.950213 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" podUID="b600b871-2ca7-4ca9-ab49-82a77bf73b6a" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.13:6443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.980412 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 11:51:07 crc kubenswrapper[4763]: E1205 11:51:07.980542 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:08.480523265 +0000 UTC m=+152.973237988 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 11:51:07 crc kubenswrapper[4763]: I1205 11:51:07.980745 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b"
Dec 05 11:51:07 crc kubenswrapper[4763]: E1205 11:51:07.981007 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 11:51:08.481000028 +0000 UTC m=+152.973714741 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.081541 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 11:51:08 crc kubenswrapper[4763]: E1205 11:51:08.081772 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:08.58172879 +0000 UTC m=+153.074443523 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.081837 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b"
Dec 05 11:51:08 crc kubenswrapper[4763]: E1205 11:51:08.082187 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 11:51:08.582177593 +0000 UTC m=+153.074892316 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.183326 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 11:51:08 crc kubenswrapper[4763]: E1205 11:51:08.183488 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:08.683466379 +0000 UTC m=+153.176181112 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.183591 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b"
Dec 05 11:51:08 crc kubenswrapper[4763]: E1205 11:51:08.183966 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 11:51:08.683856371 +0000 UTC m=+153.176571094 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.284160 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 11:51:08 crc kubenswrapper[4763]: E1205 11:51:08.284376 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:08.784330311 +0000 UTC m=+153.277045034 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.284502 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b"
Dec 05 11:51:08 crc kubenswrapper[4763]: E1205 11:51:08.284811 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 11:51:08.784792245 +0000 UTC m=+153.277506968 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.386252 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 11:51:08 crc kubenswrapper[4763]: E1205 11:51:08.386431 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:08.886402173 +0000 UTC m=+153.379116886 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.386537 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b"
Dec 05 11:51:08 crc kubenswrapper[4763]: E1205 11:51:08.387064 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 11:51:08.887057017 +0000 UTC m=+153.379771740 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.405260 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-djzqs"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.412732 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hjvs8" event={"ID":"585f1b1b-1d55-4c5d-be08-5af770eec641","Type":"ContainerStarted","Data":"89d680a3a26455a407ccf6650510bff4201a6544c9a577cc6bbfcbfdb05f6af3"}
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.413423 4763 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hjvs8 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body=
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.413467 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-hjvs8" podUID="585f1b1b-1d55-4c5d-be08-5af770eec641" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.416413 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ww8jh" event={"ID":"65140763-5309-4eb6-a4e7-090b57b27744","Type":"ContainerStarted","Data":"c54ac80c4414e7fabbdd4dbd7e57319f37d15fdc0c1667f0da934081650cf7c6"}
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.421053 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wkvgk" event={"ID":"349a6e10-88df-46fa-b81d-439574540d28","Type":"ContainerStarted","Data":"a774159d8f04f272c3d02a86a1dc7d1235208cf64925cd27e4eb6dd5db2d4c6f"}
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.421810 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wkvgk"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.423558 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5949w" event={"ID":"eee26d7c-caba-4663-a3f0-924184123ae2","Type":"ContainerStarted","Data":"188a20cc65d9d0fbdfb4b7d07e0c76a5a889d1a058a26e0a085be0127f4e5538"}
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.430039 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-knjp7" event={"ID":"1ca4b304-6857-4cde-a8a6-9300cdb60cba","Type":"ContainerStarted","Data":"092bd5e79fb8724d2644be891ee12cf8a5a4383e9b847b2788cd8e799baa816e"}
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.430581 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-knjp7"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.440739 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z2mgd" event={"ID":"71bd551d-eb3c-4c33-89e0-fb957dfa31dc","Type":"ContainerStarted","Data":"f89c50fa2be23279f20ce5fd6e9619cb8556fdaeab046f2089597b2a9f9a1e14"}
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.442193 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-c5hd7" podStartSLOduration=134.442172621 podStartE2EDuration="2m14.442172621s" podCreationTimestamp="2025-12-05 11:48:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:07.787206849 +0000 UTC m=+152.279921592" watchObservedRunningTime="2025-12-05 11:51:08.442172621 +0000 UTC m=+152.934887354"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.442313 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wmpg" event={"ID":"f08e9226-6ec5-4854-9780-0b5e2d8a7ded","Type":"ContainerStarted","Data":"c65f8bb0a1a83515452285e6801b878d3dd44939a0584ebf95a228fd31848752"}
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.442625 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lzlbl"]
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.443835 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lzlbl"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.444630 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-85ch6" event={"ID":"0d7dd240-717f-4422-a4db-b38274db085e","Type":"ContainerStarted","Data":"fcf692d7d9d40cb70ed4f21b4c25d9fa85fee0d8463c1b0d21efac2ff9b1fab4"}
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.447107 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zssnz" event={"ID":"5fbca11a-128a-4ea3-b5b9-3b9f596b887d","Type":"ContainerStarted","Data":"d8682914f60e6b0412c5afa4bf87f1984f92e704294c7b3a13d9c59c449f08cd"}
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.448994 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"8cf4c03970028dde3680575c3b2edd60d87c6f81397b441416b44d3379b2dd62"}
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.449390 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.450510 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4lwtd" event={"ID":"4f0faea4-5f01-435b-99f5-2c8bd064c6f7","Type":"ContainerStarted","Data":"c726c0d2d7fb35d05f770aa35bf48fdec11ea6ead22e57c2ac8cae751d586d64"}
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.450552 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4lwtd" event={"ID":"4f0faea4-5f01-435b-99f5-2c8bd064c6f7","Type":"ContainerStarted","Data":"a6ce289a6da4bfb90025dff6f6c37245082ce791667cdb9da601dbee4b3ad55d"}
Dec 05 11:51:08 crc kubenswrapper[4763]: W1205 11:51:08.451005 4763 reflector.go:561] object-"openshift-marketplace"/"community-operators-dockercfg-dmngl": failed to list *v1.Secret: secrets "community-operators-dockercfg-dmngl" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-marketplace": no relationship found between node 'crc' and this object
Dec 05 11:51:08 crc kubenswrapper[4763]: E1205 11:51:08.451032 4763 reflector.go:158] "Unhandled Error" err="object-\"openshift-marketplace\"/\"community-operators-dockercfg-dmngl\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"community-operators-dockercfg-dmngl\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-marketplace\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.454795 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pbkjb" event={"ID":"be6eac77-1391-4e8a-80b6-50b345cbe132","Type":"ContainerStarted","Data":"5c24fd0d30f1a56d77576593741e7866aaf3a3b4258e93fb0a009cae5610b8bc"}
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.454833 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pbkjb" event={"ID":"be6eac77-1391-4e8a-80b6-50b345cbe132","Type":"ContainerStarted","Data":"f643b4f21b887e259aba8b78d3f0f923b778abafeecd28a860cd93fa65a17c2d"}
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.457099 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"d4e6488a2625740369a04c206457559984d231cbbc12ce7c77234096ea155568"}
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.462905 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsr7t" event={"ID":"cc131485-2c98-405f-87de-1669634ff801","Type":"ContainerStarted","Data":"4788af2ca79a5d96a6429f4f7b0e4365135de9324cd11620e2e75085520c4c2e"}
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.464977 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vzzqs" event={"ID":"10e01a12-35b2-4abc-93b4-3d3ac7ce61ed","Type":"ContainerStarted","Data":"0c900e220f30e06422522c6390d7fbb029541e63496f47c1972817d04f143c9d"}
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.465020 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vzzqs" event={"ID":"10e01a12-35b2-4abc-93b4-3d3ac7ce61ed","Type":"ContainerStarted","Data":"3c56343ba6c6a0fb7b4897ee4680809dd6bbe643f316e02f02261dd37d2454ef"}
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.465369 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vzzqs"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.468399 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7gb2n" event={"ID":"5a298178-bf52-4673-a49e-5867e4bc8267","Type":"ContainerStarted","Data":"16c282a94cec62daf2de7c663ef6a6f30cf6708ecdd969035f281c093d1230ca"}
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.468424 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7gb2n" event={"ID":"5a298178-bf52-4673-a49e-5867e4bc8267","Type":"ContainerStarted","Data":"fb294090b7df563949a18c2f95d8cc5e598c44973ca7a367f03e66d6dc157f92"}
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.471418 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-46l22" event={"ID":"24bf87f0-b1a2-47c8-9800-862e15c3f4cf","Type":"ContainerStarted","Data":"4e1984424a4489a7c195f8a4df923fea27a41d9674ac8dcca618d7b4c6e2ab7b"}
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.471441 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-46l22" event={"ID":"24bf87f0-b1a2-47c8-9800-862e15c3f4cf","Type":"ContainerStarted","Data":"2c05bd199208bb97ae4e90b6460acd4d9a049d1b42d38445ee9c101e662b345e"}
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.477384 4763 patch_prober.go:28] interesting pod/downloads-7954f5f757-hv45j container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body=
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.477420 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hv45j" podUID="a5c4e5af-4c88-4770-b4a7-3eeded875431" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.492063 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.492679 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea6370e4-7a14-43a3-8ab0-c966df3c3e74-catalog-content\") pod \"community-operators-lzlbl\" (UID: \"ea6370e4-7a14-43a3-8ab0-c966df3c3e74\") " pod="openshift-marketplace/community-operators-lzlbl"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.492967 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea6370e4-7a14-43a3-8ab0-c966df3c3e74-utilities\") pod \"community-operators-lzlbl\" (UID: \"ea6370e4-7a14-43a3-8ab0-c966df3c3e74\") " pod="openshift-marketplace/community-operators-lzlbl"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.493239 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws9t9\" (UniqueName: \"kubernetes.io/projected/ea6370e4-7a14-43a3-8ab0-c966df3c3e74-kube-api-access-ws9t9\") pod \"community-operators-lzlbl\" (UID: \"ea6370e4-7a14-43a3-8ab0-c966df3c3e74\") " pod="openshift-marketplace/community-operators-lzlbl"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.494449 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lzlbl"]
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.496184 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-wndr8"
Dec 05 11:51:08 crc kubenswrapper[4763]: E1205 11:51:08.496982 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:08.996936213 +0000 UTC m=+153.489650986 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.506172 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z52fw"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.524700 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-djzqs"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.572726 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.573303 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.581953 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.587785 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-85ch6" podStartSLOduration=133.587742005 podStartE2EDuration="2m13.587742005s" podCreationTimestamp="2025-12-05 11:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:08.581660353 +0000 UTC m=+153.074375086" watchObservedRunningTime="2025-12-05 11:51:08.587742005 +0000 UTC m=+153.080456738"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.590289 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cnvbz"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.594204 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b2477207-9a8d-489a-b3ed-e8d3e4ab3f6e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b2477207-9a8d-489a-b3ed-e8d3e4ab3f6e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.594341 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea6370e4-7a14-43a3-8ab0-c966df3c3e74-utilities\") pod \"community-operators-lzlbl\" (UID: \"ea6370e4-7a14-43a3-8ab0-c966df3c3e74\") " pod="openshift-marketplace/community-operators-lzlbl"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.594463 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws9t9\" (UniqueName: \"kubernetes.io/projected/ea6370e4-7a14-43a3-8ab0-c966df3c3e74-kube-api-access-ws9t9\") pod \"community-operators-lzlbl\" (UID: \"ea6370e4-7a14-43a3-8ab0-c966df3c3e74\") " pod="openshift-marketplace/community-operators-lzlbl"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.594583 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.594666 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b2477207-9a8d-489a-b3ed-e8d3e4ab3f6e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b2477207-9a8d-489a-b3ed-e8d3e4ab3f6e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.594782 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea6370e4-7a14-43a3-8ab0-c966df3c3e74-catalog-content\") pod \"community-operators-lzlbl\" (UID: \"ea6370e4-7a14-43a3-8ab0-c966df3c3e74\") " pod="openshift-marketplace/community-operators-lzlbl"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.594846 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea6370e4-7a14-43a3-8ab0-c966df3c3e74-utilities\") pod \"community-operators-lzlbl\" (UID: \"ea6370e4-7a14-43a3-8ab0-c966df3c3e74\") " pod="openshift-marketplace/community-operators-lzlbl"
Dec 05 11:51:08 crc kubenswrapper[4763]: E1205 11:51:08.594993 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 11:51:09.094981186 +0000 UTC m=+153.587695909 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.595095 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea6370e4-7a14-43a3-8ab0-c966df3c3e74-catalog-content\") pod \"community-operators-lzlbl\" (UID: \"ea6370e4-7a14-43a3-8ab0-c966df3c3e74\") " pod="openshift-marketplace/community-operators-lzlbl"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.603110 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.604713 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.651436 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lpk7r"]
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.652521 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lpk7r"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.678165 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws9t9\" (UniqueName: \"kubernetes.io/projected/ea6370e4-7a14-43a3-8ab0-c966df3c3e74-kube-api-access-ws9t9\") pod \"community-operators-lzlbl\" (UID: \"ea6370e4-7a14-43a3-8ab0-c966df3c3e74\") " pod="openshift-marketplace/community-operators-lzlbl"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.705031 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lpk7r"]
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.705448 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.705669 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b2477207-9a8d-489a-b3ed-e8d3e4ab3f6e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b2477207-9a8d-489a-b3ed-e8d3e4ab3f6e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.709583 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b2477207-9a8d-489a-b3ed-e8d3e4ab3f6e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b2477207-9a8d-489a-b3ed-e8d3e4ab3f6e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.709849 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b2477207-9a8d-489a-b3ed-e8d3e4ab3f6e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b2477207-9a8d-489a-b3ed-e8d3e4ab3f6e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.717573 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Dec 05 11:51:08 crc kubenswrapper[4763]: E1205 11:51:08.723851 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:09.223812883 +0000 UTC m=+153.716527646 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.746984 4763 patch_prober.go:28] interesting pod/router-default-5444994796-f52fv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 05 11:51:08 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld
Dec 05 11:51:08 crc kubenswrapper[4763]: [+]process-running ok
Dec 05 11:51:08 crc kubenswrapper[4763]: healthz check failed
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.747046 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f52fv" podUID="cea9ce07-369d-4ea5-a2ae-c77eeeaef7da" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.790452 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b2477207-9a8d-489a-b3ed-e8d3e4ab3f6e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b2477207-9a8d-489a-b3ed-e8d3e4ab3f6e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.802646 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2hjj6"]
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.803598 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2hjj6"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.819057 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e37aacbd-2b65-4fe9-9874-38a7c585a300-utilities\") pod \"certified-operators-lpk7r\" (UID: \"e37aacbd-2b65-4fe9-9874-38a7c585a300\") " pod="openshift-marketplace/certified-operators-lpk7r"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.819105 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8212fb62-9829-4198-8833-0695b17d2a5d-utilities\") pod \"community-operators-2hjj6\" (UID: \"8212fb62-9829-4198-8833-0695b17d2a5d\") " pod="openshift-marketplace/community-operators-2hjj6"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.819138 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwgtm\" (UniqueName: \"kubernetes.io/projected/e37aacbd-2b65-4fe9-9874-38a7c585a300-kube-api-access-wwgtm\") pod \"certified-operators-lpk7r\" (UID: \"e37aacbd-2b65-4fe9-9874-38a7c585a300\") " pod="openshift-marketplace/certified-operators-lpk7r"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.819172 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8212fb62-9829-4198-8833-0695b17d2a5d-catalog-content\") pod \"community-operators-2hjj6\" (UID: \"8212fb62-9829-4198-8833-0695b17d2a5d\") " pod="openshift-marketplace/community-operators-2hjj6"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.819194 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbc58\" (UniqueName: \"kubernetes.io/projected/8212fb62-9829-4198-8833-0695b17d2a5d-kube-api-access-hbc58\") pod \"community-operators-2hjj6\" (UID: \"8212fb62-9829-4198-8833-0695b17d2a5d\") " pod="openshift-marketplace/community-operators-2hjj6"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.819217 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.819233 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e37aacbd-2b65-4fe9-9874-38a7c585a300-catalog-content\") pod \"certified-operators-lpk7r\" (UID: \"e37aacbd-2b65-4fe9-9874-38a7c585a300\") " pod="openshift-marketplace/certified-operators-lpk7r"
Dec 05 11:51:08 crc kubenswrapper[4763]: E1205 11:51:08.819504 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 11:51:09.31949321 +0000 UTC m=+153.812207923 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.864534 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsr7t" podStartSLOduration=133.864509934 podStartE2EDuration="2m13.864509934s" podCreationTimestamp="2025-12-05 11:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:08.847039752 +0000 UTC m=+153.339754495" watchObservedRunningTime="2025-12-05 11:51:08.864509934 +0000 UTC m=+153.357224657"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.866974 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2hjj6"]
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.898166 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.944670 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.944871 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8212fb62-9829-4198-8833-0695b17d2a5d-catalog-content\") pod \"community-operators-2hjj6\" (UID: \"8212fb62-9829-4198-8833-0695b17d2a5d\") " pod="openshift-marketplace/community-operators-2hjj6"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.945023 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbc58\" (UniqueName: \"kubernetes.io/projected/8212fb62-9829-4198-8833-0695b17d2a5d-kube-api-access-hbc58\") pod \"community-operators-2hjj6\" (UID: \"8212fb62-9829-4198-8833-0695b17d2a5d\") " pod="openshift-marketplace/community-operators-2hjj6"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.945055 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e37aacbd-2b65-4fe9-9874-38a7c585a300-catalog-content\") pod \"certified-operators-lpk7r\" (UID: \"e37aacbd-2b65-4fe9-9874-38a7c585a300\") " pod="openshift-marketplace/certified-operators-lpk7r"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.945107 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e37aacbd-2b65-4fe9-9874-38a7c585a300-utilities\") pod \"certified-operators-lpk7r\" (UID: \"e37aacbd-2b65-4fe9-9874-38a7c585a300\") " pod="openshift-marketplace/certified-operators-lpk7r"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.945139 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8212fb62-9829-4198-8833-0695b17d2a5d-utilities\") pod \"community-operators-2hjj6\" (UID: \"8212fb62-9829-4198-8833-0695b17d2a5d\") " pod="openshift-marketplace/community-operators-2hjj6"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.945188 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwgtm\" (UniqueName: \"kubernetes.io/projected/e37aacbd-2b65-4fe9-9874-38a7c585a300-kube-api-access-wwgtm\") pod \"certified-operators-lpk7r\" (UID: \"e37aacbd-2b65-4fe9-9874-38a7c585a300\") " pod="openshift-marketplace/certified-operators-lpk7r"
Dec 05 11:51:08 crc kubenswrapper[4763]: E1205 11:51:08.945503 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:09.445488518 +0000 UTC m=+153.938203241 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.945855 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8212fb62-9829-4198-8833-0695b17d2a5d-catalog-content\") pod \"community-operators-2hjj6\" (UID: \"8212fb62-9829-4198-8833-0695b17d2a5d\") " pod="openshift-marketplace/community-operators-2hjj6"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.946054 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e37aacbd-2b65-4fe9-9874-38a7c585a300-utilities\") pod \"certified-operators-lpk7r\" (UID: \"e37aacbd-2b65-4fe9-9874-38a7c585a300\") " pod="openshift-marketplace/certified-operators-lpk7r"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.946122 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8212fb62-9829-4198-8833-0695b17d2a5d-utilities\") pod \"community-operators-2hjj6\" (UID: \"8212fb62-9829-4198-8833-0695b17d2a5d\") " pod="openshift-marketplace/community-operators-2hjj6"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.946292 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e37aacbd-2b65-4fe9-9874-38a7c585a300-catalog-content\") pod \"certified-operators-lpk7r\" (UID: \"e37aacbd-2b65-4fe9-9874-38a7c585a300\") " pod="openshift-marketplace/certified-operators-lpk7r"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.956517 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wkvgk" podStartSLOduration=133.956499744 podStartE2EDuration="2m13.956499744s" podCreationTimestamp="2025-12-05 11:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:08.954139558 +0000 UTC m=+153.446854281" watchObservedRunningTime="2025-12-05 11:51:08.956499744 +0000 UTC m=+153.449214467"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.977290 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwgtm\" (UniqueName: \"kubernetes.io/projected/e37aacbd-2b65-4fe9-9874-38a7c585a300-kube-api-access-wwgtm\") pod \"certified-operators-lpk7r\" (UID: \"e37aacbd-2b65-4fe9-9874-38a7c585a300\") " pod="openshift-marketplace/certified-operators-lpk7r"
Dec 05 11:51:08 crc kubenswrapper[4763]: I1205 11:51:08.991682 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-7gb2n" podStartSLOduration=133.991645739 podStartE2EDuration="2m13.991645739s" podCreationTimestamp="2025-12-05 11:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:08.989264163 +0000 UTC m=+153.481978896" watchObservedRunningTime="2025-12-05 11:51:08.991645739 +0000 UTC m=+153.484360462"
Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.004486 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lpk7r"
Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.010059 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbc58\" (UniqueName: \"kubernetes.io/projected/8212fb62-9829-4198-8833-0695b17d2a5d-kube-api-access-hbc58\") pod \"community-operators-2hjj6\" (UID: \"8212fb62-9829-4198-8833-0695b17d2a5d\") " pod="openshift-marketplace/community-operators-2hjj6"
Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.013021 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tqrj4"]
Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.014409 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tqrj4"
Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.028497 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tqrj4"]
Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.055142 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b"
Dec 05 11:51:09 crc kubenswrapper[4763]: E1205 11:51:09.055425 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 11:51:09.555413374 +0000 UTC m=+154.048128097 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.079572 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z2mgd" podStartSLOduration=134.079558062 podStartE2EDuration="2m14.079558062s" podCreationTimestamp="2025-12-05 11:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:09.078126942 +0000 UTC m=+153.570841665" watchObservedRunningTime="2025-12-05 11:51:09.079558062 +0000 UTC m=+153.572272785"
Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.135041 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pbkjb" podStartSLOduration=135.135025778 podStartE2EDuration="2m15.135025778s" podCreationTimestamp="2025-12-05 11:48:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:09.108140571 +0000 UTC m=+153.600855294" watchObservedRunningTime="2025-12-05 11:51:09.135025778 +0000 UTC m=+153.627740501"
Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.156660 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.156858 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m52p\" (UniqueName: \"kubernetes.io/projected/bb34ace9-e703-4f05-aa04-eebc1d97e096-kube-api-access-5m52p\") pod \"certified-operators-tqrj4\" (UID: \"bb34ace9-e703-4f05-aa04-eebc1d97e096\") " pod="openshift-marketplace/certified-operators-tqrj4"
Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.156901 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb34ace9-e703-4f05-aa04-eebc1d97e096-utilities\") pod \"certified-operators-tqrj4\" (UID: \"bb34ace9-e703-4f05-aa04-eebc1d97e096\") " pod="openshift-marketplace/certified-operators-tqrj4"
Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.156945 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb34ace9-e703-4f05-aa04-eebc1d97e096-catalog-content\") pod \"certified-operators-tqrj4\" (UID: \"bb34ace9-e703-4f05-aa04-eebc1d97e096\") " pod="openshift-marketplace/certified-operators-tqrj4"
Dec 05 11:51:09 crc kubenswrapper[4763]: E1205 11:51:09.157050 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:09.657036382 +0000 UTC m=+154.149751105 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.195431 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4lwtd" podStartSLOduration=134.195415929 podStartE2EDuration="2m14.195415929s" podCreationTimestamp="2025-12-05 11:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:09.137286884 +0000 UTC m=+153.630001607" watchObservedRunningTime="2025-12-05 11:51:09.195415929 +0000 UTC m=+153.688130652"
Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.195826 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zssnz" podStartSLOduration=134.195821402 podStartE2EDuration="2m14.195821402s" podCreationTimestamp="2025-12-05 11:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:09.193115313 +0000 UTC m=+153.685830026" watchObservedRunningTime="2025-12-05 11:51:09.195821402 +0000 UTC m=+153.688536125"
Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.258325 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-46l22" podStartSLOduration=134.258308887 podStartE2EDuration="2m14.258308887s" podCreationTimestamp="2025-12-05 11:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:09.217326492 +0000 UTC m=+153.710041225" watchObservedRunningTime="2025-12-05 11:51:09.258308887 +0000 UTC m=+153.751023610"
Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.259668 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb34ace9-e703-4f05-aa04-eebc1d97e096-utilities\") pod \"certified-operators-tqrj4\" (UID: \"bb34ace9-e703-4f05-aa04-eebc1d97e096\") " pod="openshift-marketplace/certified-operators-tqrj4"
Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.259721 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb34ace9-e703-4f05-aa04-eebc1d97e096-catalog-content\") pod \"certified-operators-tqrj4\" (UID: \"bb34ace9-e703-4f05-aa04-eebc1d97e096\") " pod="openshift-marketplace/certified-operators-tqrj4"
Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.259749 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b"
Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.262750 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m52p\" (UniqueName: \"kubernetes.io/projected/bb34ace9-e703-4f05-aa04-eebc1d97e096-kube-api-access-5m52p\") pod \"certified-operators-tqrj4\" (UID: \"bb34ace9-e703-4f05-aa04-eebc1d97e096\") " pod="openshift-marketplace/certified-operators-tqrj4"
Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.263181 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb34ace9-e703-4f05-aa04-eebc1d97e096-utilities\") pod \"certified-operators-tqrj4\" (UID: \"bb34ace9-e703-4f05-aa04-eebc1d97e096\") " pod="openshift-marketplace/certified-operators-tqrj4"
Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.263385 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb34ace9-e703-4f05-aa04-eebc1d97e096-catalog-content\") pod \"certified-operators-tqrj4\" (UID: \"bb34ace9-e703-4f05-aa04-eebc1d97e096\") " pod="openshift-marketplace/certified-operators-tqrj4"
Dec 05 11:51:09 crc kubenswrapper[4763]: E1205 11:51:09.263614 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 11:51:09.763603194 +0000 UTC m=+154.256317917 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.292779 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vzzqs" podStartSLOduration=134.292748677 podStartE2EDuration="2m14.292748677s" podCreationTimestamp="2025-12-05 11:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:09.259065702 +0000 UTC m=+153.751780425" watchObservedRunningTime="2025-12-05 11:51:09.292748677 +0000 UTC m=+153.785463400"
Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.309679 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m52p\" (UniqueName: \"kubernetes.io/projected/bb34ace9-e703-4f05-aa04-eebc1d97e096-kube-api-access-5m52p\") pod \"certified-operators-tqrj4\" (UID: \"bb34ace9-e703-4f05-aa04-eebc1d97e096\") " pod="openshift-marketplace/certified-operators-tqrj4"
Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.333750 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tqrj4"
Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.365196 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 11:51:09 crc kubenswrapper[4763]: E1205 11:51:09.365526 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:09.865509574 +0000 UTC m=+154.358224297 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.371392 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ww8jh" podStartSLOduration=134.371372005 podStartE2EDuration="2m14.371372005s" podCreationTimestamp="2025-12-05 11:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:09.35633733 +0000 UTC m=+153.849052063" watchObservedRunningTime="2025-12-05 11:51:09.371372005 +0000 UTC m=+153.864086728"
Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.391155 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wmpg" podStartSLOduration=134.391140093 podStartE2EDuration="2m14.391140093s" podCreationTimestamp="2025-12-05 11:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:09.389222019 +0000 UTC m=+153.881936742" watchObservedRunningTime="2025-12-05 11:51:09.391140093 +0000 UTC m=+153.883854806"
Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.427951 4763 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-wkvgk container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.428008 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wkvgk" podUID="349a6e10-88df-46fa-b81d-439574540d28" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.37:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.465204 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openshift-dns/dns-default-knjp7" podStartSLOduration=9.465186048 podStartE2EDuration="9.465186048s" podCreationTimestamp="2025-12-05 11:51:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:09.433451377 +0000 UTC m=+153.926166100" watchObservedRunningTime="2025-12-05 11:51:09.465186048 +0000 UTC m=+153.957900771" Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.467507 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:09 crc kubenswrapper[4763]: E1205 11:51:09.467860 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 11:51:09.967847877 +0000 UTC m=+154.460562600 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.490055 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.513859 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lzlbl" Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.514390 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2hjj6" Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.595224 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.603162 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 11:51:09 crc kubenswrapper[4763]: E1205 11:51:09.605267 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:10.105248884 +0000 UTC m=+154.597963607 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.637041 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5949w" event={"ID":"eee26d7c-caba-4663-a3f0-924184123ae2","Type":"ContainerStarted","Data":"e1498747f7aea42bee127e0de385ce1c4ef13b6de4f96f6ee268299172ed94be"} Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.637077 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5949w" event={"ID":"eee26d7c-caba-4663-a3f0-924184123ae2","Type":"ContainerStarted","Data":"0a24b0a70e0e454c3593889f76f463265d52d5bd58ee97176087aa89c2f95663"} Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.653843 4763 patch_prober.go:28] interesting pod/downloads-7954f5f757-hv45j container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.667583 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hv45j" podUID="a5c4e5af-4c88-4770-b4a7-3eeded875431" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.689111 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lpk7r"] Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.711800 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:09 crc kubenswrapper[4763]: E1205 11:51:09.733594 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 11:51:10.233577288 +0000 UTC m=+154.726292011 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.741061 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-hjvs8" Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.743002 4763 patch_prober.go:28] interesting pod/router-default-5444994796-f52fv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 11:51:09 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld Dec 05 11:51:09 crc kubenswrapper[4763]: [+]process-running ok Dec 05 11:51:09 crc kubenswrapper[4763]: healthz check failed Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.743043 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f52fv" podUID="cea9ce07-369d-4ea5-a2ae-c77eeeaef7da" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 11:51:09 crc kubenswrapper[4763]: E1205 11:51:09.823041 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:10.323016382 +0000 UTC m=+154.815731105 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.824063 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.824330 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:09 crc kubenswrapper[4763]: E1205 11:51:09.824640 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 11:51:10.324629093 +0000 UTC m=+154.817343816 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.881708 4763 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.925633 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 11:51:09 crc kubenswrapper[4763]: E1205 11:51:09.925806 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:10.425778137 +0000 UTC m=+154.918492860 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.925850 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:09 crc kubenswrapper[4763]: E1205 11:51:09.926222 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 11:51:10.426216 +0000 UTC m=+154.918930723 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpf4b" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:09 crc kubenswrapper[4763]: I1205 11:51:09.950635 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tqrj4"] Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.006718 4763 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-05T11:51:09.881727541Z","Handler":null,"Name":""} Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.029924 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 11:51:10 crc kubenswrapper[4763]: E1205 11:51:10.030261 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 11:51:10.530245985 +0000 UTC m=+155.022960708 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.034620 4763 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.034649 4763 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.131972 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.144609 4763 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
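
The entries above capture a startup race: kubelet retries MountVolume/UnmountVolume for pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 on a 500ms backoff ("No retries permitted until ...") because kubevirt.io.hostpath-provisioner has not yet registered, and the loop clears as soon as plugin_watcher picks up the registration socket and csi_plugin.go validates the driver at 11:51:10.034. Below is a minimal Python sketch of how one might measure that failure window from a dump like this one; the two marker strings are copied verbatim from the log, while the script itself and the input path are illustrative, not part of kubelet.

#!/usr/bin/env python3
"""Minimal sketch: how long did CSI operations fail before the
kubevirt.io.hostpath-provisioner driver registered?

Assumes a plain-text kubelet journal dump like the capture above
(entries may be run together on one line); the input path is a
placeholder. Marker strings are verbatim from the log; everything
else is illustrative."""
import re
import sys
from datetime import datetime

FAIL = "not found in the list of registered CSI drivers"
REG = "Register new plugin with name: kubevirt.io.hostpath-provisioner"
# klog prefix inside each journal entry, e.g. "I1205 11:51:10.034649"
TS = re.compile(r"[IEW]\d{4} (\d{2}:\d{2}:\d{2}\.\d+)")

text = open(sys.argv[1]).read()
# Split the dump into journal entries on the "Dec 05 HH:MM:SS crc"
# header; this also works when entries are run together on one line.
entries = re.split(r"(?=Dec 05 \d{2}:\d{2}:\d{2} crc )", text)

first_fail = registered = None
failures = 0
for entry in entries:
    m = TS.search(entry)
    if not m:
        continue  # continuation fragments carry no klog timestamp
    ts = datetime.strptime(m.group(1), "%H:%M:%S.%f")
    if FAIL in entry:
        failures += 1
        first_fail = first_fail or ts
    elif REG in entry and registered is None:
        registered = ts

if first_fail and registered:
    window = (registered - first_fail).total_seconds()
    print(f"{failures} failed CSI ops over {window:.1f}s before registration")

On this excerpt it would report a window just under a second (first timed failure at 11:51:09.157, registration at 11:51:10.034), which matches the MountDevice success logged immediately below.
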
Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.144645 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.158105 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wkvgk" Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.190898 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpf4b\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.219023 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.219918 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2hjj6"] Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.233317 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 11:51:10 crc kubenswrapper[4763]: W1205 11:51:10.253573 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8212fb62_9829_4198_8833_0695b17d2a5d.slice/crio-767663780b9e0dececfe16f243bf558d7aa9be56db4728787b39f90e1253ae68 WatchSource:0}: Error finding container 767663780b9e0dececfe16f243bf558d7aa9be56db4728787b39f90e1253ae68: Status 404 returned error can't find the container with id 767663780b9e0dececfe16f243bf558d7aa9be56db4728787b39f90e1253ae68 Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.259235 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.379240 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rtp7g"] Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.380429 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rtp7g" Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.390108 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.392314 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rtp7g"] Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.435928 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hh56\" (UniqueName: \"kubernetes.io/projected/84b15a6f-ad11-4681-be03-86c7a7f84320-kube-api-access-4hh56\") pod \"redhat-marketplace-rtp7g\" (UID: \"84b15a6f-ad11-4681-be03-86c7a7f84320\") " pod="openshift-marketplace/redhat-marketplace-rtp7g" Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.435983 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84b15a6f-ad11-4681-be03-86c7a7f84320-utilities\") pod \"redhat-marketplace-rtp7g\" (UID: \"84b15a6f-ad11-4681-be03-86c7a7f84320\") " pod="openshift-marketplace/redhat-marketplace-rtp7g" Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.436074 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84b15a6f-ad11-4681-be03-86c7a7f84320-catalog-content\") pod \"redhat-marketplace-rtp7g\" (UID: \"84b15a6f-ad11-4681-be03-86c7a7f84320\") " pod="openshift-marketplace/redhat-marketplace-rtp7g" Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.459430 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lzlbl"] Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.534285 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dpf4b"] Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.536697 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hh56\" (UniqueName: \"kubernetes.io/projected/84b15a6f-ad11-4681-be03-86c7a7f84320-kube-api-access-4hh56\") pod \"redhat-marketplace-rtp7g\" (UID: \"84b15a6f-ad11-4681-be03-86c7a7f84320\") " pod="openshift-marketplace/redhat-marketplace-rtp7g" Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.536739 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84b15a6f-ad11-4681-be03-86c7a7f84320-utilities\") pod \"redhat-marketplace-rtp7g\" (UID: \"84b15a6f-ad11-4681-be03-86c7a7f84320\") " pod="openshift-marketplace/redhat-marketplace-rtp7g" Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.536822 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84b15a6f-ad11-4681-be03-86c7a7f84320-catalog-content\") pod \"redhat-marketplace-rtp7g\" (UID: \"84b15a6f-ad11-4681-be03-86c7a7f84320\") " pod="openshift-marketplace/redhat-marketplace-rtp7g" Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.537381 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84b15a6f-ad11-4681-be03-86c7a7f84320-catalog-content\") pod \"redhat-marketplace-rtp7g\" (UID: \"84b15a6f-ad11-4681-be03-86c7a7f84320\") 
" pod="openshift-marketplace/redhat-marketplace-rtp7g" Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.537425 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84b15a6f-ad11-4681-be03-86c7a7f84320-utilities\") pod \"redhat-marketplace-rtp7g\" (UID: \"84b15a6f-ad11-4681-be03-86c7a7f84320\") " pod="openshift-marketplace/redhat-marketplace-rtp7g" Dec 05 11:51:10 crc kubenswrapper[4763]: W1205 11:51:10.539482 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa31a254_af8d_4f9f_b22d_1844d7d60382.slice/crio-ca2e55b218df9765469d149a47aa254308347bf936552dab4e3d577132d0d7c9 WatchSource:0}: Error finding container ca2e55b218df9765469d149a47aa254308347bf936552dab4e3d577132d0d7c9: Status 404 returned error can't find the container with id ca2e55b218df9765469d149a47aa254308347bf936552dab4e3d577132d0d7c9 Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.557173 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hh56\" (UniqueName: \"kubernetes.io/projected/84b15a6f-ad11-4681-be03-86c7a7f84320-kube-api-access-4hh56\") pod \"redhat-marketplace-rtp7g\" (UID: \"84b15a6f-ad11-4681-be03-86c7a7f84320\") " pod="openshift-marketplace/redhat-marketplace-rtp7g" Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.641847 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" event={"ID":"aa31a254-af8d-4f9f-b22d-1844d7d60382","Type":"ContainerStarted","Data":"ca2e55b218df9765469d149a47aa254308347bf936552dab4e3d577132d0d7c9"} Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.643253 4763 generic.go:334] "Generic (PLEG): container finished" podID="e37aacbd-2b65-4fe9-9874-38a7c585a300" containerID="f72f862bc5ed1aa2f76ab167b1f0deeabfb791453039b23541d066ceb7a55d7a" exitCode=0 Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.643309 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lpk7r" event={"ID":"e37aacbd-2b65-4fe9-9874-38a7c585a300","Type":"ContainerDied","Data":"f72f862bc5ed1aa2f76ab167b1f0deeabfb791453039b23541d066ceb7a55d7a"} Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.643324 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lpk7r" event={"ID":"e37aacbd-2b65-4fe9-9874-38a7c585a300","Type":"ContainerStarted","Data":"a68a5139263f291fb28765d172801eb4bd1759dcf148f0ae7a665a9ad6801913"} Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.644917 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.646964 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5949w" event={"ID":"eee26d7c-caba-4663-a3f0-924184123ae2","Type":"ContainerStarted","Data":"5e891509356fb55277dddccc56bf25fafeb8d0499d0fce638aeb7f12b03ccbf9"} Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.648472 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2hjj6" event={"ID":"8212fb62-9829-4198-8833-0695b17d2a5d","Type":"ContainerStarted","Data":"767663780b9e0dececfe16f243bf558d7aa9be56db4728787b39f90e1253ae68"} Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.649480 4763 generic.go:334] "Generic (PLEG): container finished" 
podID="bb34ace9-e703-4f05-aa04-eebc1d97e096" containerID="84d19caa70e9a0a3d0d884c480db560bea76158425ec17e488322ed922edc35c" exitCode=0 Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.649515 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tqrj4" event={"ID":"bb34ace9-e703-4f05-aa04-eebc1d97e096","Type":"ContainerDied","Data":"84d19caa70e9a0a3d0d884c480db560bea76158425ec17e488322ed922edc35c"} Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.649530 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tqrj4" event={"ID":"bb34ace9-e703-4f05-aa04-eebc1d97e096","Type":"ContainerStarted","Data":"869b6957bfbd33d3a34e683316227001bea6c14cbb71390e05daac30e716699c"} Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.653152 4763 generic.go:334] "Generic (PLEG): container finished" podID="fda8546a-e13c-4450-9faa-a0e0fcacbfa1" containerID="a118ed0666cfac5865b8cf13e2353fb2f762383d409f06f2199efc6c24bae8bf" exitCode=0 Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.653221 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415585-hp78p" event={"ID":"fda8546a-e13c-4450-9faa-a0e0fcacbfa1","Type":"ContainerDied","Data":"a118ed0666cfac5865b8cf13e2353fb2f762383d409f06f2199efc6c24bae8bf"} Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.656428 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"b2477207-9a8d-489a-b3ed-e8d3e4ab3f6e","Type":"ContainerStarted","Data":"7c365cddf6ed84bd11cda5bab39f7011cec6446718d323f3b9a03cc6c9a4e332"} Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.656479 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"b2477207-9a8d-489a-b3ed-e8d3e4ab3f6e","Type":"ContainerStarted","Data":"3beab98ab337d8703e5de3c283e6e4925ef3fcd6d417e39e01a4159c589950bc"} Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.657917 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzlbl" event={"ID":"ea6370e4-7a14-43a3-8ab0-c966df3c3e74","Type":"ContainerStarted","Data":"65ead58be9a2f688499f325c400cbca88a9585d3230377cc8067dc5888472a5f"} Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.693013 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-5949w" podStartSLOduration=10.692991512999999 podStartE2EDuration="10.692991513s" podCreationTimestamp="2025-12-05 11:51:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:10.689993802 +0000 UTC m=+155.182708545" watchObservedRunningTime="2025-12-05 11:51:10.692991513 +0000 UTC m=+155.185706236" Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.698883 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rtp7g" Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.721619 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.7215951819999997 podStartE2EDuration="2.721595182s" podCreationTimestamp="2025-12-05 11:51:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:10.720731416 +0000 UTC m=+155.213446149" watchObservedRunningTime="2025-12-05 11:51:10.721595182 +0000 UTC m=+155.214309925" Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.749321 4763 patch_prober.go:28] interesting pod/router-default-5444994796-f52fv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 11:51:10 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld Dec 05 11:51:10 crc kubenswrapper[4763]: [+]process-running ok Dec 05 11:51:10 crc kubenswrapper[4763]: healthz check failed Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.749384 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f52fv" podUID="cea9ce07-369d-4ea5-a2ae-c77eeeaef7da" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.797352 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fc2x5"] Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.798621 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fc2x5" Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.803809 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fc2x5"] Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.839833 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d36cff6b-557a-4427-af48-e473b13bf117-utilities\") pod \"redhat-marketplace-fc2x5\" (UID: \"d36cff6b-557a-4427-af48-e473b13bf117\") " pod="openshift-marketplace/redhat-marketplace-fc2x5" Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.840094 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfbvn\" (UniqueName: \"kubernetes.io/projected/d36cff6b-557a-4427-af48-e473b13bf117-kube-api-access-rfbvn\") pod \"redhat-marketplace-fc2x5\" (UID: \"d36cff6b-557a-4427-af48-e473b13bf117\") " pod="openshift-marketplace/redhat-marketplace-fc2x5" Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.840161 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d36cff6b-557a-4427-af48-e473b13bf117-catalog-content\") pod \"redhat-marketplace-fc2x5\" (UID: \"d36cff6b-557a-4427-af48-e473b13bf117\") " pod="openshift-marketplace/redhat-marketplace-fc2x5" Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.940870 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d36cff6b-557a-4427-af48-e473b13bf117-utilities\") pod \"redhat-marketplace-fc2x5\" (UID: \"d36cff6b-557a-4427-af48-e473b13bf117\") " pod="openshift-marketplace/redhat-marketplace-fc2x5" Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.940916 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfbvn\" (UniqueName: \"kubernetes.io/projected/d36cff6b-557a-4427-af48-e473b13bf117-kube-api-access-rfbvn\") pod \"redhat-marketplace-fc2x5\" (UID: \"d36cff6b-557a-4427-af48-e473b13bf117\") " pod="openshift-marketplace/redhat-marketplace-fc2x5" Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.940977 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d36cff6b-557a-4427-af48-e473b13bf117-catalog-content\") pod \"redhat-marketplace-fc2x5\" (UID: \"d36cff6b-557a-4427-af48-e473b13bf117\") " pod="openshift-marketplace/redhat-marketplace-fc2x5" Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.941644 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d36cff6b-557a-4427-af48-e473b13bf117-catalog-content\") pod \"redhat-marketplace-fc2x5\" (UID: \"d36cff6b-557a-4427-af48-e473b13bf117\") " pod="openshift-marketplace/redhat-marketplace-fc2x5" Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.941691 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d36cff6b-557a-4427-af48-e473b13bf117-utilities\") pod \"redhat-marketplace-fc2x5\" (UID: \"d36cff6b-557a-4427-af48-e473b13bf117\") " pod="openshift-marketplace/redhat-marketplace-fc2x5" Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.946294 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-rtp7g"] Dec 05 11:51:10 crc kubenswrapper[4763]: I1205 11:51:10.966746 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfbvn\" (UniqueName: \"kubernetes.io/projected/d36cff6b-557a-4427-af48-e473b13bf117-kube-api-access-rfbvn\") pod \"redhat-marketplace-fc2x5\" (UID: \"d36cff6b-557a-4427-af48-e473b13bf117\") " pod="openshift-marketplace/redhat-marketplace-fc2x5" Dec 05 11:51:11 crc kubenswrapper[4763]: I1205 11:51:11.132174 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fc2x5" Dec 05 11:51:11 crc kubenswrapper[4763]: I1205 11:51:11.338776 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fc2x5"] Dec 05 11:51:11 crc kubenswrapper[4763]: W1205 11:51:11.416742 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd36cff6b_557a_4427_af48_e473b13bf117.slice/crio-707abf1d85aef326fd0bc88a5757c8bbecb7d53f97b6c8fd6619242f1d2ca5dc WatchSource:0}: Error finding container 707abf1d85aef326fd0bc88a5757c8bbecb7d53f97b6c8fd6619242f1d2ca5dc: Status 404 returned error can't find the container with id 707abf1d85aef326fd0bc88a5757c8bbecb7d53f97b6c8fd6619242f1d2ca5dc Dec 05 11:51:11 crc kubenswrapper[4763]: I1205 11:51:11.665685 4763 generic.go:334] "Generic (PLEG): container finished" podID="8212fb62-9829-4198-8833-0695b17d2a5d" containerID="4e7766a3ae8df5050ed2a9ed9b388be4b5a102c2ca45b31fb88789da72401771" exitCode=0 Dec 05 11:51:11 crc kubenswrapper[4763]: I1205 11:51:11.665806 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2hjj6" event={"ID":"8212fb62-9829-4198-8833-0695b17d2a5d","Type":"ContainerDied","Data":"4e7766a3ae8df5050ed2a9ed9b388be4b5a102c2ca45b31fb88789da72401771"} Dec 05 11:51:11 crc kubenswrapper[4763]: I1205 11:51:11.668430 4763 generic.go:334] "Generic (PLEG): container finished" podID="d36cff6b-557a-4427-af48-e473b13bf117" containerID="3e967b26069599d410992cf3fdcee157876146103f9c31b5ea0b4bf37e239190" exitCode=0 Dec 05 11:51:11 crc kubenswrapper[4763]: I1205 11:51:11.668497 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fc2x5" event={"ID":"d36cff6b-557a-4427-af48-e473b13bf117","Type":"ContainerDied","Data":"3e967b26069599d410992cf3fdcee157876146103f9c31b5ea0b4bf37e239190"} Dec 05 11:51:11 crc kubenswrapper[4763]: I1205 11:51:11.668528 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fc2x5" event={"ID":"d36cff6b-557a-4427-af48-e473b13bf117","Type":"ContainerStarted","Data":"707abf1d85aef326fd0bc88a5757c8bbecb7d53f97b6c8fd6619242f1d2ca5dc"} Dec 05 11:51:11 crc kubenswrapper[4763]: I1205 11:51:11.674561 4763 generic.go:334] "Generic (PLEG): container finished" podID="84b15a6f-ad11-4681-be03-86c7a7f84320" containerID="40f2d0e472b1f5bb3b1c0e3665ba694ec796d9e26c57e58801b5a1a4ba7e8a76" exitCode=0 Dec 05 11:51:11 crc kubenswrapper[4763]: I1205 11:51:11.674629 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rtp7g" event={"ID":"84b15a6f-ad11-4681-be03-86c7a7f84320","Type":"ContainerDied","Data":"40f2d0e472b1f5bb3b1c0e3665ba694ec796d9e26c57e58801b5a1a4ba7e8a76"} Dec 05 11:51:11 crc kubenswrapper[4763]: I1205 11:51:11.674682 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-rtp7g" event={"ID":"84b15a6f-ad11-4681-be03-86c7a7f84320","Type":"ContainerStarted","Data":"8248093c0c8ae667ce0a768e789dc96f9cf9a0dbdc72a0dea8e302f97804dc7e"} Dec 05 11:51:11 crc kubenswrapper[4763]: I1205 11:51:11.679242 4763 generic.go:334] "Generic (PLEG): container finished" podID="b2477207-9a8d-489a-b3ed-e8d3e4ab3f6e" containerID="7c365cddf6ed84bd11cda5bab39f7011cec6446718d323f3b9a03cc6c9a4e332" exitCode=0 Dec 05 11:51:11 crc kubenswrapper[4763]: I1205 11:51:11.679436 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"b2477207-9a8d-489a-b3ed-e8d3e4ab3f6e","Type":"ContainerDied","Data":"7c365cddf6ed84bd11cda5bab39f7011cec6446718d323f3b9a03cc6c9a4e332"} Dec 05 11:51:11 crc kubenswrapper[4763]: I1205 11:51:11.683334 4763 generic.go:334] "Generic (PLEG): container finished" podID="ea6370e4-7a14-43a3-8ab0-c966df3c3e74" containerID="4c051a0ff5b54684369a7b6e8255a17d5c549d0af9fd29ce861694839986103f" exitCode=0 Dec 05 11:51:11 crc kubenswrapper[4763]: I1205 11:51:11.683414 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzlbl" event={"ID":"ea6370e4-7a14-43a3-8ab0-c966df3c3e74","Type":"ContainerDied","Data":"4c051a0ff5b54684369a7b6e8255a17d5c549d0af9fd29ce861694839986103f"} Dec 05 11:51:11 crc kubenswrapper[4763]: I1205 11:51:11.691725 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" event={"ID":"aa31a254-af8d-4f9f-b22d-1844d7d60382","Type":"ContainerStarted","Data":"5dbcca1c9e333b585d4c50250c07c4e202e2a44c67176b6df2ac066e40ac8017"} Dec 05 11:51:11 crc kubenswrapper[4763]: I1205 11:51:11.744926 4763 patch_prober.go:28] interesting pod/router-default-5444994796-f52fv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 11:51:11 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld Dec 05 11:51:11 crc kubenswrapper[4763]: [+]process-running ok Dec 05 11:51:11 crc kubenswrapper[4763]: healthz check failed Dec 05 11:51:11 crc kubenswrapper[4763]: I1205 11:51:11.744977 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f52fv" podUID="cea9ce07-369d-4ea5-a2ae-c77eeeaef7da" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 11:51:11 crc kubenswrapper[4763]: I1205 11:51:11.778430 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nzfbn"] Dec 05 11:51:11 crc kubenswrapper[4763]: I1205 11:51:11.779503 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nzfbn" Dec 05 11:51:11 crc kubenswrapper[4763]: I1205 11:51:11.781126 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" podStartSLOduration=136.781117123 podStartE2EDuration="2m16.781117123s" podCreationTimestamp="2025-12-05 11:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:11.780301027 +0000 UTC m=+156.273015750" watchObservedRunningTime="2025-12-05 11:51:11.781117123 +0000 UTC m=+156.273831846" Dec 05 11:51:11 crc kubenswrapper[4763]: I1205 11:51:11.784161 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 05 11:51:11 crc kubenswrapper[4763]: I1205 11:51:11.822544 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 05 11:51:11 crc kubenswrapper[4763]: I1205 11:51:11.823046 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nzfbn"] Dec 05 11:51:11 crc kubenswrapper[4763]: I1205 11:51:11.854384 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61dde3bf-99ba-4d4f-bbbd-91ea145ac314-catalog-content\") pod \"redhat-operators-nzfbn\" (UID: \"61dde3bf-99ba-4d4f-bbbd-91ea145ac314\") " pod="openshift-marketplace/redhat-operators-nzfbn" Dec 05 11:51:11 crc kubenswrapper[4763]: I1205 11:51:11.854469 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ll8h\" (UniqueName: \"kubernetes.io/projected/61dde3bf-99ba-4d4f-bbbd-91ea145ac314-kube-api-access-7ll8h\") pod \"redhat-operators-nzfbn\" (UID: \"61dde3bf-99ba-4d4f-bbbd-91ea145ac314\") " pod="openshift-marketplace/redhat-operators-nzfbn" Dec 05 11:51:11 crc kubenswrapper[4763]: I1205 11:51:11.854509 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61dde3bf-99ba-4d4f-bbbd-91ea145ac314-utilities\") pod \"redhat-operators-nzfbn\" (UID: \"61dde3bf-99ba-4d4f-bbbd-91ea145ac314\") " pod="openshift-marketplace/redhat-operators-nzfbn" Dec 05 11:51:11 crc kubenswrapper[4763]: I1205 11:51:11.949932 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415585-hp78p" Dec 05 11:51:11 crc kubenswrapper[4763]: I1205 11:51:11.955332 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ll8h\" (UniqueName: \"kubernetes.io/projected/61dde3bf-99ba-4d4f-bbbd-91ea145ac314-kube-api-access-7ll8h\") pod \"redhat-operators-nzfbn\" (UID: \"61dde3bf-99ba-4d4f-bbbd-91ea145ac314\") " pod="openshift-marketplace/redhat-operators-nzfbn" Dec 05 11:51:11 crc kubenswrapper[4763]: I1205 11:51:11.955383 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61dde3bf-99ba-4d4f-bbbd-91ea145ac314-utilities\") pod \"redhat-operators-nzfbn\" (UID: \"61dde3bf-99ba-4d4f-bbbd-91ea145ac314\") " pod="openshift-marketplace/redhat-operators-nzfbn" Dec 05 11:51:11 crc kubenswrapper[4763]: I1205 11:51:11.955437 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61dde3bf-99ba-4d4f-bbbd-91ea145ac314-catalog-content\") pod \"redhat-operators-nzfbn\" (UID: \"61dde3bf-99ba-4d4f-bbbd-91ea145ac314\") " pod="openshift-marketplace/redhat-operators-nzfbn" Dec 05 11:51:11 crc kubenswrapper[4763]: I1205 11:51:11.955953 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61dde3bf-99ba-4d4f-bbbd-91ea145ac314-catalog-content\") pod \"redhat-operators-nzfbn\" (UID: \"61dde3bf-99ba-4d4f-bbbd-91ea145ac314\") " pod="openshift-marketplace/redhat-operators-nzfbn" Dec 05 11:51:11 crc kubenswrapper[4763]: I1205 11:51:11.956313 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61dde3bf-99ba-4d4f-bbbd-91ea145ac314-utilities\") pod \"redhat-operators-nzfbn\" (UID: \"61dde3bf-99ba-4d4f-bbbd-91ea145ac314\") " pod="openshift-marketplace/redhat-operators-nzfbn" Dec 05 11:51:11 crc kubenswrapper[4763]: I1205 11:51:11.981111 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ll8h\" (UniqueName: \"kubernetes.io/projected/61dde3bf-99ba-4d4f-bbbd-91ea145ac314-kube-api-access-7ll8h\") pod \"redhat-operators-nzfbn\" (UID: \"61dde3bf-99ba-4d4f-bbbd-91ea145ac314\") " pod="openshift-marketplace/redhat-operators-nzfbn" Dec 05 11:51:12 crc kubenswrapper[4763]: I1205 11:51:12.056221 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fda8546a-e13c-4450-9faa-a0e0fcacbfa1-config-volume\") pod \"fda8546a-e13c-4450-9faa-a0e0fcacbfa1\" (UID: \"fda8546a-e13c-4450-9faa-a0e0fcacbfa1\") " Dec 05 11:51:12 crc kubenswrapper[4763]: I1205 11:51:12.056637 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2sn4\" (UniqueName: \"kubernetes.io/projected/fda8546a-e13c-4450-9faa-a0e0fcacbfa1-kube-api-access-x2sn4\") pod \"fda8546a-e13c-4450-9faa-a0e0fcacbfa1\" (UID: \"fda8546a-e13c-4450-9faa-a0e0fcacbfa1\") " Dec 05 11:51:12 crc kubenswrapper[4763]: I1205 11:51:12.056720 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fda8546a-e13c-4450-9faa-a0e0fcacbfa1-secret-volume\") pod \"fda8546a-e13c-4450-9faa-a0e0fcacbfa1\" (UID: \"fda8546a-e13c-4450-9faa-a0e0fcacbfa1\") " Dec 05 11:51:12 crc kubenswrapper[4763]: I1205 11:51:12.056994 
4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda8546a-e13c-4450-9faa-a0e0fcacbfa1-config-volume" (OuterVolumeSpecName: "config-volume") pod "fda8546a-e13c-4450-9faa-a0e0fcacbfa1" (UID: "fda8546a-e13c-4450-9faa-a0e0fcacbfa1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:51:12 crc kubenswrapper[4763]: I1205 11:51:12.075044 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda8546a-e13c-4450-9faa-a0e0fcacbfa1-kube-api-access-x2sn4" (OuterVolumeSpecName: "kube-api-access-x2sn4") pod "fda8546a-e13c-4450-9faa-a0e0fcacbfa1" (UID: "fda8546a-e13c-4450-9faa-a0e0fcacbfa1"). InnerVolumeSpecName "kube-api-access-x2sn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:51:12 crc kubenswrapper[4763]: I1205 11:51:12.076203 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda8546a-e13c-4450-9faa-a0e0fcacbfa1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fda8546a-e13c-4450-9faa-a0e0fcacbfa1" (UID: "fda8546a-e13c-4450-9faa-a0e0fcacbfa1"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:51:12 crc kubenswrapper[4763]: I1205 11:51:12.120632 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nzfbn" Dec 05 11:51:12 crc kubenswrapper[4763]: I1205 11:51:12.158483 4763 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fda8546a-e13c-4450-9faa-a0e0fcacbfa1-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 11:51:12 crc kubenswrapper[4763]: I1205 11:51:12.158559 4763 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fda8546a-e13c-4450-9faa-a0e0fcacbfa1-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 11:51:12 crc kubenswrapper[4763]: I1205 11:51:12.158575 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2sn4\" (UniqueName: \"kubernetes.io/projected/fda8546a-e13c-4450-9faa-a0e0fcacbfa1-kube-api-access-x2sn4\") on node \"crc\" DevicePath \"\"" Dec 05 11:51:12 crc kubenswrapper[4763]: I1205 11:51:12.179520 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w42n2"] Dec 05 11:51:12 crc kubenswrapper[4763]: E1205 11:51:12.180079 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fda8546a-e13c-4450-9faa-a0e0fcacbfa1" containerName="collect-profiles" Dec 05 11:51:12 crc kubenswrapper[4763]: I1205 11:51:12.180176 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="fda8546a-e13c-4450-9faa-a0e0fcacbfa1" containerName="collect-profiles" Dec 05 11:51:12 crc kubenswrapper[4763]: I1205 11:51:12.180381 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="fda8546a-e13c-4450-9faa-a0e0fcacbfa1" containerName="collect-profiles" Dec 05 11:51:12 crc kubenswrapper[4763]: I1205 11:51:12.183458 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w42n2" Dec 05 11:51:12 crc kubenswrapper[4763]: I1205 11:51:12.193093 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w42n2"] Dec 05 11:51:12 crc kubenswrapper[4763]: I1205 11:51:12.259484 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db19121e-789d-4b3b-9e1d-cba90e11f918-utilities\") pod \"redhat-operators-w42n2\" (UID: \"db19121e-789d-4b3b-9e1d-cba90e11f918\") " pod="openshift-marketplace/redhat-operators-w42n2" Dec 05 11:51:12 crc kubenswrapper[4763]: I1205 11:51:12.259587 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db19121e-789d-4b3b-9e1d-cba90e11f918-catalog-content\") pod \"redhat-operators-w42n2\" (UID: \"db19121e-789d-4b3b-9e1d-cba90e11f918\") " pod="openshift-marketplace/redhat-operators-w42n2" Dec 05 11:51:12 crc kubenswrapper[4763]: I1205 11:51:12.259619 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwpq8\" (UniqueName: \"kubernetes.io/projected/db19121e-789d-4b3b-9e1d-cba90e11f918-kube-api-access-wwpq8\") pod \"redhat-operators-w42n2\" (UID: \"db19121e-789d-4b3b-9e1d-cba90e11f918\") " pod="openshift-marketplace/redhat-operators-w42n2" Dec 05 11:51:12 crc kubenswrapper[4763]: I1205 11:51:12.361160 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db19121e-789d-4b3b-9e1d-cba90e11f918-utilities\") pod \"redhat-operators-w42n2\" (UID: \"db19121e-789d-4b3b-9e1d-cba90e11f918\") " pod="openshift-marketplace/redhat-operators-w42n2" Dec 05 11:51:12 crc kubenswrapper[4763]: I1205 11:51:12.361257 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db19121e-789d-4b3b-9e1d-cba90e11f918-catalog-content\") pod \"redhat-operators-w42n2\" (UID: \"db19121e-789d-4b3b-9e1d-cba90e11f918\") " pod="openshift-marketplace/redhat-operators-w42n2" Dec 05 11:51:12 crc kubenswrapper[4763]: I1205 11:51:12.361290 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwpq8\" (UniqueName: \"kubernetes.io/projected/db19121e-789d-4b3b-9e1d-cba90e11f918-kube-api-access-wwpq8\") pod \"redhat-operators-w42n2\" (UID: \"db19121e-789d-4b3b-9e1d-cba90e11f918\") " pod="openshift-marketplace/redhat-operators-w42n2" Dec 05 11:51:12 crc kubenswrapper[4763]: I1205 11:51:12.362377 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db19121e-789d-4b3b-9e1d-cba90e11f918-utilities\") pod \"redhat-operators-w42n2\" (UID: \"db19121e-789d-4b3b-9e1d-cba90e11f918\") " pod="openshift-marketplace/redhat-operators-w42n2" Dec 05 11:51:12 crc kubenswrapper[4763]: I1205 11:51:12.363921 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db19121e-789d-4b3b-9e1d-cba90e11f918-catalog-content\") pod \"redhat-operators-w42n2\" (UID: \"db19121e-789d-4b3b-9e1d-cba90e11f918\") " pod="openshift-marketplace/redhat-operators-w42n2" Dec 05 11:51:12 crc kubenswrapper[4763]: I1205 11:51:12.379699 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wwpq8\" (UniqueName: \"kubernetes.io/projected/db19121e-789d-4b3b-9e1d-cba90e11f918-kube-api-access-wwpq8\") pod \"redhat-operators-w42n2\" (UID: \"db19121e-789d-4b3b-9e1d-cba90e11f918\") " pod="openshift-marketplace/redhat-operators-w42n2" Dec 05 11:51:12 crc kubenswrapper[4763]: I1205 11:51:12.506740 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w42n2" Dec 05 11:51:12 crc kubenswrapper[4763]: I1205 11:51:12.567020 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nzfbn"] Dec 05 11:51:12 crc kubenswrapper[4763]: W1205 11:51:12.583187 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61dde3bf_99ba_4d4f_bbbd_91ea145ac314.slice/crio-09c901db8d4892fbd4289922c0d1afca932f44651ee1f84b9ba9ded3883af877 WatchSource:0}: Error finding container 09c901db8d4892fbd4289922c0d1afca932f44651ee1f84b9ba9ded3883af877: Status 404 returned error can't find the container with id 09c901db8d4892fbd4289922c0d1afca932f44651ee1f84b9ba9ded3883af877 Dec 05 11:51:12 crc kubenswrapper[4763]: I1205 11:51:12.702462 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415585-hp78p" event={"ID":"fda8546a-e13c-4450-9faa-a0e0fcacbfa1","Type":"ContainerDied","Data":"f5c31bb1f6a3971be371d88c4e3041e807ef63379876d0c0d5119dba720a9986"} Dec 05 11:51:12 crc kubenswrapper[4763]: I1205 11:51:12.702754 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5c31bb1f6a3971be371d88c4e3041e807ef63379876d0c0d5119dba720a9986" Dec 05 11:51:12 crc kubenswrapper[4763]: I1205 11:51:12.702704 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415585-hp78p" Dec 05 11:51:12 crc kubenswrapper[4763]: I1205 11:51:12.705833 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzfbn" event={"ID":"61dde3bf-99ba-4d4f-bbbd-91ea145ac314","Type":"ContainerStarted","Data":"09c901db8d4892fbd4289922c0d1afca932f44651ee1f84b9ba9ded3883af877"} Dec 05 11:51:12 crc kubenswrapper[4763]: I1205 11:51:12.705875 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:12 crc kubenswrapper[4763]: I1205 11:51:12.740939 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-78sc9" Dec 05 11:51:12 crc kubenswrapper[4763]: I1205 11:51:12.742249 4763 patch_prober.go:28] interesting pod/router-default-5444994796-f52fv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 11:51:12 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld Dec 05 11:51:12 crc kubenswrapper[4763]: [+]process-running ok Dec 05 11:51:12 crc kubenswrapper[4763]: healthz check failed Dec 05 11:51:12 crc kubenswrapper[4763]: I1205 11:51:12.742295 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f52fv" podUID="cea9ce07-369d-4ea5-a2ae-c77eeeaef7da" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 11:51:12 crc kubenswrapper[4763]: I1205 11:51:12.747480 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-78sc9" Dec 05 11:51:12 crc kubenswrapper[4763]: I1205 11:51:12.971967 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w42n2"] Dec 05 11:51:12 crc kubenswrapper[4763]: W1205 11:51:12.998810 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb19121e_789d_4b3b_9e1d_cba90e11f918.slice/crio-e1c364f24f37ccdc5a431fd72892b8ee4a7c83e492f59a47117adeda6976ab81 WatchSource:0}: Error finding container e1c364f24f37ccdc5a431fd72892b8ee4a7c83e492f59a47117adeda6976ab81: Status 404 returned error can't find the container with id e1c364f24f37ccdc5a431fd72892b8ee4a7c83e492f59a47117adeda6976ab81 Dec 05 11:51:13 crc kubenswrapper[4763]: I1205 11:51:13.003792 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-rsm7h" Dec 05 11:51:13 crc kubenswrapper[4763]: I1205 11:51:13.003827 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-rsm7h" Dec 05 11:51:13 crc kubenswrapper[4763]: I1205 11:51:13.017104 4763 patch_prober.go:28] interesting pod/console-f9d7485db-rsm7h container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Dec 05 11:51:13 crc kubenswrapper[4763]: I1205 11:51:13.020099 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-rsm7h" podUID="e57f38fd-b06b-447e-ad03-2a6fb918470b" containerName="console" probeResult="failure" output="Get \"https://10.217.0.20:8443/health\": dial tcp 
10.217.0.20:8443: connect: connection refused" Dec 05 11:51:13 crc kubenswrapper[4763]: I1205 11:51:13.134053 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 11:51:13 crc kubenswrapper[4763]: I1205 11:51:13.173103 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b2477207-9a8d-489a-b3ed-e8d3e4ab3f6e-kubelet-dir\") pod \"b2477207-9a8d-489a-b3ed-e8d3e4ab3f6e\" (UID: \"b2477207-9a8d-489a-b3ed-e8d3e4ab3f6e\") " Dec 05 11:51:13 crc kubenswrapper[4763]: I1205 11:51:13.173255 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b2477207-9a8d-489a-b3ed-e8d3e4ab3f6e-kube-api-access\") pod \"b2477207-9a8d-489a-b3ed-e8d3e4ab3f6e\" (UID: \"b2477207-9a8d-489a-b3ed-e8d3e4ab3f6e\") " Dec 05 11:51:13 crc kubenswrapper[4763]: I1205 11:51:13.173881 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b2477207-9a8d-489a-b3ed-e8d3e4ab3f6e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b2477207-9a8d-489a-b3ed-e8d3e4ab3f6e" (UID: "b2477207-9a8d-489a-b3ed-e8d3e4ab3f6e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 11:51:13 crc kubenswrapper[4763]: I1205 11:51:13.175641 4763 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b2477207-9a8d-489a-b3ed-e8d3e4ab3f6e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 11:51:13 crc kubenswrapper[4763]: I1205 11:51:13.179497 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2477207-9a8d-489a-b3ed-e8d3e4ab3f6e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b2477207-9a8d-489a-b3ed-e8d3e4ab3f6e" (UID: "b2477207-9a8d-489a-b3ed-e8d3e4ab3f6e"). InnerVolumeSpecName "kube-api-access". 
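
[Annotation] The reconciler_common.go and operation_generator.go entries above trace the kubelet volume manager's reconcile loop: volumes for newly admitted pods move through VerifyControllerAttachedVolume → MountVolume → "MountVolume.SetUp succeeded", while volumes of finished pods (the collect-profiles and revision-pruner pods here) go through UnmountVolume → "UnmountVolume.TearDown succeeded" → "Volume detached". A minimal sketch of that desired-state/actual-state convergence, assuming a deliberately simplified model rather than kubelet's real data structures:

    // Simplified model (assumption: illustrative only; kubelet's actual
    // reconciler lives in pkg/kubelet/volumemanager and is far richer).
    package main

    import "fmt"

    type volume struct{ podUID, name string }

    // reconcile mounts volumes that are desired but not yet mounted, and
    // tears down volumes that are mounted but no longer desired.
    func reconcile(desired, actual map[volume]bool) {
        for v := range desired {
            if !actual[v] {
                fmt.Printf("MountVolume started for volume %q pod %q\n", v.name, v.podUID)
                actual[v] = true // "MountVolume.SetUp succeeded"
            }
        }
        for v := range actual {
            if !desired[v] {
                fmt.Printf("UnmountVolume started for volume %q pod %q\n", v.name, v.podUID)
                delete(actual, v) // "TearDown succeeded", then "Volume detached"
            }
        }
    }

    func main() {
        desired := map[volume]bool{{"61dde3bf", "utilities"}: true}
        actual := map[volume]bool{{"fda8546a", "config-volume"}: true}
        reconcile(desired, actual)
    }
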
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:51:13 crc kubenswrapper[4763]: I1205 11:51:13.277127 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b2477207-9a8d-489a-b3ed-e8d3e4ab3f6e-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 11:51:13 crc kubenswrapper[4763]: I1205 11:51:13.656246 4763 patch_prober.go:28] interesting pod/downloads-7954f5f757-hv45j container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 05 11:51:13 crc kubenswrapper[4763]: I1205 11:51:13.656647 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hv45j" podUID="a5c4e5af-4c88-4770-b4a7-3eeded875431" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 05 11:51:13 crc kubenswrapper[4763]: I1205 11:51:13.656588 4763 patch_prober.go:28] interesting pod/downloads-7954f5f757-hv45j container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 05 11:51:13 crc kubenswrapper[4763]: I1205 11:51:13.657096 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-hv45j" podUID="a5c4e5af-4c88-4770-b4a7-3eeded875431" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 05 11:51:13 crc kubenswrapper[4763]: I1205 11:51:13.728819 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"b2477207-9a8d-489a-b3ed-e8d3e4ab3f6e","Type":"ContainerDied","Data":"3beab98ab337d8703e5de3c283e6e4925ef3fcd6d417e39e01a4159c589950bc"} Dec 05 11:51:13 crc kubenswrapper[4763]: I1205 11:51:13.728863 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3beab98ab337d8703e5de3c283e6e4925ef3fcd6d417e39e01a4159c589950bc" Dec 05 11:51:13 crc kubenswrapper[4763]: I1205 11:51:13.728922 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 11:51:13 crc kubenswrapper[4763]: I1205 11:51:13.739092 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-f52fv" Dec 05 11:51:13 crc kubenswrapper[4763]: I1205 11:51:13.742044 4763 generic.go:334] "Generic (PLEG): container finished" podID="61dde3bf-99ba-4d4f-bbbd-91ea145ac314" containerID="55976a8ec5d7a78f7475bd940af79022e60b186be8d5d57e0c635532973b2821" exitCode=0 Dec 05 11:51:13 crc kubenswrapper[4763]: I1205 11:51:13.742257 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzfbn" event={"ID":"61dde3bf-99ba-4d4f-bbbd-91ea145ac314","Type":"ContainerDied","Data":"55976a8ec5d7a78f7475bd940af79022e60b186be8d5d57e0c635532973b2821"} Dec 05 11:51:13 crc kubenswrapper[4763]: I1205 11:51:13.747184 4763 generic.go:334] "Generic (PLEG): container finished" podID="db19121e-789d-4b3b-9e1d-cba90e11f918" containerID="b191d235a113341aceb79780fe1cd69ab04853bac52a11afdab6f4bf55123a38" exitCode=0 Dec 05 11:51:13 crc kubenswrapper[4763]: I1205 11:51:13.747281 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w42n2" event={"ID":"db19121e-789d-4b3b-9e1d-cba90e11f918","Type":"ContainerDied","Data":"b191d235a113341aceb79780fe1cd69ab04853bac52a11afdab6f4bf55123a38"} Dec 05 11:51:13 crc kubenswrapper[4763]: I1205 11:51:13.747335 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w42n2" event={"ID":"db19121e-789d-4b3b-9e1d-cba90e11f918","Type":"ContainerStarted","Data":"e1c364f24f37ccdc5a431fd72892b8ee4a7c83e492f59a47117adeda6976ab81"} Dec 05 11:51:13 crc kubenswrapper[4763]: I1205 11:51:13.749850 4763 patch_prober.go:28] interesting pod/router-default-5444994796-f52fv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 11:51:13 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld Dec 05 11:51:13 crc kubenswrapper[4763]: [+]process-running ok Dec 05 11:51:13 crc kubenswrapper[4763]: healthz check failed Dec 05 11:51:13 crc kubenswrapper[4763]: I1205 11:51:13.749910 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f52fv" podUID="cea9ce07-369d-4ea5-a2ae-c77eeeaef7da" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 11:51:14 crc kubenswrapper[4763]: I1205 11:51:14.646669 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 11:51:14 crc kubenswrapper[4763]: I1205 11:51:14.741655 4763 patch_prober.go:28] interesting pod/router-default-5444994796-f52fv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 11:51:14 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld Dec 05 11:51:14 crc kubenswrapper[4763]: [+]process-running ok Dec 05 11:51:14 crc kubenswrapper[4763]: healthz check failed Dec 05 11:51:14 crc kubenswrapper[4763]: I1205 11:51:14.741712 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f52fv" podUID="cea9ce07-369d-4ea5-a2ae-c77eeeaef7da" containerName="router" probeResult="failure" 
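
[Annotation] The patch_prober.go/prober.go entries here are kubelet probe evaluations: the router's startup probe gets HTTP 500 from its health endpoint while [-]backend-http and [-]has-synced are still failing, and the console and downloads pods refuse connections until their servers bind. Kubelet re-runs each probe every period and only flips container state once it passes (the router turns "started"/"ready" at 11:51:17 below). A sketch of a comparable startup probe, assuming representative path, port, and threshold values rather than anything read from the router's actual manifest:

    // Illustrative probe spec (assumptions: path /healthz, port 1936, and
    // the period/threshold values; none are taken from the live manifest).
    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
        "k8s.io/apimachinery/pkg/util/intstr"
    )

    func startupProbe() *corev1.Probe {
        return &corev1.Probe{
            ProbeHandler: corev1.ProbeHandler{
                HTTPGet: &corev1.HTTPGetAction{
                    Path: "/healthz",
                    Port: intstr.FromInt32(1936),
                },
            },
            PeriodSeconds:    1,   // re-probe every second...
            FailureThreshold: 120, // ...for up to two minutes before giving up
        }
    }

    func main() { fmt.Printf("%+v\n", startupProbe()) }

Any non-2xx/3xx status (the 500 above) or a refused connection counts as one failure toward FailureThreshold.
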
output="HTTP probe failed with statuscode: 500" Dec 05 11:51:15 crc kubenswrapper[4763]: I1205 11:51:15.234680 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 05 11:51:15 crc kubenswrapper[4763]: E1205 11:51:15.234900 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2477207-9a8d-489a-b3ed-e8d3e4ab3f6e" containerName="pruner" Dec 05 11:51:15 crc kubenswrapper[4763]: I1205 11:51:15.234912 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2477207-9a8d-489a-b3ed-e8d3e4ab3f6e" containerName="pruner" Dec 05 11:51:15 crc kubenswrapper[4763]: I1205 11:51:15.235195 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2477207-9a8d-489a-b3ed-e8d3e4ab3f6e" containerName="pruner" Dec 05 11:51:15 crc kubenswrapper[4763]: I1205 11:51:15.238040 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 11:51:15 crc kubenswrapper[4763]: I1205 11:51:15.243116 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 05 11:51:15 crc kubenswrapper[4763]: I1205 11:51:15.243664 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 05 11:51:15 crc kubenswrapper[4763]: I1205 11:51:15.248301 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 05 11:51:15 crc kubenswrapper[4763]: I1205 11:51:15.319406 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/238ac7b7-a461-4d43-96dd-fc4762e62c6e-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"238ac7b7-a461-4d43-96dd-fc4762e62c6e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 11:51:15 crc kubenswrapper[4763]: I1205 11:51:15.319471 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/238ac7b7-a461-4d43-96dd-fc4762e62c6e-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"238ac7b7-a461-4d43-96dd-fc4762e62c6e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 11:51:15 crc kubenswrapper[4763]: I1205 11:51:15.425709 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/238ac7b7-a461-4d43-96dd-fc4762e62c6e-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"238ac7b7-a461-4d43-96dd-fc4762e62c6e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 11:51:15 crc kubenswrapper[4763]: I1205 11:51:15.425805 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/238ac7b7-a461-4d43-96dd-fc4762e62c6e-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"238ac7b7-a461-4d43-96dd-fc4762e62c6e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 11:51:15 crc kubenswrapper[4763]: I1205 11:51:15.426115 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/238ac7b7-a461-4d43-96dd-fc4762e62c6e-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"238ac7b7-a461-4d43-96dd-fc4762e62c6e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 11:51:15 crc kubenswrapper[4763]: I1205 11:51:15.460546 4763 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/238ac7b7-a461-4d43-96dd-fc4762e62c6e-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"238ac7b7-a461-4d43-96dd-fc4762e62c6e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 11:51:15 crc kubenswrapper[4763]: I1205 11:51:15.564108 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 11:51:15 crc kubenswrapper[4763]: I1205 11:51:15.743216 4763 patch_prober.go:28] interesting pod/router-default-5444994796-f52fv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 11:51:15 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld Dec 05 11:51:15 crc kubenswrapper[4763]: [+]process-running ok Dec 05 11:51:15 crc kubenswrapper[4763]: healthz check failed Dec 05 11:51:15 crc kubenswrapper[4763]: I1205 11:51:15.743271 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f52fv" podUID="cea9ce07-369d-4ea5-a2ae-c77eeeaef7da" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 11:51:15 crc kubenswrapper[4763]: I1205 11:51:15.846226 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-knjp7" Dec 05 11:51:16 crc kubenswrapper[4763]: I1205 11:51:16.071429 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 05 11:51:16 crc kubenswrapper[4763]: W1205 11:51:16.112771 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod238ac7b7_a461_4d43_96dd_fc4762e62c6e.slice/crio-ffb4f9bbff4fc9fb29f6dd8714bccc7b7f14a9d66455e2fca01bbef2a97de491 WatchSource:0}: Error finding container ffb4f9bbff4fc9fb29f6dd8714bccc7b7f14a9d66455e2fca01bbef2a97de491: Status 404 returned error can't find the container with id ffb4f9bbff4fc9fb29f6dd8714bccc7b7f14a9d66455e2fca01bbef2a97de491 Dec 05 11:51:16 crc kubenswrapper[4763]: I1205 11:51:16.796020 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"238ac7b7-a461-4d43-96dd-fc4762e62c6e","Type":"ContainerStarted","Data":"ffb4f9bbff4fc9fb29f6dd8714bccc7b7f14a9d66455e2fca01bbef2a97de491"} Dec 05 11:51:17 crc kubenswrapper[4763]: I1205 11:51:17.101322 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-f52fv" Dec 05 11:51:17 crc kubenswrapper[4763]: I1205 11:51:17.104430 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-f52fv" Dec 05 11:51:17 crc kubenswrapper[4763]: I1205 11:51:17.815537 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"238ac7b7-a461-4d43-96dd-fc4762e62c6e","Type":"ContainerStarted","Data":"636384d120a118e2fc1f352b507624f5338c193072afbf79c180be87716f6dcc"} Dec 05 11:51:17 crc kubenswrapper[4763]: I1205 11:51:17.825917 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.825896766 podStartE2EDuration="2.825896766s" podCreationTimestamp="2025-12-05 11:51:15 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:51:17.821073163 +0000 UTC m=+162.313787886" watchObservedRunningTime="2025-12-05 11:51:17.825896766 +0000 UTC m=+162.318611489" Dec 05 11:51:18 crc kubenswrapper[4763]: I1205 11:51:18.184533 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a135c32b-38e4-43f6-bbb1-d1b8e42156ab-metrics-certs\") pod \"network-metrics-daemon-x45qv\" (UID: \"a135c32b-38e4-43f6-bbb1-d1b8e42156ab\") " pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:51:18 crc kubenswrapper[4763]: I1205 11:51:18.193998 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a135c32b-38e4-43f6-bbb1-d1b8e42156ab-metrics-certs\") pod \"network-metrics-daemon-x45qv\" (UID: \"a135c32b-38e4-43f6-bbb1-d1b8e42156ab\") " pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:51:18 crc kubenswrapper[4763]: I1205 11:51:18.303040 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x45qv" Dec 05 11:51:18 crc kubenswrapper[4763]: I1205 11:51:18.821480 4763 generic.go:334] "Generic (PLEG): container finished" podID="238ac7b7-a461-4d43-96dd-fc4762e62c6e" containerID="636384d120a118e2fc1f352b507624f5338c193072afbf79c180be87716f6dcc" exitCode=0 Dec 05 11:51:18 crc kubenswrapper[4763]: I1205 11:51:18.821532 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"238ac7b7-a461-4d43-96dd-fc4762e62c6e","Type":"ContainerDied","Data":"636384d120a118e2fc1f352b507624f5338c193072afbf79c180be87716f6dcc"} Dec 05 11:51:23 crc kubenswrapper[4763]: I1205 11:51:23.086030 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-rsm7h" Dec 05 11:51:23 crc kubenswrapper[4763]: I1205 11:51:23.089976 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-rsm7h" Dec 05 11:51:23 crc kubenswrapper[4763]: I1205 11:51:23.661175 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-hv45j" Dec 05 11:51:25 crc kubenswrapper[4763]: I1205 11:51:25.787788 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 11:51:25 crc kubenswrapper[4763]: I1205 11:51:25.863185 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"238ac7b7-a461-4d43-96dd-fc4762e62c6e","Type":"ContainerDied","Data":"ffb4f9bbff4fc9fb29f6dd8714bccc7b7f14a9d66455e2fca01bbef2a97de491"} Dec 05 11:51:25 crc kubenswrapper[4763]: I1205 11:51:25.863218 4763 util.go:48] "No ready sandbox for pod can be found. 
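
[Annotation] The pod_startup_latency_tracker entry above can be read directly: podStartE2EDuration is watchObservedRunningTime (11:51:17.825896766) minus podCreationTimestamp (11:51:15), i.e. 2.825896766s; the zero-value firstStartedPulling/lastFinishedPulling ("0001-01-01 00:00:00 +0000 UTC") mean no image pull was recorded for this pod, and m=+162.3… is the monotonic offset since kubelet start. A quick check of the arithmetic:

    // Recomputes podStartE2EDuration from the two timestamps in the entry.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
        created, err := time.Parse(layout, "2025-12-05 11:51:15 +0000 UTC")
        if err != nil {
            panic(err)
        }
        running, err := time.Parse(layout, "2025-12-05 11:51:17.825896766 +0000 UTC")
        if err != nil {
            panic(err)
        }
        fmt.Println(running.Sub(created)) // 2.825896766s
    }
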
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 11:51:25 crc kubenswrapper[4763]: I1205 11:51:25.863234 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffb4f9bbff4fc9fb29f6dd8714bccc7b7f14a9d66455e2fca01bbef2a97de491" Dec 05 11:51:25 crc kubenswrapper[4763]: I1205 11:51:25.892216 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/238ac7b7-a461-4d43-96dd-fc4762e62c6e-kubelet-dir\") pod \"238ac7b7-a461-4d43-96dd-fc4762e62c6e\" (UID: \"238ac7b7-a461-4d43-96dd-fc4762e62c6e\") " Dec 05 11:51:25 crc kubenswrapper[4763]: I1205 11:51:25.892289 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/238ac7b7-a461-4d43-96dd-fc4762e62c6e-kube-api-access\") pod \"238ac7b7-a461-4d43-96dd-fc4762e62c6e\" (UID: \"238ac7b7-a461-4d43-96dd-fc4762e62c6e\") " Dec 05 11:51:25 crc kubenswrapper[4763]: I1205 11:51:25.892297 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/238ac7b7-a461-4d43-96dd-fc4762e62c6e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "238ac7b7-a461-4d43-96dd-fc4762e62c6e" (UID: "238ac7b7-a461-4d43-96dd-fc4762e62c6e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 11:51:25 crc kubenswrapper[4763]: I1205 11:51:25.892584 4763 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/238ac7b7-a461-4d43-96dd-fc4762e62c6e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 11:51:25 crc kubenswrapper[4763]: I1205 11:51:25.946190 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/238ac7b7-a461-4d43-96dd-fc4762e62c6e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "238ac7b7-a461-4d43-96dd-fc4762e62c6e" (UID: "238ac7b7-a461-4d43-96dd-fc4762e62c6e"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:51:25 crc kubenswrapper[4763]: I1205 11:51:25.993267 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/238ac7b7-a461-4d43-96dd-fc4762e62c6e-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 11:51:30 crc kubenswrapper[4763]: I1205 11:51:30.226823 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" Dec 05 11:51:37 crc kubenswrapper[4763]: I1205 11:51:37.544020 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 11:51:37 crc kubenswrapper[4763]: I1205 11:51:37.544320 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 11:51:43 crc kubenswrapper[4763]: I1205 11:51:43.758828 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vzzqs" Dec 05 11:51:46 crc kubenswrapper[4763]: I1205 11:51:46.003614 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 11:51:51 crc kubenswrapper[4763]: E1205 11:51:51.049400 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 05 11:51:51 crc kubenswrapper[4763]: E1205 11:51:51.050241 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7ll8h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-nzfbn_openshift-marketplace(61dde3bf-99ba-4d4f-bbbd-91ea145ac314): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 11:51:51 crc kubenswrapper[4763]: E1205 11:51:51.051993 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-nzfbn" podUID="61dde3bf-99ba-4d4f-bbbd-91ea145ac314" Dec 05 11:51:51 crc kubenswrapper[4763]: I1205 11:51:51.233960 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 05 11:51:51 crc kubenswrapper[4763]: E1205 11:51:51.234178 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="238ac7b7-a461-4d43-96dd-fc4762e62c6e" containerName="pruner" Dec 05 11:51:51 crc kubenswrapper[4763]: I1205 11:51:51.234190 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="238ac7b7-a461-4d43-96dd-fc4762e62c6e" containerName="pruner" Dec 05 11:51:51 crc kubenswrapper[4763]: I1205 11:51:51.234295 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="238ac7b7-a461-4d43-96dd-fc4762e62c6e" containerName="pruner" Dec 05 11:51:51 crc kubenswrapper[4763]: I1205 11:51:51.234645 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 11:51:51 crc kubenswrapper[4763]: I1205 11:51:51.240845 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 05 11:51:51 crc kubenswrapper[4763]: I1205 11:51:51.241070 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 05 11:51:51 crc kubenswrapper[4763]: I1205 11:51:51.246362 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 05 11:51:51 crc kubenswrapper[4763]: I1205 11:51:51.344823 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/193ee10a-6394-4fb4-bffa-8155d1532b29-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"193ee10a-6394-4fb4-bffa-8155d1532b29\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 11:51:51 crc kubenswrapper[4763]: I1205 11:51:51.344976 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/193ee10a-6394-4fb4-bffa-8155d1532b29-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"193ee10a-6394-4fb4-bffa-8155d1532b29\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 11:51:51 crc kubenswrapper[4763]: I1205 11:51:51.445976 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/193ee10a-6394-4fb4-bffa-8155d1532b29-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"193ee10a-6394-4fb4-bffa-8155d1532b29\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 11:51:51 crc kubenswrapper[4763]: I1205 11:51:51.446245 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/193ee10a-6394-4fb4-bffa-8155d1532b29-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"193ee10a-6394-4fb4-bffa-8155d1532b29\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 11:51:51 crc kubenswrapper[4763]: I1205 11:51:51.446345 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/193ee10a-6394-4fb4-bffa-8155d1532b29-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"193ee10a-6394-4fb4-bffa-8155d1532b29\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 11:51:51 crc kubenswrapper[4763]: I1205 11:51:51.482709 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/193ee10a-6394-4fb4-bffa-8155d1532b29-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"193ee10a-6394-4fb4-bffa-8155d1532b29\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 11:51:51 crc kubenswrapper[4763]: I1205 11:51:51.570199 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 11:51:53 crc kubenswrapper[4763]: E1205 11:51:53.580270 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-nzfbn" podUID="61dde3bf-99ba-4d4f-bbbd-91ea145ac314" Dec 05 11:51:55 crc kubenswrapper[4763]: E1205 11:51:55.053694 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 05 11:51:55 crc kubenswrapper[4763]: E1205 11:51:55.053859 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hbc58,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-2hjj6_openshift-marketplace(8212fb62-9829-4198-8833-0695b17d2a5d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 11:51:55 crc kubenswrapper[4763]: E1205 11:51:55.055009 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-2hjj6" podUID="8212fb62-9829-4198-8833-0695b17d2a5d" Dec 05 11:51:56 crc kubenswrapper[4763]: I1205 11:51:56.039820 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 05 11:51:56 crc kubenswrapper[4763]: I1205 11:51:56.041280 4763 util.go:30] "No sandbox for pod can be found. 
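
[Annotation] The pull-failure cluster here follows kubelet's standard image-retry cycle: the CRI pull of registry.redhat.io/redhat/redhat-operator-index:v4.18 for the extract-content init container is aborted mid-copy ("context canceled"), surfacing as ErrImagePull; on the next sync the container sits in ImagePullBackOff ("Back-off pulling image …") while kubelet waits out an exponential backoff before retrying, and the same pattern repeats below for the community/certified/marketplace index images. A sketch of that retry shape, assuming the commonly cited defaults (10s initial delay, doubling, 5-minute cap) rather than anything read from this node's configuration:

    // Illustrative backoff loop (assumption: simplified stand-in for
    // kubelet's image-pull backoff; delays here are the usual defaults).
    package main

    import (
        "errors"
        "fmt"
        "time"
    )

    func pullWithBackoff(image string, pull func(string) error) {
        delay := 10 * time.Second
        for {
            err := pull(image)
            if err == nil {
                return
            }
            fmt.Printf("ErrImagePull: %v; ImagePullBackOff for %s\n", err, delay)
            time.Sleep(delay)
            if delay *= 2; delay > 5*time.Minute {
                delay = 5 * time.Minute
            }
        }
    }

    func main() {
        attempts := 0
        pullWithBackoff("registry.redhat.io/redhat/redhat-operator-index:v4.18",
            func(string) error {
                attempts++
                if attempts < 3 {
                    return errors.New("copying system image from manifest list: context canceled")
                }
                return nil
            })
    }
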
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 11:51:56 crc kubenswrapper[4763]: I1205 11:51:56.046879 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 05 11:51:56 crc kubenswrapper[4763]: I1205 11:51:56.101726 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d19b7aa7-7fea-483f-93ba-2278dd8c9ee8-var-lock\") pod \"installer-9-crc\" (UID: \"d19b7aa7-7fea-483f-93ba-2278dd8c9ee8\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 11:51:56 crc kubenswrapper[4763]: I1205 11:51:56.101847 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d19b7aa7-7fea-483f-93ba-2278dd8c9ee8-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d19b7aa7-7fea-483f-93ba-2278dd8c9ee8\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 11:51:56 crc kubenswrapper[4763]: I1205 11:51:56.101873 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d19b7aa7-7fea-483f-93ba-2278dd8c9ee8-kube-api-access\") pod \"installer-9-crc\" (UID: \"d19b7aa7-7fea-483f-93ba-2278dd8c9ee8\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 11:51:56 crc kubenswrapper[4763]: I1205 11:51:56.203248 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d19b7aa7-7fea-483f-93ba-2278dd8c9ee8-var-lock\") pod \"installer-9-crc\" (UID: \"d19b7aa7-7fea-483f-93ba-2278dd8c9ee8\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 11:51:56 crc kubenswrapper[4763]: I1205 11:51:56.203302 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d19b7aa7-7fea-483f-93ba-2278dd8c9ee8-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d19b7aa7-7fea-483f-93ba-2278dd8c9ee8\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 11:51:56 crc kubenswrapper[4763]: I1205 11:51:56.203324 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d19b7aa7-7fea-483f-93ba-2278dd8c9ee8-kube-api-access\") pod \"installer-9-crc\" (UID: \"d19b7aa7-7fea-483f-93ba-2278dd8c9ee8\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 11:51:56 crc kubenswrapper[4763]: I1205 11:51:56.203441 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d19b7aa7-7fea-483f-93ba-2278dd8c9ee8-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d19b7aa7-7fea-483f-93ba-2278dd8c9ee8\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 11:51:56 crc kubenswrapper[4763]: I1205 11:51:56.203465 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d19b7aa7-7fea-483f-93ba-2278dd8c9ee8-var-lock\") pod \"installer-9-crc\" (UID: \"d19b7aa7-7fea-483f-93ba-2278dd8c9ee8\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 11:51:58 crc kubenswrapper[4763]: I1205 11:51:58.308877 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d19b7aa7-7fea-483f-93ba-2278dd8c9ee8-kube-api-access\") pod \"installer-9-crc\" (UID: 
\"d19b7aa7-7fea-483f-93ba-2278dd8c9ee8\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 11:51:58 crc kubenswrapper[4763]: I1205 11:51:58.465058 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 11:51:58 crc kubenswrapper[4763]: E1205 11:51:58.589843 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-2hjj6" podUID="8212fb62-9829-4198-8833-0695b17d2a5d" Dec 05 11:51:59 crc kubenswrapper[4763]: E1205 11:51:59.278203 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 05 11:51:59 crc kubenswrapper[4763]: E1205 11:51:59.278607 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5m52p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-tqrj4_openshift-marketplace(bb34ace9-e703-4f05-aa04-eebc1d97e096): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 11:51:59 crc kubenswrapper[4763]: E1205 11:51:59.279830 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-tqrj4" podUID="bb34ace9-e703-4f05-aa04-eebc1d97e096" Dec 05 11:52:00 crc kubenswrapper[4763]: E1205 11:52:00.516928 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-tqrj4" podUID="bb34ace9-e703-4f05-aa04-eebc1d97e096" Dec 05 11:52:00 crc kubenswrapper[4763]: E1205 11:52:00.572149 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 05 11:52:00 crc kubenswrapper[4763]: E1205 11:52:00.572661 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rfbvn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-fc2x5_openshift-marketplace(d36cff6b-557a-4427-af48-e473b13bf117): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 11:52:00 crc kubenswrapper[4763]: E1205 11:52:00.574588 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-fc2x5" podUID="d36cff6b-557a-4427-af48-e473b13bf117" Dec 05 11:52:00 crc kubenswrapper[4763]: E1205 11:52:00.675501 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 05 11:52:00 crc kubenswrapper[4763]: E1205 11:52:00.676110 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wwpq8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-w42n2_openshift-marketplace(db19121e-789d-4b3b-9e1d-cba90e11f918): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 11:52:00 crc kubenswrapper[4763]: E1205 11:52:00.677591 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-w42n2" podUID="db19121e-789d-4b3b-9e1d-cba90e11f918" Dec 05 11:52:00 crc kubenswrapper[4763]: E1205 11:52:00.771223 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 05 11:52:00 crc kubenswrapper[4763]: E1205 11:52:00.771362 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4hh56,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-rtp7g_openshift-marketplace(84b15a6f-ad11-4681-be03-86c7a7f84320): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 11:52:00 crc kubenswrapper[4763]: E1205 11:52:00.772373 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 05 11:52:00 crc kubenswrapper[4763]: E1205 11:52:00.772681 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wwgtm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-lpk7r_openshift-marketplace(e37aacbd-2b65-4fe9-9874-38a7c585a300): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 11:52:00 crc kubenswrapper[4763]: E1205 11:52:00.773422 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-rtp7g" podUID="84b15a6f-ad11-4681-be03-86c7a7f84320" Dec 05 11:52:00 crc kubenswrapper[4763]: E1205 11:52:00.774274 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-lpk7r" podUID="e37aacbd-2b65-4fe9-9874-38a7c585a300" Dec 05 11:52:00 crc kubenswrapper[4763]: E1205 11:52:00.832841 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 05 11:52:00 crc kubenswrapper[4763]: E1205 11:52:00.833001 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ws9t9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-lzlbl_openshift-marketplace(ea6370e4-7a14-43a3-8ab0-c966df3c3e74): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 11:52:00 crc kubenswrapper[4763]: E1205 11:52:00.834175 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying 
config: context canceled\"" pod="openshift-marketplace/community-operators-lzlbl" podUID="ea6370e4-7a14-43a3-8ab0-c966df3c3e74" Dec 05 11:52:00 crc kubenswrapper[4763]: I1205 11:52:00.975330 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 05 11:52:00 crc kubenswrapper[4763]: W1205 11:52:00.988622 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod193ee10a_6394_4fb4_bffa_8155d1532b29.slice/crio-d53bf0bbccdd095fb7d6b7c2d79dec64eb2348f0a3459c0b89d29c3dd65ec3f3 WatchSource:0}: Error finding container d53bf0bbccdd095fb7d6b7c2d79dec64eb2348f0a3459c0b89d29c3dd65ec3f3: Status 404 returned error can't find the container with id d53bf0bbccdd095fb7d6b7c2d79dec64eb2348f0a3459c0b89d29c3dd65ec3f3 Dec 05 11:52:01 crc kubenswrapper[4763]: I1205 11:52:01.041516 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-x45qv"] Dec 05 11:52:01 crc kubenswrapper[4763]: W1205 11:52:01.056094 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda135c32b_38e4_43f6_bbb1_d1b8e42156ab.slice/crio-d2e29d7610dfba106f33cbab1c300aa4b7a5ea896daaf96a9b02c44c08f4c955 WatchSource:0}: Error finding container d2e29d7610dfba106f33cbab1c300aa4b7a5ea896daaf96a9b02c44c08f4c955: Status 404 returned error can't find the container with id d2e29d7610dfba106f33cbab1c300aa4b7a5ea896daaf96a9b02c44c08f4c955 Dec 05 11:52:01 crc kubenswrapper[4763]: I1205 11:52:01.058598 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 05 11:52:01 crc kubenswrapper[4763]: I1205 11:52:01.077883 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"193ee10a-6394-4fb4-bffa-8155d1532b29","Type":"ContainerStarted","Data":"d53bf0bbccdd095fb7d6b7c2d79dec64eb2348f0a3459c0b89d29c3dd65ec3f3"} Dec 05 11:52:01 crc kubenswrapper[4763]: I1205 11:52:01.080384 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-x45qv" event={"ID":"a135c32b-38e4-43f6-bbb1-d1b8e42156ab","Type":"ContainerStarted","Data":"d2e29d7610dfba106f33cbab1c300aa4b7a5ea896daaf96a9b02c44c08f4c955"} Dec 05 11:52:01 crc kubenswrapper[4763]: E1205 11:52:01.090218 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-lzlbl" podUID="ea6370e4-7a14-43a3-8ab0-c966df3c3e74" Dec 05 11:52:01 crc kubenswrapper[4763]: E1205 11:52:01.090353 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-rtp7g" podUID="84b15a6f-ad11-4681-be03-86c7a7f84320" Dec 05 11:52:01 crc kubenswrapper[4763]: E1205 11:52:01.090402 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-fc2x5" podUID="d36cff6b-557a-4427-af48-e473b13bf117" Dec 05 11:52:01 crc kubenswrapper[4763]: E1205 
11:52:01.090468 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-w42n2" podUID="db19121e-789d-4b3b-9e1d-cba90e11f918" Dec 05 11:52:01 crc kubenswrapper[4763]: E1205 11:52:01.090541 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-lpk7r" podUID="e37aacbd-2b65-4fe9-9874-38a7c585a300" Dec 05 11:52:02 crc kubenswrapper[4763]: I1205 11:52:02.096509 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-x45qv" event={"ID":"a135c32b-38e4-43f6-bbb1-d1b8e42156ab","Type":"ContainerStarted","Data":"e1a840da52b60b6480b9eafc2258c377e74c30c1ac8ae63f72837c84c19873c7"} Dec 05 11:52:02 crc kubenswrapper[4763]: I1205 11:52:02.096966 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-x45qv" event={"ID":"a135c32b-38e4-43f6-bbb1-d1b8e42156ab","Type":"ContainerStarted","Data":"647a601f0dcfcd17a4ee32574e78a6d2e6f2e41adec6f95ed1d7a6c29a7b4c81"} Dec 05 11:52:02 crc kubenswrapper[4763]: I1205 11:52:02.099262 4763 generic.go:334] "Generic (PLEG): container finished" podID="193ee10a-6394-4fb4-bffa-8155d1532b29" containerID="a0c5f9bc9c63882a9c2d0051ace1116a2f055517f3d9e965b329ffc177979f12" exitCode=0 Dec 05 11:52:02 crc kubenswrapper[4763]: I1205 11:52:02.099360 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"193ee10a-6394-4fb4-bffa-8155d1532b29","Type":"ContainerDied","Data":"a0c5f9bc9c63882a9c2d0051ace1116a2f055517f3d9e965b329ffc177979f12"} Dec 05 11:52:02 crc kubenswrapper[4763]: I1205 11:52:02.100610 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d19b7aa7-7fea-483f-93ba-2278dd8c9ee8","Type":"ContainerStarted","Data":"d20b5574026bfbb2937a03b75e6040961b5a51407446636183509686abd8adcf"} Dec 05 11:52:02 crc kubenswrapper[4763]: I1205 11:52:02.100651 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d19b7aa7-7fea-483f-93ba-2278dd8c9ee8","Type":"ContainerStarted","Data":"2458a502e0317d9c2ed927f2a2644211b12ba5c40132072aa3ed38989d91d6c4"} Dec 05 11:52:02 crc kubenswrapper[4763]: I1205 11:52:02.131580 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-x45qv" podStartSLOduration=187.131559485 podStartE2EDuration="3m7.131559485s" podCreationTimestamp="2025-12-05 11:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:52:02.112213554 +0000 UTC m=+206.604928277" watchObservedRunningTime="2025-12-05 11:52:02.131559485 +0000 UTC m=+206.624274228" Dec 05 11:52:02 crc kubenswrapper[4763]: I1205 11:52:02.148475 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=6.148452326 podStartE2EDuration="6.148452326s" podCreationTimestamp="2025-12-05 11:51:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:52:02.143841299 +0000 UTC m=+206.636556022" watchObservedRunningTime="2025-12-05 11:52:02.148452326 +0000 UTC m=+206.641167049" Dec 05 11:52:03 crc kubenswrapper[4763]: I1205 11:52:03.302403 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 11:52:03 crc kubenswrapper[4763]: I1205 11:52:03.504423 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/193ee10a-6394-4fb4-bffa-8155d1532b29-kubelet-dir\") pod \"193ee10a-6394-4fb4-bffa-8155d1532b29\" (UID: \"193ee10a-6394-4fb4-bffa-8155d1532b29\") " Dec 05 11:52:03 crc kubenswrapper[4763]: I1205 11:52:03.504487 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/193ee10a-6394-4fb4-bffa-8155d1532b29-kube-api-access\") pod \"193ee10a-6394-4fb4-bffa-8155d1532b29\" (UID: \"193ee10a-6394-4fb4-bffa-8155d1532b29\") " Dec 05 11:52:03 crc kubenswrapper[4763]: I1205 11:52:03.504555 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/193ee10a-6394-4fb4-bffa-8155d1532b29-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "193ee10a-6394-4fb4-bffa-8155d1532b29" (UID: "193ee10a-6394-4fb4-bffa-8155d1532b29"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 11:52:03 crc kubenswrapper[4763]: I1205 11:52:03.505127 4763 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/193ee10a-6394-4fb4-bffa-8155d1532b29-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 11:52:03 crc kubenswrapper[4763]: I1205 11:52:03.511381 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/193ee10a-6394-4fb4-bffa-8155d1532b29-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "193ee10a-6394-4fb4-bffa-8155d1532b29" (UID: "193ee10a-6394-4fb4-bffa-8155d1532b29"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:52:03 crc kubenswrapper[4763]: I1205 11:52:03.607378 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/193ee10a-6394-4fb4-bffa-8155d1532b29-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 11:52:04 crc kubenswrapper[4763]: I1205 11:52:04.111669 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"193ee10a-6394-4fb4-bffa-8155d1532b29","Type":"ContainerDied","Data":"d53bf0bbccdd095fb7d6b7c2d79dec64eb2348f0a3459c0b89d29c3dd65ec3f3"} Dec 05 11:52:04 crc kubenswrapper[4763]: I1205 11:52:04.111724 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d53bf0bbccdd095fb7d6b7c2d79dec64eb2348f0a3459c0b89d29c3dd65ec3f3" Dec 05 11:52:04 crc kubenswrapper[4763]: I1205 11:52:04.111871 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 11:52:07 crc kubenswrapper[4763]: I1205 11:52:07.544669 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 11:52:07 crc kubenswrapper[4763]: I1205 11:52:07.546967 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 11:52:07 crc kubenswrapper[4763]: I1205 11:52:07.547041 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" Dec 05 11:52:07 crc kubenswrapper[4763]: I1205 11:52:07.547597 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"eb8a3a71da68a51eabe1aaa24fffd1b84c35cb01945642622f4100125ee26223"} pod="openshift-machine-config-operator/machine-config-daemon-xpgln" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 11:52:07 crc kubenswrapper[4763]: I1205 11:52:07.547706 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" containerID="cri-o://eb8a3a71da68a51eabe1aaa24fffd1b84c35cb01945642622f4100125ee26223" gracePeriod=600 Dec 05 11:52:08 crc kubenswrapper[4763]: I1205 11:52:08.139446 4763 generic.go:334] "Generic (PLEG): container finished" podID="96338136-6831-49d0-9eb9-77d1205c6afb" containerID="eb8a3a71da68a51eabe1aaa24fffd1b84c35cb01945642622f4100125ee26223" exitCode=0 Dec 05 11:52:08 crc kubenswrapper[4763]: I1205 11:52:08.139510 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerDied","Data":"eb8a3a71da68a51eabe1aaa24fffd1b84c35cb01945642622f4100125ee26223"} Dec 05 11:52:08 crc kubenswrapper[4763]: I1205 11:52:08.140108 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerStarted","Data":"534307a53a349d3e6f626a6d8dc4de67404cbc863e94b63e58ef318db5a175f6"} Dec 05 11:52:09 crc kubenswrapper[4763]: I1205 11:52:09.153117 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzfbn" event={"ID":"61dde3bf-99ba-4d4f-bbbd-91ea145ac314","Type":"ContainerStarted","Data":"5073d4f9cb77f5317dc89a412cc7d01e2d7a2cbca91415e3a97d19e62d591413"} Dec 05 11:52:10 crc kubenswrapper[4763]: I1205 11:52:10.160180 4763 generic.go:334] "Generic (PLEG): container finished" podID="61dde3bf-99ba-4d4f-bbbd-91ea145ac314" containerID="5073d4f9cb77f5317dc89a412cc7d01e2d7a2cbca91415e3a97d19e62d591413" exitCode=0 Dec 05 11:52:10 crc kubenswrapper[4763]: I1205 11:52:10.160257 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzfbn" 
event={"ID":"61dde3bf-99ba-4d4f-bbbd-91ea145ac314","Type":"ContainerDied","Data":"5073d4f9cb77f5317dc89a412cc7d01e2d7a2cbca91415e3a97d19e62d591413"} Dec 05 11:52:12 crc kubenswrapper[4763]: I1205 11:52:12.171440 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzfbn" event={"ID":"61dde3bf-99ba-4d4f-bbbd-91ea145ac314","Type":"ContainerStarted","Data":"0294e4467f92722eac9e0129a2edcc0ee8b941472252d5cf9a77461483160b88"} Dec 05 11:52:12 crc kubenswrapper[4763]: I1205 11:52:12.193526 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nzfbn" podStartSLOduration=3.784782153 podStartE2EDuration="1m1.193506106s" podCreationTimestamp="2025-12-05 11:51:11 +0000 UTC" firstStartedPulling="2025-12-05 11:51:13.747408002 +0000 UTC m=+158.240122725" lastFinishedPulling="2025-12-05 11:52:11.156131955 +0000 UTC m=+215.648846678" observedRunningTime="2025-12-05 11:52:12.193014326 +0000 UTC m=+216.685729049" watchObservedRunningTime="2025-12-05 11:52:12.193506106 +0000 UTC m=+216.686220829" Dec 05 11:52:12 crc kubenswrapper[4763]: I1205 11:52:12.984706 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wndr8"] Dec 05 11:52:13 crc kubenswrapper[4763]: I1205 11:52:13.179025 4763 generic.go:334] "Generic (PLEG): container finished" podID="8212fb62-9829-4198-8833-0695b17d2a5d" containerID="7530132cc0caf3b7238e3396b5fda6df54210400b5e8f4f064767d3167e4b3de" exitCode=0 Dec 05 11:52:13 crc kubenswrapper[4763]: I1205 11:52:13.179100 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2hjj6" event={"ID":"8212fb62-9829-4198-8833-0695b17d2a5d","Type":"ContainerDied","Data":"7530132cc0caf3b7238e3396b5fda6df54210400b5e8f4f064767d3167e4b3de"} Dec 05 11:52:13 crc kubenswrapper[4763]: I1205 11:52:13.182542 4763 generic.go:334] "Generic (PLEG): container finished" podID="d36cff6b-557a-4427-af48-e473b13bf117" containerID="72efdd4dc1f909c188d0828810732db173a7897b615a01fd3c033b4bca564163" exitCode=0 Dec 05 11:52:13 crc kubenswrapper[4763]: I1205 11:52:13.182572 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fc2x5" event={"ID":"d36cff6b-557a-4427-af48-e473b13bf117","Type":"ContainerDied","Data":"72efdd4dc1f909c188d0828810732db173a7897b615a01fd3c033b4bca564163"} Dec 05 11:52:14 crc kubenswrapper[4763]: I1205 11:52:14.205688 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tqrj4" event={"ID":"bb34ace9-e703-4f05-aa04-eebc1d97e096","Type":"ContainerStarted","Data":"092cb6e6bcf15430fbec033d19a03eb69902d62e041e4620194bc5244f2a4c19"} Dec 05 11:52:14 crc kubenswrapper[4763]: I1205 11:52:14.209254 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fc2x5" event={"ID":"d36cff6b-557a-4427-af48-e473b13bf117","Type":"ContainerStarted","Data":"13259befda4ae317e2538f5cc4d9705367f9587a20e87f2125fece26b68ba2ce"} Dec 05 11:52:14 crc kubenswrapper[4763]: I1205 11:52:14.213923 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2hjj6" event={"ID":"8212fb62-9829-4198-8833-0695b17d2a5d","Type":"ContainerStarted","Data":"d14b2ba44bf02679885b2501b0b3047180d75616a7a2b3731e58406c1ee3bfe1"} Dec 05 11:52:14 crc kubenswrapper[4763]: I1205 11:52:14.258711 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-2hjj6" podStartSLOduration=4.277248836 podStartE2EDuration="1m6.258690462s" podCreationTimestamp="2025-12-05 11:51:08 +0000 UTC" firstStartedPulling="2025-12-05 11:51:11.66727363 +0000 UTC m=+156.159988353" lastFinishedPulling="2025-12-05 11:52:13.648715246 +0000 UTC m=+218.141429979" observedRunningTime="2025-12-05 11:52:14.255874484 +0000 UTC m=+218.748589207" watchObservedRunningTime="2025-12-05 11:52:14.258690462 +0000 UTC m=+218.751405185" Dec 05 11:52:15 crc kubenswrapper[4763]: I1205 11:52:15.222142 4763 generic.go:334] "Generic (PLEG): container finished" podID="bb34ace9-e703-4f05-aa04-eebc1d97e096" containerID="092cb6e6bcf15430fbec033d19a03eb69902d62e041e4620194bc5244f2a4c19" exitCode=0 Dec 05 11:52:15 crc kubenswrapper[4763]: I1205 11:52:15.222252 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tqrj4" event={"ID":"bb34ace9-e703-4f05-aa04-eebc1d97e096","Type":"ContainerDied","Data":"092cb6e6bcf15430fbec033d19a03eb69902d62e041e4620194bc5244f2a4c19"} Dec 05 11:52:15 crc kubenswrapper[4763]: I1205 11:52:15.224344 4763 generic.go:334] "Generic (PLEG): container finished" podID="ea6370e4-7a14-43a3-8ab0-c966df3c3e74" containerID="25f71cd2999282de820c4979562a6eabc3117f5e80333178c4a28039705e0d28" exitCode=0 Dec 05 11:52:15 crc kubenswrapper[4763]: I1205 11:52:15.224402 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzlbl" event={"ID":"ea6370e4-7a14-43a3-8ab0-c966df3c3e74","Type":"ContainerDied","Data":"25f71cd2999282de820c4979562a6eabc3117f5e80333178c4a28039705e0d28"} Dec 05 11:52:15 crc kubenswrapper[4763]: I1205 11:52:15.230103 4763 generic.go:334] "Generic (PLEG): container finished" podID="e37aacbd-2b65-4fe9-9874-38a7c585a300" containerID="79c9038b98d6c03019543d007a6359986abad32d93e62209238d5006af095ffc" exitCode=0 Dec 05 11:52:15 crc kubenswrapper[4763]: I1205 11:52:15.230151 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lpk7r" event={"ID":"e37aacbd-2b65-4fe9-9874-38a7c585a300","Type":"ContainerDied","Data":"79c9038b98d6c03019543d007a6359986abad32d93e62209238d5006af095ffc"} Dec 05 11:52:15 crc kubenswrapper[4763]: I1205 11:52:15.232826 4763 generic.go:334] "Generic (PLEG): container finished" podID="db19121e-789d-4b3b-9e1d-cba90e11f918" containerID="5b89c4336c5a5db159db1bfc975ff12da1e662e945f4752742e2b58548ee88ee" exitCode=0 Dec 05 11:52:15 crc kubenswrapper[4763]: I1205 11:52:15.232872 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w42n2" event={"ID":"db19121e-789d-4b3b-9e1d-cba90e11f918","Type":"ContainerDied","Data":"5b89c4336c5a5db159db1bfc975ff12da1e662e945f4752742e2b58548ee88ee"} Dec 05 11:52:15 crc kubenswrapper[4763]: I1205 11:52:15.243178 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fc2x5" podStartSLOduration=3.280675504 podStartE2EDuration="1m5.243155989s" podCreationTimestamp="2025-12-05 11:51:10 +0000 UTC" firstStartedPulling="2025-12-05 11:51:11.669921968 +0000 UTC m=+156.162636691" lastFinishedPulling="2025-12-05 11:52:13.632402453 +0000 UTC m=+218.125117176" observedRunningTime="2025-12-05 11:52:14.285861289 +0000 UTC m=+218.778576012" watchObservedRunningTime="2025-12-05 11:52:15.243155989 +0000 UTC m=+219.735870712" Dec 05 11:52:16 crc kubenswrapper[4763]: I1205 11:52:16.243621 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-tqrj4" event={"ID":"bb34ace9-e703-4f05-aa04-eebc1d97e096","Type":"ContainerStarted","Data":"98b5b288457337bed1f25232baa199f45a2784a38117cb96c04f54847d6a2902"} Dec 05 11:52:16 crc kubenswrapper[4763]: I1205 11:52:16.245996 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzlbl" event={"ID":"ea6370e4-7a14-43a3-8ab0-c966df3c3e74","Type":"ContainerStarted","Data":"fcaa1f2b8e088e42ec6dbace61f84ab7bddb5ff1af5b329d84d7c0846b2c6f81"} Dec 05 11:52:16 crc kubenswrapper[4763]: I1205 11:52:16.248167 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lpk7r" event={"ID":"e37aacbd-2b65-4fe9-9874-38a7c585a300","Type":"ContainerStarted","Data":"430864b9a9a363dc8a1d754b411529fb351e404d3af3aec6937fd7e1023ba29a"} Dec 05 11:52:16 crc kubenswrapper[4763]: I1205 11:52:16.249997 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w42n2" event={"ID":"db19121e-789d-4b3b-9e1d-cba90e11f918","Type":"ContainerStarted","Data":"136c2ed4afbfeb7e4526f74604ea5315076a999e66d855dd5c0a792c6e525710"} Dec 05 11:52:16 crc kubenswrapper[4763]: I1205 11:52:16.271509 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tqrj4" podStartSLOduration=3.320368614 podStartE2EDuration="1m8.271483084s" podCreationTimestamp="2025-12-05 11:51:08 +0000 UTC" firstStartedPulling="2025-12-05 11:51:10.651814866 +0000 UTC m=+155.144529589" lastFinishedPulling="2025-12-05 11:52:15.602929336 +0000 UTC m=+220.095644059" observedRunningTime="2025-12-05 11:52:16.268108285 +0000 UTC m=+220.760823008" watchObservedRunningTime="2025-12-05 11:52:16.271483084 +0000 UTC m=+220.764197807" Dec 05 11:52:16 crc kubenswrapper[4763]: I1205 11:52:16.293455 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lpk7r" podStartSLOduration=3.263971577 podStartE2EDuration="1m8.293434463s" podCreationTimestamp="2025-12-05 11:51:08 +0000 UTC" firstStartedPulling="2025-12-05 11:51:10.644668566 +0000 UTC m=+155.137383289" lastFinishedPulling="2025-12-05 11:52:15.674131452 +0000 UTC m=+220.166846175" observedRunningTime="2025-12-05 11:52:16.292433314 +0000 UTC m=+220.785148047" watchObservedRunningTime="2025-12-05 11:52:16.293434463 +0000 UTC m=+220.786149176" Dec 05 11:52:16 crc kubenswrapper[4763]: I1205 11:52:16.332906 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w42n2" podStartSLOduration=2.323879019 podStartE2EDuration="1m4.332878954s" podCreationTimestamp="2025-12-05 11:51:12 +0000 UTC" firstStartedPulling="2025-12-05 11:51:13.756235853 +0000 UTC m=+158.248950576" lastFinishedPulling="2025-12-05 11:52:15.765235788 +0000 UTC m=+220.257950511" observedRunningTime="2025-12-05 11:52:16.316684292 +0000 UTC m=+220.809399025" watchObservedRunningTime="2025-12-05 11:52:16.332878954 +0000 UTC m=+220.825593677" Dec 05 11:52:16 crc kubenswrapper[4763]: I1205 11:52:16.350794 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lzlbl" podStartSLOduration=4.406187341 podStartE2EDuration="1m8.350773426s" podCreationTimestamp="2025-12-05 11:51:08 +0000 UTC" firstStartedPulling="2025-12-05 11:51:11.689689626 +0000 UTC m=+156.182404349" lastFinishedPulling="2025-12-05 11:52:15.634275711 +0000 UTC m=+220.126990434" 
observedRunningTime="2025-12-05 11:52:16.345282618 +0000 UTC m=+220.837997341" watchObservedRunningTime="2025-12-05 11:52:16.350773426 +0000 UTC m=+220.843488159" Dec 05 11:52:17 crc kubenswrapper[4763]: I1205 11:52:17.256186 4763 generic.go:334] "Generic (PLEG): container finished" podID="84b15a6f-ad11-4681-be03-86c7a7f84320" containerID="deabe161e445d319a6e1c928c468a2e231eab6a941af31143ade0657dcde00d6" exitCode=0 Dec 05 11:52:17 crc kubenswrapper[4763]: I1205 11:52:17.256268 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rtp7g" event={"ID":"84b15a6f-ad11-4681-be03-86c7a7f84320","Type":"ContainerDied","Data":"deabe161e445d319a6e1c928c468a2e231eab6a941af31143ade0657dcde00d6"} Dec 05 11:52:18 crc kubenswrapper[4763]: I1205 11:52:18.263881 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rtp7g" event={"ID":"84b15a6f-ad11-4681-be03-86c7a7f84320","Type":"ContainerStarted","Data":"e3b98e7c38caf55ad174a09c8d97ae3dd926cf621355f19059c1540c9dabd457"} Dec 05 11:52:18 crc kubenswrapper[4763]: I1205 11:52:18.298986 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rtp7g" podStartSLOduration=2.173157937 podStartE2EDuration="1m8.298966999s" podCreationTimestamp="2025-12-05 11:51:10 +0000 UTC" firstStartedPulling="2025-12-05 11:51:11.676810366 +0000 UTC m=+156.169525089" lastFinishedPulling="2025-12-05 11:52:17.802619428 +0000 UTC m=+222.295334151" observedRunningTime="2025-12-05 11:52:18.29783987 +0000 UTC m=+222.790554593" watchObservedRunningTime="2025-12-05 11:52:18.298966999 +0000 UTC m=+222.791681722" Dec 05 11:52:19 crc kubenswrapper[4763]: I1205 11:52:19.006539 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lpk7r" Dec 05 11:52:19 crc kubenswrapper[4763]: I1205 11:52:19.006586 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lpk7r" Dec 05 11:52:19 crc kubenswrapper[4763]: I1205 11:52:19.069003 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lpk7r" Dec 05 11:52:19 crc kubenswrapper[4763]: I1205 11:52:19.334948 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tqrj4" Dec 05 11:52:19 crc kubenswrapper[4763]: I1205 11:52:19.334995 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tqrj4" Dec 05 11:52:19 crc kubenswrapper[4763]: I1205 11:52:19.368817 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tqrj4" Dec 05 11:52:19 crc kubenswrapper[4763]: I1205 11:52:19.515842 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lzlbl" Dec 05 11:52:19 crc kubenswrapper[4763]: I1205 11:52:19.515875 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lzlbl" Dec 05 11:52:19 crc kubenswrapper[4763]: I1205 11:52:19.515885 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2hjj6" Dec 05 11:52:19 crc kubenswrapper[4763]: I1205 11:52:19.515899 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-2hjj6" Dec 05 11:52:19 crc kubenswrapper[4763]: I1205 11:52:19.556613 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2hjj6" Dec 05 11:52:19 crc kubenswrapper[4763]: I1205 11:52:19.570419 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lzlbl" Dec 05 11:52:20 crc kubenswrapper[4763]: I1205 11:52:20.316473 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tqrj4" Dec 05 11:52:20 crc kubenswrapper[4763]: I1205 11:52:20.320465 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2hjj6" Dec 05 11:52:20 crc kubenswrapper[4763]: I1205 11:52:20.330792 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lzlbl" Dec 05 11:52:20 crc kubenswrapper[4763]: I1205 11:52:20.699352 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rtp7g" Dec 05 11:52:20 crc kubenswrapper[4763]: I1205 11:52:20.699707 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rtp7g" Dec 05 11:52:20 crc kubenswrapper[4763]: I1205 11:52:20.734730 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rtp7g" Dec 05 11:52:21 crc kubenswrapper[4763]: I1205 11:52:21.132715 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fc2x5" Dec 05 11:52:21 crc kubenswrapper[4763]: I1205 11:52:21.132932 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fc2x5" Dec 05 11:52:21 crc kubenswrapper[4763]: I1205 11:52:21.174103 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fc2x5" Dec 05 11:52:21 crc kubenswrapper[4763]: I1205 11:52:21.321637 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fc2x5" Dec 05 11:52:22 crc kubenswrapper[4763]: I1205 11:52:22.014022 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fc2x5"] Dec 05 11:52:22 crc kubenswrapper[4763]: I1205 11:52:22.121834 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nzfbn" Dec 05 11:52:22 crc kubenswrapper[4763]: I1205 11:52:22.121899 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nzfbn" Dec 05 11:52:22 crc kubenswrapper[4763]: I1205 11:52:22.171268 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nzfbn" Dec 05 11:52:22 crc kubenswrapper[4763]: I1205 11:52:22.213872 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2hjj6"] Dec 05 11:52:22 crc kubenswrapper[4763]: I1205 11:52:22.281810 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2hjj6" podUID="8212fb62-9829-4198-8833-0695b17d2a5d" containerName="registry-server" 
containerID="cri-o://d14b2ba44bf02679885b2501b0b3047180d75616a7a2b3731e58406c1ee3bfe1" gracePeriod=2 Dec 05 11:52:22 crc kubenswrapper[4763]: I1205 11:52:22.319688 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nzfbn" Dec 05 11:52:22 crc kubenswrapper[4763]: I1205 11:52:22.507127 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w42n2" Dec 05 11:52:22 crc kubenswrapper[4763]: I1205 11:52:22.507196 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w42n2" Dec 05 11:52:22 crc kubenswrapper[4763]: I1205 11:52:22.553437 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w42n2" Dec 05 11:52:23 crc kubenswrapper[4763]: I1205 11:52:23.285594 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fc2x5" podUID="d36cff6b-557a-4427-af48-e473b13bf117" containerName="registry-server" containerID="cri-o://13259befda4ae317e2538f5cc4d9705367f9587a20e87f2125fece26b68ba2ce" gracePeriod=2 Dec 05 11:52:23 crc kubenswrapper[4763]: I1205 11:52:23.333020 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w42n2" Dec 05 11:52:24 crc kubenswrapper[4763]: I1205 11:52:24.416684 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tqrj4"] Dec 05 11:52:24 crc kubenswrapper[4763]: I1205 11:52:24.417260 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tqrj4" podUID="bb34ace9-e703-4f05-aa04-eebc1d97e096" containerName="registry-server" containerID="cri-o://98b5b288457337bed1f25232baa199f45a2784a38117cb96c04f54847d6a2902" gracePeriod=2 Dec 05 11:52:26 crc kubenswrapper[4763]: I1205 11:52:26.307943 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2hjj6_8212fb62-9829-4198-8833-0695b17d2a5d/registry-server/0.log" Dec 05 11:52:26 crc kubenswrapper[4763]: I1205 11:52:26.308629 4763 generic.go:334] "Generic (PLEG): container finished" podID="8212fb62-9829-4198-8833-0695b17d2a5d" containerID="d14b2ba44bf02679885b2501b0b3047180d75616a7a2b3731e58406c1ee3bfe1" exitCode=137 Dec 05 11:52:26 crc kubenswrapper[4763]: I1205 11:52:26.308698 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2hjj6" event={"ID":"8212fb62-9829-4198-8833-0695b17d2a5d","Type":"ContainerDied","Data":"d14b2ba44bf02679885b2501b0b3047180d75616a7a2b3731e58406c1ee3bfe1"} Dec 05 11:52:26 crc kubenswrapper[4763]: I1205 11:52:26.310098 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fc2x5_d36cff6b-557a-4427-af48-e473b13bf117/registry-server/0.log" Dec 05 11:52:26 crc kubenswrapper[4763]: I1205 11:52:26.310891 4763 generic.go:334] "Generic (PLEG): container finished" podID="d36cff6b-557a-4427-af48-e473b13bf117" containerID="13259befda4ae317e2538f5cc4d9705367f9587a20e87f2125fece26b68ba2ce" exitCode=137 Dec 05 11:52:26 crc kubenswrapper[4763]: I1205 11:52:26.310943 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fc2x5" 
event={"ID":"d36cff6b-557a-4427-af48-e473b13bf117","Type":"ContainerDied","Data":"13259befda4ae317e2538f5cc4d9705367f9587a20e87f2125fece26b68ba2ce"} Dec 05 11:52:26 crc kubenswrapper[4763]: I1205 11:52:26.593495 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tqrj4" Dec 05 11:52:26 crc kubenswrapper[4763]: I1205 11:52:26.715306 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb34ace9-e703-4f05-aa04-eebc1d97e096-utilities\") pod \"bb34ace9-e703-4f05-aa04-eebc1d97e096\" (UID: \"bb34ace9-e703-4f05-aa04-eebc1d97e096\") " Dec 05 11:52:26 crc kubenswrapper[4763]: I1205 11:52:26.715349 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb34ace9-e703-4f05-aa04-eebc1d97e096-catalog-content\") pod \"bb34ace9-e703-4f05-aa04-eebc1d97e096\" (UID: \"bb34ace9-e703-4f05-aa04-eebc1d97e096\") " Dec 05 11:52:26 crc kubenswrapper[4763]: I1205 11:52:26.715422 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m52p\" (UniqueName: \"kubernetes.io/projected/bb34ace9-e703-4f05-aa04-eebc1d97e096-kube-api-access-5m52p\") pod \"bb34ace9-e703-4f05-aa04-eebc1d97e096\" (UID: \"bb34ace9-e703-4f05-aa04-eebc1d97e096\") " Dec 05 11:52:26 crc kubenswrapper[4763]: I1205 11:52:26.717079 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb34ace9-e703-4f05-aa04-eebc1d97e096-utilities" (OuterVolumeSpecName: "utilities") pod "bb34ace9-e703-4f05-aa04-eebc1d97e096" (UID: "bb34ace9-e703-4f05-aa04-eebc1d97e096"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 11:52:26 crc kubenswrapper[4763]: I1205 11:52:26.717299 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fc2x5_d36cff6b-557a-4427-af48-e473b13bf117/registry-server/0.log" Dec 05 11:52:26 crc kubenswrapper[4763]: I1205 11:52:26.718409 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fc2x5" Dec 05 11:52:26 crc kubenswrapper[4763]: I1205 11:52:26.721485 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb34ace9-e703-4f05-aa04-eebc1d97e096-kube-api-access-5m52p" (OuterVolumeSpecName: "kube-api-access-5m52p") pod "bb34ace9-e703-4f05-aa04-eebc1d97e096" (UID: "bb34ace9-e703-4f05-aa04-eebc1d97e096"). InnerVolumeSpecName "kube-api-access-5m52p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:52:26 crc kubenswrapper[4763]: I1205 11:52:26.782744 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb34ace9-e703-4f05-aa04-eebc1d97e096-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb34ace9-e703-4f05-aa04-eebc1d97e096" (UID: "bb34ace9-e703-4f05-aa04-eebc1d97e096"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 11:52:26 crc kubenswrapper[4763]: I1205 11:52:26.815609 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w42n2"] Dec 05 11:52:26 crc kubenswrapper[4763]: I1205 11:52:26.815849 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w42n2" podUID="db19121e-789d-4b3b-9e1d-cba90e11f918" containerName="registry-server" containerID="cri-o://136c2ed4afbfeb7e4526f74604ea5315076a999e66d855dd5c0a792c6e525710" gracePeriod=2 Dec 05 11:52:26 crc kubenswrapper[4763]: I1205 11:52:26.816019 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfbvn\" (UniqueName: \"kubernetes.io/projected/d36cff6b-557a-4427-af48-e473b13bf117-kube-api-access-rfbvn\") pod \"d36cff6b-557a-4427-af48-e473b13bf117\" (UID: \"d36cff6b-557a-4427-af48-e473b13bf117\") " Dec 05 11:52:26 crc kubenswrapper[4763]: I1205 11:52:26.816195 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d36cff6b-557a-4427-af48-e473b13bf117-utilities\") pod \"d36cff6b-557a-4427-af48-e473b13bf117\" (UID: \"d36cff6b-557a-4427-af48-e473b13bf117\") " Dec 05 11:52:26 crc kubenswrapper[4763]: I1205 11:52:26.816347 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d36cff6b-557a-4427-af48-e473b13bf117-catalog-content\") pod \"d36cff6b-557a-4427-af48-e473b13bf117\" (UID: \"d36cff6b-557a-4427-af48-e473b13bf117\") " Dec 05 11:52:26 crc kubenswrapper[4763]: I1205 11:52:26.816638 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb34ace9-e703-4f05-aa04-eebc1d97e096-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 11:52:26 crc kubenswrapper[4763]: I1205 11:52:26.816654 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5m52p\" (UniqueName: \"kubernetes.io/projected/bb34ace9-e703-4f05-aa04-eebc1d97e096-kube-api-access-5m52p\") on node \"crc\" DevicePath \"\"" Dec 05 11:52:26 crc kubenswrapper[4763]: I1205 11:52:26.816667 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb34ace9-e703-4f05-aa04-eebc1d97e096-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 11:52:26 crc kubenswrapper[4763]: I1205 11:52:26.816946 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d36cff6b-557a-4427-af48-e473b13bf117-utilities" (OuterVolumeSpecName: "utilities") pod "d36cff6b-557a-4427-af48-e473b13bf117" (UID: "d36cff6b-557a-4427-af48-e473b13bf117"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 11:52:26 crc kubenswrapper[4763]: I1205 11:52:26.819928 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d36cff6b-557a-4427-af48-e473b13bf117-kube-api-access-rfbvn" (OuterVolumeSpecName: "kube-api-access-rfbvn") pod "d36cff6b-557a-4427-af48-e473b13bf117" (UID: "d36cff6b-557a-4427-af48-e473b13bf117"). InnerVolumeSpecName "kube-api-access-rfbvn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:52:26 crc kubenswrapper[4763]: I1205 11:52:26.836550 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d36cff6b-557a-4427-af48-e473b13bf117-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d36cff6b-557a-4427-af48-e473b13bf117" (UID: "d36cff6b-557a-4427-af48-e473b13bf117"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 11:52:26 crc kubenswrapper[4763]: I1205 11:52:26.917470 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d36cff6b-557a-4427-af48-e473b13bf117-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 11:52:26 crc kubenswrapper[4763]: I1205 11:52:26.917506 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfbvn\" (UniqueName: \"kubernetes.io/projected/d36cff6b-557a-4427-af48-e473b13bf117-kube-api-access-rfbvn\") on node \"crc\" DevicePath \"\"" Dec 05 11:52:26 crc kubenswrapper[4763]: I1205 11:52:26.917521 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d36cff6b-557a-4427-af48-e473b13bf117-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 11:52:27 crc kubenswrapper[4763]: I1205 11:52:27.318269 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fc2x5_d36cff6b-557a-4427-af48-e473b13bf117/registry-server/0.log" Dec 05 11:52:27 crc kubenswrapper[4763]: I1205 11:52:27.319336 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fc2x5" event={"ID":"d36cff6b-557a-4427-af48-e473b13bf117","Type":"ContainerDied","Data":"707abf1d85aef326fd0bc88a5757c8bbecb7d53f97b6c8fd6619242f1d2ca5dc"} Dec 05 11:52:27 crc kubenswrapper[4763]: I1205 11:52:27.319367 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fc2x5" Dec 05 11:52:27 crc kubenswrapper[4763]: I1205 11:52:27.319545 4763 scope.go:117] "RemoveContainer" containerID="13259befda4ae317e2538f5cc4d9705367f9587a20e87f2125fece26b68ba2ce" Dec 05 11:52:27 crc kubenswrapper[4763]: I1205 11:52:27.322876 4763 generic.go:334] "Generic (PLEG): container finished" podID="bb34ace9-e703-4f05-aa04-eebc1d97e096" containerID="98b5b288457337bed1f25232baa199f45a2784a38117cb96c04f54847d6a2902" exitCode=0 Dec 05 11:52:27 crc kubenswrapper[4763]: I1205 11:52:27.322933 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tqrj4" event={"ID":"bb34ace9-e703-4f05-aa04-eebc1d97e096","Type":"ContainerDied","Data":"98b5b288457337bed1f25232baa199f45a2784a38117cb96c04f54847d6a2902"} Dec 05 11:52:27 crc kubenswrapper[4763]: I1205 11:52:27.322977 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tqrj4" event={"ID":"bb34ace9-e703-4f05-aa04-eebc1d97e096","Type":"ContainerDied","Data":"869b6957bfbd33d3a34e683316227001bea6c14cbb71390e05daac30e716699c"} Dec 05 11:52:27 crc kubenswrapper[4763]: I1205 11:52:27.323009 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tqrj4" Dec 05 11:52:27 crc kubenswrapper[4763]: I1205 11:52:27.338620 4763 scope.go:117] "RemoveContainer" containerID="72efdd4dc1f909c188d0828810732db173a7897b615a01fd3c033b4bca564163" Dec 05 11:52:27 crc kubenswrapper[4763]: I1205 11:52:27.354101 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fc2x5"] Dec 05 11:52:27 crc kubenswrapper[4763]: I1205 11:52:27.356631 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fc2x5"] Dec 05 11:52:27 crc kubenswrapper[4763]: I1205 11:52:27.366734 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tqrj4"] Dec 05 11:52:27 crc kubenswrapper[4763]: I1205 11:52:27.372262 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tqrj4"] Dec 05 11:52:27 crc kubenswrapper[4763]: I1205 11:52:27.375181 4763 scope.go:117] "RemoveContainer" containerID="3e967b26069599d410992cf3fdcee157876146103f9c31b5ea0b4bf37e239190" Dec 05 11:52:27 crc kubenswrapper[4763]: I1205 11:52:27.404148 4763 scope.go:117] "RemoveContainer" containerID="98b5b288457337bed1f25232baa199f45a2784a38117cb96c04f54847d6a2902" Dec 05 11:52:27 crc kubenswrapper[4763]: I1205 11:52:27.415516 4763 scope.go:117] "RemoveContainer" containerID="092cb6e6bcf15430fbec033d19a03eb69902d62e041e4620194bc5244f2a4c19" Dec 05 11:52:27 crc kubenswrapper[4763]: I1205 11:52:27.427648 4763 scope.go:117] "RemoveContainer" containerID="84d19caa70e9a0a3d0d884c480db560bea76158425ec17e488322ed922edc35c" Dec 05 11:52:27 crc kubenswrapper[4763]: I1205 11:52:27.442582 4763 scope.go:117] "RemoveContainer" containerID="98b5b288457337bed1f25232baa199f45a2784a38117cb96c04f54847d6a2902" Dec 05 11:52:27 crc kubenswrapper[4763]: E1205 11:52:27.443116 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98b5b288457337bed1f25232baa199f45a2784a38117cb96c04f54847d6a2902\": container with ID starting with 98b5b288457337bed1f25232baa199f45a2784a38117cb96c04f54847d6a2902 not found: ID does not exist" containerID="98b5b288457337bed1f25232baa199f45a2784a38117cb96c04f54847d6a2902" Dec 05 11:52:27 crc kubenswrapper[4763]: I1205 11:52:27.443144 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98b5b288457337bed1f25232baa199f45a2784a38117cb96c04f54847d6a2902"} err="failed to get container status \"98b5b288457337bed1f25232baa199f45a2784a38117cb96c04f54847d6a2902\": rpc error: code = NotFound desc = could not find container \"98b5b288457337bed1f25232baa199f45a2784a38117cb96c04f54847d6a2902\": container with ID starting with 98b5b288457337bed1f25232baa199f45a2784a38117cb96c04f54847d6a2902 not found: ID does not exist" Dec 05 11:52:27 crc kubenswrapper[4763]: I1205 11:52:27.443164 4763 scope.go:117] "RemoveContainer" containerID="092cb6e6bcf15430fbec033d19a03eb69902d62e041e4620194bc5244f2a4c19" Dec 05 11:52:27 crc kubenswrapper[4763]: E1205 11:52:27.443539 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"092cb6e6bcf15430fbec033d19a03eb69902d62e041e4620194bc5244f2a4c19\": container with ID starting with 092cb6e6bcf15430fbec033d19a03eb69902d62e041e4620194bc5244f2a4c19 not found: ID does not exist" containerID="092cb6e6bcf15430fbec033d19a03eb69902d62e041e4620194bc5244f2a4c19" Dec 05 
11:52:27 crc kubenswrapper[4763]: I1205 11:52:27.443597 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"092cb6e6bcf15430fbec033d19a03eb69902d62e041e4620194bc5244f2a4c19"} err="failed to get container status \"092cb6e6bcf15430fbec033d19a03eb69902d62e041e4620194bc5244f2a4c19\": rpc error: code = NotFound desc = could not find container \"092cb6e6bcf15430fbec033d19a03eb69902d62e041e4620194bc5244f2a4c19\": container with ID starting with 092cb6e6bcf15430fbec033d19a03eb69902d62e041e4620194bc5244f2a4c19 not found: ID does not exist" Dec 05 11:52:27 crc kubenswrapper[4763]: I1205 11:52:27.443624 4763 scope.go:117] "RemoveContainer" containerID="84d19caa70e9a0a3d0d884c480db560bea76158425ec17e488322ed922edc35c" Dec 05 11:52:27 crc kubenswrapper[4763]: E1205 11:52:27.444134 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84d19caa70e9a0a3d0d884c480db560bea76158425ec17e488322ed922edc35c\": container with ID starting with 84d19caa70e9a0a3d0d884c480db560bea76158425ec17e488322ed922edc35c not found: ID does not exist" containerID="84d19caa70e9a0a3d0d884c480db560bea76158425ec17e488322ed922edc35c" Dec 05 11:52:27 crc kubenswrapper[4763]: I1205 11:52:27.444180 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84d19caa70e9a0a3d0d884c480db560bea76158425ec17e488322ed922edc35c"} err="failed to get container status \"84d19caa70e9a0a3d0d884c480db560bea76158425ec17e488322ed922edc35c\": rpc error: code = NotFound desc = could not find container \"84d19caa70e9a0a3d0d884c480db560bea76158425ec17e488322ed922edc35c\": container with ID starting with 84d19caa70e9a0a3d0d884c480db560bea76158425ec17e488322ed922edc35c not found: ID does not exist" Dec 05 11:52:27 crc kubenswrapper[4763]: I1205 11:52:27.789053 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb34ace9-e703-4f05-aa04-eebc1d97e096" path="/var/lib/kubelet/pods/bb34ace9-e703-4f05-aa04-eebc1d97e096/volumes" Dec 05 11:52:27 crc kubenswrapper[4763]: I1205 11:52:27.789691 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d36cff6b-557a-4427-af48-e473b13bf117" path="/var/lib/kubelet/pods/d36cff6b-557a-4427-af48-e473b13bf117/volumes" Dec 05 11:52:28 crc kubenswrapper[4763]: I1205 11:52:28.988680 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2hjj6_8212fb62-9829-4198-8833-0695b17d2a5d/registry-server/0.log" Dec 05 11:52:28 crc kubenswrapper[4763]: I1205 11:52:28.991777 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2hjj6" Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.051127 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lpk7r" Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.143334 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8212fb62-9829-4198-8833-0695b17d2a5d-utilities\") pod \"8212fb62-9829-4198-8833-0695b17d2a5d\" (UID: \"8212fb62-9829-4198-8833-0695b17d2a5d\") " Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.143574 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8212fb62-9829-4198-8833-0695b17d2a5d-catalog-content\") pod \"8212fb62-9829-4198-8833-0695b17d2a5d\" (UID: \"8212fb62-9829-4198-8833-0695b17d2a5d\") " Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.143690 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbc58\" (UniqueName: \"kubernetes.io/projected/8212fb62-9829-4198-8833-0695b17d2a5d-kube-api-access-hbc58\") pod \"8212fb62-9829-4198-8833-0695b17d2a5d\" (UID: \"8212fb62-9829-4198-8833-0695b17d2a5d\") " Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.144910 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8212fb62-9829-4198-8833-0695b17d2a5d-utilities" (OuterVolumeSpecName: "utilities") pod "8212fb62-9829-4198-8833-0695b17d2a5d" (UID: "8212fb62-9829-4198-8833-0695b17d2a5d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.151735 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8212fb62-9829-4198-8833-0695b17d2a5d-kube-api-access-hbc58" (OuterVolumeSpecName: "kube-api-access-hbc58") pod "8212fb62-9829-4198-8833-0695b17d2a5d" (UID: "8212fb62-9829-4198-8833-0695b17d2a5d"). InnerVolumeSpecName "kube-api-access-hbc58". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.177246 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w42n2" Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.206025 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8212fb62-9829-4198-8833-0695b17d2a5d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8212fb62-9829-4198-8833-0695b17d2a5d" (UID: "8212fb62-9829-4198-8833-0695b17d2a5d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.245920 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8212fb62-9829-4198-8833-0695b17d2a5d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.245969 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbc58\" (UniqueName: \"kubernetes.io/projected/8212fb62-9829-4198-8833-0695b17d2a5d-kube-api-access-hbc58\") on node \"crc\" DevicePath \"\"" Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.245987 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8212fb62-9829-4198-8833-0695b17d2a5d-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.337057 4763 generic.go:334] "Generic (PLEG): container finished" podID="db19121e-789d-4b3b-9e1d-cba90e11f918" containerID="136c2ed4afbfeb7e4526f74604ea5315076a999e66d855dd5c0a792c6e525710" exitCode=0 Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.337116 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w42n2" Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.337152 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w42n2" event={"ID":"db19121e-789d-4b3b-9e1d-cba90e11f918","Type":"ContainerDied","Data":"136c2ed4afbfeb7e4526f74604ea5315076a999e66d855dd5c0a792c6e525710"} Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.337189 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w42n2" event={"ID":"db19121e-789d-4b3b-9e1d-cba90e11f918","Type":"ContainerDied","Data":"e1c364f24f37ccdc5a431fd72892b8ee4a7c83e492f59a47117adeda6976ab81"} Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.337214 4763 scope.go:117] "RemoveContainer" containerID="136c2ed4afbfeb7e4526f74604ea5315076a999e66d855dd5c0a792c6e525710" Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.338876 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2hjj6_8212fb62-9829-4198-8833-0695b17d2a5d/registry-server/0.log" Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.339689 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2hjj6" event={"ID":"8212fb62-9829-4198-8833-0695b17d2a5d","Type":"ContainerDied","Data":"767663780b9e0dececfe16f243bf558d7aa9be56db4728787b39f90e1253ae68"} Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.339890 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2hjj6" Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.346587 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db19121e-789d-4b3b-9e1d-cba90e11f918-utilities\") pod \"db19121e-789d-4b3b-9e1d-cba90e11f918\" (UID: \"db19121e-789d-4b3b-9e1d-cba90e11f918\") " Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.347034 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db19121e-789d-4b3b-9e1d-cba90e11f918-catalog-content\") pod \"db19121e-789d-4b3b-9e1d-cba90e11f918\" (UID: \"db19121e-789d-4b3b-9e1d-cba90e11f918\") " Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.347104 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwpq8\" (UniqueName: \"kubernetes.io/projected/db19121e-789d-4b3b-9e1d-cba90e11f918-kube-api-access-wwpq8\") pod \"db19121e-789d-4b3b-9e1d-cba90e11f918\" (UID: \"db19121e-789d-4b3b-9e1d-cba90e11f918\") " Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.348421 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db19121e-789d-4b3b-9e1d-cba90e11f918-utilities" (OuterVolumeSpecName: "utilities") pod "db19121e-789d-4b3b-9e1d-cba90e11f918" (UID: "db19121e-789d-4b3b-9e1d-cba90e11f918"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.349876 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db19121e-789d-4b3b-9e1d-cba90e11f918-kube-api-access-wwpq8" (OuterVolumeSpecName: "kube-api-access-wwpq8") pod "db19121e-789d-4b3b-9e1d-cba90e11f918" (UID: "db19121e-789d-4b3b-9e1d-cba90e11f918"). InnerVolumeSpecName "kube-api-access-wwpq8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.360298 4763 scope.go:117] "RemoveContainer" containerID="5b89c4336c5a5db159db1bfc975ff12da1e662e945f4752742e2b58548ee88ee" Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.390741 4763 scope.go:117] "RemoveContainer" containerID="b191d235a113341aceb79780fe1cd69ab04853bac52a11afdab6f4bf55123a38" Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.391603 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2hjj6"] Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.399217 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2hjj6"] Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.424395 4763 scope.go:117] "RemoveContainer" containerID="136c2ed4afbfeb7e4526f74604ea5315076a999e66d855dd5c0a792c6e525710" Dec 05 11:52:29 crc kubenswrapper[4763]: E1205 11:52:29.425867 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"136c2ed4afbfeb7e4526f74604ea5315076a999e66d855dd5c0a792c6e525710\": container with ID starting with 136c2ed4afbfeb7e4526f74604ea5315076a999e66d855dd5c0a792c6e525710 not found: ID does not exist" containerID="136c2ed4afbfeb7e4526f74604ea5315076a999e66d855dd5c0a792c6e525710" Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.425911 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"136c2ed4afbfeb7e4526f74604ea5315076a999e66d855dd5c0a792c6e525710"} err="failed to get container status \"136c2ed4afbfeb7e4526f74604ea5315076a999e66d855dd5c0a792c6e525710\": rpc error: code = NotFound desc = could not find container \"136c2ed4afbfeb7e4526f74604ea5315076a999e66d855dd5c0a792c6e525710\": container with ID starting with 136c2ed4afbfeb7e4526f74604ea5315076a999e66d855dd5c0a792c6e525710 not found: ID does not exist" Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.425938 4763 scope.go:117] "RemoveContainer" containerID="5b89c4336c5a5db159db1bfc975ff12da1e662e945f4752742e2b58548ee88ee" Dec 05 11:52:29 crc kubenswrapper[4763]: E1205 11:52:29.426539 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b89c4336c5a5db159db1bfc975ff12da1e662e945f4752742e2b58548ee88ee\": container with ID starting with 5b89c4336c5a5db159db1bfc975ff12da1e662e945f4752742e2b58548ee88ee not found: ID does not exist" containerID="5b89c4336c5a5db159db1bfc975ff12da1e662e945f4752742e2b58548ee88ee" Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.426575 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b89c4336c5a5db159db1bfc975ff12da1e662e945f4752742e2b58548ee88ee"} err="failed to get container status \"5b89c4336c5a5db159db1bfc975ff12da1e662e945f4752742e2b58548ee88ee\": rpc error: code = NotFound desc = could not find container \"5b89c4336c5a5db159db1bfc975ff12da1e662e945f4752742e2b58548ee88ee\": container with ID starting with 5b89c4336c5a5db159db1bfc975ff12da1e662e945f4752742e2b58548ee88ee not found: ID does not exist" Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.426594 4763 scope.go:117] "RemoveContainer" containerID="b191d235a113341aceb79780fe1cd69ab04853bac52a11afdab6f4bf55123a38" Dec 05 11:52:29 crc kubenswrapper[4763]: E1205 11:52:29.427658 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"b191d235a113341aceb79780fe1cd69ab04853bac52a11afdab6f4bf55123a38\": container with ID starting with b191d235a113341aceb79780fe1cd69ab04853bac52a11afdab6f4bf55123a38 not found: ID does not exist" containerID="b191d235a113341aceb79780fe1cd69ab04853bac52a11afdab6f4bf55123a38" Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.427678 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b191d235a113341aceb79780fe1cd69ab04853bac52a11afdab6f4bf55123a38"} err="failed to get container status \"b191d235a113341aceb79780fe1cd69ab04853bac52a11afdab6f4bf55123a38\": rpc error: code = NotFound desc = could not find container \"b191d235a113341aceb79780fe1cd69ab04853bac52a11afdab6f4bf55123a38\": container with ID starting with b191d235a113341aceb79780fe1cd69ab04853bac52a11afdab6f4bf55123a38 not found: ID does not exist" Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.427693 4763 scope.go:117] "RemoveContainer" containerID="d14b2ba44bf02679885b2501b0b3047180d75616a7a2b3731e58406c1ee3bfe1" Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.442051 4763 scope.go:117] "RemoveContainer" containerID="7530132cc0caf3b7238e3396b5fda6df54210400b5e8f4f064767d3167e4b3de" Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.448390 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwpq8\" (UniqueName: \"kubernetes.io/projected/db19121e-789d-4b3b-9e1d-cba90e11f918-kube-api-access-wwpq8\") on node \"crc\" DevicePath \"\"" Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.448416 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db19121e-789d-4b3b-9e1d-cba90e11f918-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.453886 4763 scope.go:117] "RemoveContainer" containerID="4e7766a3ae8df5050ed2a9ed9b388be4b5a102c2ca45b31fb88789da72401771" Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.558398 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db19121e-789d-4b3b-9e1d-cba90e11f918-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "db19121e-789d-4b3b-9e1d-cba90e11f918" (UID: "db19121e-789d-4b3b-9e1d-cba90e11f918"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.650081 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db19121e-789d-4b3b-9e1d-cba90e11f918-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.669231 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w42n2"] Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.672221 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w42n2"] Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.792877 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8212fb62-9829-4198-8833-0695b17d2a5d" path="/var/lib/kubelet/pods/8212fb62-9829-4198-8833-0695b17d2a5d/volumes" Dec 05 11:52:29 crc kubenswrapper[4763]: I1205 11:52:29.793717 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db19121e-789d-4b3b-9e1d-cba90e11f918" path="/var/lib/kubelet/pods/db19121e-789d-4b3b-9e1d-cba90e11f918/volumes" Dec 05 11:52:30 crc kubenswrapper[4763]: I1205 11:52:30.741004 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rtp7g" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.021936 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" podUID="b600b871-2ca7-4ca9-ab49-82a77bf73b6a" containerName="oauth-openshift" containerID="cri-o://57a1fda41b6aca3107599caac8502a683367d23210617ad68d83342fc7eddc4c" gracePeriod=15 Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.394482 4763 generic.go:334] "Generic (PLEG): container finished" podID="b600b871-2ca7-4ca9-ab49-82a77bf73b6a" containerID="57a1fda41b6aca3107599caac8502a683367d23210617ad68d83342fc7eddc4c" exitCode=0 Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.394560 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" event={"ID":"b600b871-2ca7-4ca9-ab49-82a77bf73b6a","Type":"ContainerDied","Data":"57a1fda41b6aca3107599caac8502a683367d23210617ad68d83342fc7eddc4c"} Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.394853 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" event={"ID":"b600b871-2ca7-4ca9-ab49-82a77bf73b6a","Type":"ContainerDied","Data":"67f5e0700c02ae864ed6b1e010ffa965306e1c989dd799d5c198cb667327ff85"} Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.394867 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67f5e0700c02ae864ed6b1e010ffa965306e1c989dd799d5c198cb667327ff85" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.401313 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.434746 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-66456c6bb-hcc9n"] Dec 05 11:52:38 crc kubenswrapper[4763]: E1205 11:52:38.434999 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb34ace9-e703-4f05-aa04-eebc1d97e096" containerName="extract-content" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.435016 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb34ace9-e703-4f05-aa04-eebc1d97e096" containerName="extract-content" Dec 05 11:52:38 crc kubenswrapper[4763]: E1205 11:52:38.435025 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb34ace9-e703-4f05-aa04-eebc1d97e096" containerName="registry-server" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.435052 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb34ace9-e703-4f05-aa04-eebc1d97e096" containerName="registry-server" Dec 05 11:52:38 crc kubenswrapper[4763]: E1205 11:52:38.435064 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb34ace9-e703-4f05-aa04-eebc1d97e096" containerName="extract-utilities" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.435072 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb34ace9-e703-4f05-aa04-eebc1d97e096" containerName="extract-utilities" Dec 05 11:52:38 crc kubenswrapper[4763]: E1205 11:52:38.435080 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d36cff6b-557a-4427-af48-e473b13bf117" containerName="extract-utilities" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.435085 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d36cff6b-557a-4427-af48-e473b13bf117" containerName="extract-utilities" Dec 05 11:52:38 crc kubenswrapper[4763]: E1205 11:52:38.435092 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="193ee10a-6394-4fb4-bffa-8155d1532b29" containerName="pruner" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.435098 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="193ee10a-6394-4fb4-bffa-8155d1532b29" containerName="pruner" Dec 05 11:52:38 crc kubenswrapper[4763]: E1205 11:52:38.435105 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d36cff6b-557a-4427-af48-e473b13bf117" containerName="registry-server" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.435111 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d36cff6b-557a-4427-af48-e473b13bf117" containerName="registry-server" Dec 05 11:52:38 crc kubenswrapper[4763]: E1205 11:52:38.435121 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db19121e-789d-4b3b-9e1d-cba90e11f918" containerName="extract-content" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.435127 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="db19121e-789d-4b3b-9e1d-cba90e11f918" containerName="extract-content" Dec 05 11:52:38 crc kubenswrapper[4763]: E1205 11:52:38.435136 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8212fb62-9829-4198-8833-0695b17d2a5d" containerName="extract-utilities" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.435142 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8212fb62-9829-4198-8833-0695b17d2a5d" containerName="extract-utilities" Dec 05 11:52:38 crc kubenswrapper[4763]: E1205 11:52:38.435153 4763 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="d36cff6b-557a-4427-af48-e473b13bf117" containerName="extract-content" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.435159 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d36cff6b-557a-4427-af48-e473b13bf117" containerName="extract-content" Dec 05 11:52:38 crc kubenswrapper[4763]: E1205 11:52:38.435166 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8212fb62-9829-4198-8833-0695b17d2a5d" containerName="registry-server" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.435172 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8212fb62-9829-4198-8833-0695b17d2a5d" containerName="registry-server" Dec 05 11:52:38 crc kubenswrapper[4763]: E1205 11:52:38.435180 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b600b871-2ca7-4ca9-ab49-82a77bf73b6a" containerName="oauth-openshift" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.435185 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b600b871-2ca7-4ca9-ab49-82a77bf73b6a" containerName="oauth-openshift" Dec 05 11:52:38 crc kubenswrapper[4763]: E1205 11:52:38.435192 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db19121e-789d-4b3b-9e1d-cba90e11f918" containerName="extract-utilities" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.435197 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="db19121e-789d-4b3b-9e1d-cba90e11f918" containerName="extract-utilities" Dec 05 11:52:38 crc kubenswrapper[4763]: E1205 11:52:38.435205 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8212fb62-9829-4198-8833-0695b17d2a5d" containerName="extract-content" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.435211 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8212fb62-9829-4198-8833-0695b17d2a5d" containerName="extract-content" Dec 05 11:52:38 crc kubenswrapper[4763]: E1205 11:52:38.435218 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db19121e-789d-4b3b-9e1d-cba90e11f918" containerName="registry-server" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.435224 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="db19121e-789d-4b3b-9e1d-cba90e11f918" containerName="registry-server" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.435309 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="193ee10a-6394-4fb4-bffa-8155d1532b29" containerName="pruner" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.435321 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b600b871-2ca7-4ca9-ab49-82a77bf73b6a" containerName="oauth-openshift" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.435330 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="db19121e-789d-4b3b-9e1d-cba90e11f918" containerName="registry-server" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.435338 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d36cff6b-557a-4427-af48-e473b13bf117" containerName="registry-server" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.435346 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb34ace9-e703-4f05-aa04-eebc1d97e096" containerName="registry-server" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.435355 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8212fb62-9829-4198-8833-0695b17d2a5d" containerName="registry-server" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.435688 4763 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.446809 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-66456c6bb-hcc9n"] Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.572142 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-service-ca\") pod \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.572465 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jg798\" (UniqueName: \"kubernetes.io/projected/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-kube-api-access-jg798\") pod \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.573082 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-ocp-branding-template\") pod \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.573142 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-user-template-error\") pod \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.573180 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-audit-policies\") pod \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.573200 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "b600b871-2ca7-4ca9-ab49-82a77bf73b6a" (UID: "b600b871-2ca7-4ca9-ab49-82a77bf73b6a"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.573232 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-user-template-provider-selection\") pod \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.573285 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-user-idp-0-file-data\") pod \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.573311 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-trusted-ca-bundle\") pod \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.573343 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-serving-cert\") pod \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.573374 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-audit-dir\") pod \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.573397 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-user-template-login\") pod \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.573418 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-cliconfig\") pod \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.573444 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-router-certs\") pod \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.573464 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-session\") pod \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\" (UID: \"b600b871-2ca7-4ca9-ab49-82a77bf73b6a\") " Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 
11:52:38.573609 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fe555113-6ec1-4ccb-a29c-7a46458bc380-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.573662 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fe555113-6ec1-4ccb-a29c-7a46458bc380-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.573687 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fe555113-6ec1-4ccb-a29c-7a46458bc380-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.573734 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fe555113-6ec1-4ccb-a29c-7a46458bc380-v4-0-config-system-session\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.573787 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fe555113-6ec1-4ccb-a29c-7a46458bc380-audit-dir\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.573818 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fe555113-6ec1-4ccb-a29c-7a46458bc380-v4-0-config-user-template-error\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.573853 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe555113-6ec1-4ccb-a29c-7a46458bc380-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.573876 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fe555113-6ec1-4ccb-a29c-7a46458bc380-v4-0-config-system-service-ca\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: 
\"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.573899 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fe555113-6ec1-4ccb-a29c-7a46458bc380-audit-policies\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.573925 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fe555113-6ec1-4ccb-a29c-7a46458bc380-v4-0-config-system-router-certs\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.573962 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fe555113-6ec1-4ccb-a29c-7a46458bc380-v4-0-config-user-template-login\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.573987 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe555113-6ec1-4ccb-a29c-7a46458bc380-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.574007 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h5cz\" (UniqueName: \"kubernetes.io/projected/fe555113-6ec1-4ccb-a29c-7a46458bc380-kube-api-access-7h5cz\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.574030 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fe555113-6ec1-4ccb-a29c-7a46458bc380-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.574031 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "b600b871-2ca7-4ca9-ab49-82a77bf73b6a" (UID: "b600b871-2ca7-4ca9-ab49-82a77bf73b6a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.574085 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.574187 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "b600b871-2ca7-4ca9-ab49-82a77bf73b6a" (UID: "b600b871-2ca7-4ca9-ab49-82a77bf73b6a"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.574690 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "b600b871-2ca7-4ca9-ab49-82a77bf73b6a" (UID: "b600b871-2ca7-4ca9-ab49-82a77bf73b6a"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.574891 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "b600b871-2ca7-4ca9-ab49-82a77bf73b6a" (UID: "b600b871-2ca7-4ca9-ab49-82a77bf73b6a"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.578611 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "b600b871-2ca7-4ca9-ab49-82a77bf73b6a" (UID: "b600b871-2ca7-4ca9-ab49-82a77bf73b6a"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.584990 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "b600b871-2ca7-4ca9-ab49-82a77bf73b6a" (UID: "b600b871-2ca7-4ca9-ab49-82a77bf73b6a"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.585307 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "b600b871-2ca7-4ca9-ab49-82a77bf73b6a" (UID: "b600b871-2ca7-4ca9-ab49-82a77bf73b6a"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.585404 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-kube-api-access-jg798" (OuterVolumeSpecName: "kube-api-access-jg798") pod "b600b871-2ca7-4ca9-ab49-82a77bf73b6a" (UID: "b600b871-2ca7-4ca9-ab49-82a77bf73b6a"). InnerVolumeSpecName "kube-api-access-jg798". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.585855 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "b600b871-2ca7-4ca9-ab49-82a77bf73b6a" (UID: "b600b871-2ca7-4ca9-ab49-82a77bf73b6a"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.586297 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "b600b871-2ca7-4ca9-ab49-82a77bf73b6a" (UID: "b600b871-2ca7-4ca9-ab49-82a77bf73b6a"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.586488 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "b600b871-2ca7-4ca9-ab49-82a77bf73b6a" (UID: "b600b871-2ca7-4ca9-ab49-82a77bf73b6a"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.586519 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "b600b871-2ca7-4ca9-ab49-82a77bf73b6a" (UID: "b600b871-2ca7-4ca9-ab49-82a77bf73b6a"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.586616 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "b600b871-2ca7-4ca9-ab49-82a77bf73b6a" (UID: "b600b871-2ca7-4ca9-ab49-82a77bf73b6a"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.675525 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fe555113-6ec1-4ccb-a29c-7a46458bc380-v4-0-config-user-template-login\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.675585 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe555113-6ec1-4ccb-a29c-7a46458bc380-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.675609 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h5cz\" (UniqueName: \"kubernetes.io/projected/fe555113-6ec1-4ccb-a29c-7a46458bc380-kube-api-access-7h5cz\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.675626 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fe555113-6ec1-4ccb-a29c-7a46458bc380-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.675654 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fe555113-6ec1-4ccb-a29c-7a46458bc380-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.675690 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fe555113-6ec1-4ccb-a29c-7a46458bc380-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.675714 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fe555113-6ec1-4ccb-a29c-7a46458bc380-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.675755 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fe555113-6ec1-4ccb-a29c-7a46458bc380-v4-0-config-system-session\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " 
pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.675820 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fe555113-6ec1-4ccb-a29c-7a46458bc380-audit-dir\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.676355 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fe555113-6ec1-4ccb-a29c-7a46458bc380-v4-0-config-user-template-error\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.676360 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fe555113-6ec1-4ccb-a29c-7a46458bc380-audit-dir\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.676397 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe555113-6ec1-4ccb-a29c-7a46458bc380-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.676424 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fe555113-6ec1-4ccb-a29c-7a46458bc380-v4-0-config-system-service-ca\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.676452 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fe555113-6ec1-4ccb-a29c-7a46458bc380-audit-policies\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.676479 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fe555113-6ec1-4ccb-a29c-7a46458bc380-v4-0-config-system-router-certs\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.676550 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jg798\" (UniqueName: \"kubernetes.io/projected/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-kube-api-access-jg798\") on node \"crc\" DevicePath \"\"" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.676567 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.676581 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.676593 4763 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.676607 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.676619 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.676631 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.676645 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.676658 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.676669 4763 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.676679 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.676691 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.676704 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b600b871-2ca7-4ca9-ab49-82a77bf73b6a-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.677099 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe555113-6ec1-4ccb-a29c-7a46458bc380-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.677851 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fe555113-6ec1-4ccb-a29c-7a46458bc380-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.678237 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fe555113-6ec1-4ccb-a29c-7a46458bc380-audit-policies\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.678752 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fe555113-6ec1-4ccb-a29c-7a46458bc380-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.680230 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fe555113-6ec1-4ccb-a29c-7a46458bc380-v4-0-config-system-service-ca\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.680456 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fe555113-6ec1-4ccb-a29c-7a46458bc380-v4-0-config-user-template-login\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.681177 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fe555113-6ec1-4ccb-a29c-7a46458bc380-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.680962 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fe555113-6ec1-4ccb-a29c-7a46458bc380-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.681556 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/fe555113-6ec1-4ccb-a29c-7a46458bc380-v4-0-config-user-template-error\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.681658 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fe555113-6ec1-4ccb-a29c-7a46458bc380-v4-0-config-system-router-certs\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.681712 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fe555113-6ec1-4ccb-a29c-7a46458bc380-v4-0-config-system-session\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.683555 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe555113-6ec1-4ccb-a29c-7a46458bc380-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.692482 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h5cz\" (UniqueName: \"kubernetes.io/projected/fe555113-6ec1-4ccb-a29c-7a46458bc380-kube-api-access-7h5cz\") pod \"oauth-openshift-66456c6bb-hcc9n\" (UID: \"fe555113-6ec1-4ccb-a29c-7a46458bc380\") " pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.755085 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.975956 4763 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.977997 4763 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.978214 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.978426 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4" gracePeriod=15 Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.978501 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4" gracePeriod=15 Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.978667 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c" gracePeriod=15 Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.978747 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7" gracePeriod=15 Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.978817 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3" gracePeriod=15 Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.979621 4763 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 11:52:38 crc kubenswrapper[4763]: E1205 11:52:38.979862 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.979880 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 05 11:52:38 crc kubenswrapper[4763]: E1205 11:52:38.979895 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.979903 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 05 11:52:38 crc kubenswrapper[4763]: E1205 11:52:38.979918 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.979926 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 05 11:52:38 crc kubenswrapper[4763]: E1205 11:52:38.979944 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-regeneration-controller" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.979952 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 05 11:52:38 crc kubenswrapper[4763]: E1205 11:52:38.979967 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.979975 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 11:52:38 crc kubenswrapper[4763]: E1205 11:52:38.979986 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.979994 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 11:52:38 crc kubenswrapper[4763]: E1205 11:52:38.980012 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.980019 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.980141 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.980162 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.980177 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.980187 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.980195 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 11:52:38 crc kubenswrapper[4763]: I1205 11:52:38.980203 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.013955 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-66456c6bb-hcc9n"] Dec 05 11:52:39 crc kubenswrapper[4763]: E1205 11:52:39.045558 4763 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.136:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 11:52:39 crc kubenswrapper[4763]: E1205 11:52:39.049454 4763 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/events\": dial tcp 
38.102.83.136:6443: connect: connection refused" event="&Event{ObjectMeta:{oauth-openshift-66456c6bb-hcc9n.187e4f8a388a681e openshift-authentication 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-authentication,Name:oauth-openshift-66456c6bb-hcc9n,UID:fe555113-6ec1-4ccb-a29c-7a46458bc380,APIVersion:v1,ResourceVersion:29338,FieldPath:spec.containers{oauth-openshift},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 11:52:39.04848899 +0000 UTC m=+243.541203713,LastTimestamp:2025-12-05 11:52:39.04848899 +0000 UTC m=+243.541203713,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.084418 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.084547 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.084625 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.084701 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.084842 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.084925 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.084955 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.085099 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.187904 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.188503 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.188042 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.188547 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.188627 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.188655 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.188694 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.188622 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.188700 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.188791 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.189128 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.189207 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.189256 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.189311 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.189380 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.189419 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.346844 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 11:52:39 crc kubenswrapper[4763]: W1205 11:52:39.375814 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-5812e768205d51cd0aec99ec6ce4c1d9ec1e80456db6c5ac21016f2d37f772b8 WatchSource:0}: Error finding container 5812e768205d51cd0aec99ec6ce4c1d9ec1e80456db6c5ac21016f2d37f772b8: Status 404 returned error can't find the container with id 5812e768205d51cd0aec99ec6ce4c1d9ec1e80456db6c5ac21016f2d37f772b8 Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.403735 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"5812e768205d51cd0aec99ec6ce4c1d9ec1e80456db6c5ac21016f2d37f772b8"} Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.406494 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.409545 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.428294 4763 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4" exitCode=0 Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.429407 4763 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c" exitCode=0 Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.429485 4763 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7" exitCode=0 Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.429551 4763 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3" exitCode=2 Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.428442 4763 scope.go:117] "RemoveContainer" containerID="3b8b998e26c6fad2befac974b1f355b51781e8a64549431d3182e5b7f744c6dd" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.438971 4763 generic.go:334] "Generic (PLEG): container finished" podID="d19b7aa7-7fea-483f-93ba-2278dd8c9ee8" containerID="d20b5574026bfbb2937a03b75e6040961b5a51407446636183509686abd8adcf" exitCode=0 Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.439048 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d19b7aa7-7fea-483f-93ba-2278dd8c9ee8","Type":"ContainerDied","Data":"d20b5574026bfbb2937a03b75e6040961b5a51407446636183509686abd8adcf"} Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.440728 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.440814 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" event={"ID":"fe555113-6ec1-4ccb-a29c-7a46458bc380","Type":"ContainerStarted","Data":"0fe425f369ee4a292103b16f42dec404c10d96e46b92e762e821ca01e5df2c75"} Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.440856 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" event={"ID":"fe555113-6ec1-4ccb-a29c-7a46458bc380","Type":"ContainerStarted","Data":"6c169fc0f2e6044162e4376d93ba8c27ffbb03791675bc7b97bb246e3919ca69"} Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.441565 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.442892 4763 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.443171 4763 status_manager.go:851] "Failed to get status for pod" podUID="d19b7aa7-7fea-483f-93ba-2278dd8c9ee8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.443482 4763 status_manager.go:851] "Failed to get status for pod" podUID="d19b7aa7-7fea-483f-93ba-2278dd8c9ee8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.443791 4763 status_manager.go:851] "Failed to get status for pod" podUID="b600b871-2ca7-4ca9-ab49-82a77bf73b6a" pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-wndr8\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.444081 4763 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.444319 4763 status_manager.go:851] "Failed to get status for pod" podUID="fe555113-6ec1-4ccb-a29c-7a46458bc380" pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-66456c6bb-hcc9n\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.458200 4763 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.458469 4763 status_manager.go:851] "Failed to get status for pod" podUID="fe555113-6ec1-4ccb-a29c-7a46458bc380" pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-66456c6bb-hcc9n\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.458710 4763 status_manager.go:851] "Failed to get status for pod" podUID="d19b7aa7-7fea-483f-93ba-2278dd8c9ee8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.458961 4763 status_manager.go:851] "Failed to get status for pod" podUID="b600b871-2ca7-4ca9-ab49-82a77bf73b6a" pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-wndr8\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.963802 4763 patch_prober.go:28] interesting pod/oauth-openshift-66456c6bb-hcc9n container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.56:6443/healthz\": read tcp 10.217.0.2:50522->10.217.0.56:6443: read: connection reset by peer" start-of-body= Dec 05 11:52:39 crc kubenswrapper[4763]: I1205 11:52:39.963898 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" podUID="fe555113-6ec1-4ccb-a29c-7a46458bc380" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.56:6443/healthz\": read tcp 10.217.0.2:50522->10.217.0.56:6443: read: connection reset by peer" Dec 05 11:52:40 crc kubenswrapper[4763]: I1205 11:52:40.450329 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-66456c6bb-hcc9n_fe555113-6ec1-4ccb-a29c-7a46458bc380/oauth-openshift/0.log" Dec 05 11:52:40 crc kubenswrapper[4763]: I1205 11:52:40.450614 4763 generic.go:334] "Generic (PLEG): container finished" podID="fe555113-6ec1-4ccb-a29c-7a46458bc380" containerID="0fe425f369ee4a292103b16f42dec404c10d96e46b92e762e821ca01e5df2c75" exitCode=255 Dec 05 11:52:40 crc kubenswrapper[4763]: I1205 11:52:40.450677 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" event={"ID":"fe555113-6ec1-4ccb-a29c-7a46458bc380","Type":"ContainerDied","Data":"0fe425f369ee4a292103b16f42dec404c10d96e46b92e762e821ca01e5df2c75"} Dec 05 11:52:40 crc kubenswrapper[4763]: I1205 11:52:40.451371 4763 scope.go:117] "RemoveContainer" containerID="0fe425f369ee4a292103b16f42dec404c10d96e46b92e762e821ca01e5df2c75" Dec 05 11:52:40 crc kubenswrapper[4763]: I1205 11:52:40.451946 4763 status_manager.go:851] "Failed to get status for pod" podUID="d19b7aa7-7fea-483f-93ba-2278dd8c9ee8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" 
Dec 05 11:52:40 crc kubenswrapper[4763]: I1205 11:52:40.452561 4763 status_manager.go:851] "Failed to get status for pod" podUID="b600b871-2ca7-4ca9-ab49-82a77bf73b6a" pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-wndr8\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:40 crc kubenswrapper[4763]: I1205 11:52:40.452926 4763 status_manager.go:851] "Failed to get status for pod" podUID="fe555113-6ec1-4ccb-a29c-7a46458bc380" pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-66456c6bb-hcc9n\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:40 crc kubenswrapper[4763]: I1205 11:52:40.454250 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"94fbc399237786f767b20ed3d30f2b2451e4c0992b5dee4baea970eab6822fb5"} Dec 05 11:52:40 crc kubenswrapper[4763]: I1205 11:52:40.454801 4763 status_manager.go:851] "Failed to get status for pod" podUID="b600b871-2ca7-4ca9-ab49-82a77bf73b6a" pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-wndr8\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:40 crc kubenswrapper[4763]: E1205 11:52:40.454830 4763 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.136:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 11:52:40 crc kubenswrapper[4763]: I1205 11:52:40.455018 4763 status_manager.go:851] "Failed to get status for pod" podUID="fe555113-6ec1-4ccb-a29c-7a46458bc380" pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-66456c6bb-hcc9n\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:40 crc kubenswrapper[4763]: I1205 11:52:40.455299 4763 status_manager.go:851] "Failed to get status for pod" podUID="d19b7aa7-7fea-483f-93ba-2278dd8c9ee8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:40 crc kubenswrapper[4763]: I1205 11:52:40.458371 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 11:52:40 crc kubenswrapper[4763]: I1205 11:52:40.727616 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 11:52:40 crc kubenswrapper[4763]: I1205 11:52:40.728568 4763 status_manager.go:851] "Failed to get status for pod" podUID="fe555113-6ec1-4ccb-a29c-7a46458bc380" pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-66456c6bb-hcc9n\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:40 crc kubenswrapper[4763]: I1205 11:52:40.728900 4763 status_manager.go:851] "Failed to get status for pod" podUID="d19b7aa7-7fea-483f-93ba-2278dd8c9ee8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:40 crc kubenswrapper[4763]: I1205 11:52:40.729270 4763 status_manager.go:851] "Failed to get status for pod" podUID="b600b871-2ca7-4ca9-ab49-82a77bf73b6a" pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-wndr8\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:40 crc kubenswrapper[4763]: I1205 11:52:40.815030 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d19b7aa7-7fea-483f-93ba-2278dd8c9ee8-kubelet-dir\") pod \"d19b7aa7-7fea-483f-93ba-2278dd8c9ee8\" (UID: \"d19b7aa7-7fea-483f-93ba-2278dd8c9ee8\") " Dec 05 11:52:40 crc kubenswrapper[4763]: I1205 11:52:40.815141 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d19b7aa7-7fea-483f-93ba-2278dd8c9ee8-var-lock\") pod \"d19b7aa7-7fea-483f-93ba-2278dd8c9ee8\" (UID: \"d19b7aa7-7fea-483f-93ba-2278dd8c9ee8\") " Dec 05 11:52:40 crc kubenswrapper[4763]: I1205 11:52:40.815165 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d19b7aa7-7fea-483f-93ba-2278dd8c9ee8-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d19b7aa7-7fea-483f-93ba-2278dd8c9ee8" (UID: "d19b7aa7-7fea-483f-93ba-2278dd8c9ee8"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 11:52:40 crc kubenswrapper[4763]: I1205 11:52:40.815211 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d19b7aa7-7fea-483f-93ba-2278dd8c9ee8-kube-api-access\") pod \"d19b7aa7-7fea-483f-93ba-2278dd8c9ee8\" (UID: \"d19b7aa7-7fea-483f-93ba-2278dd8c9ee8\") " Dec 05 11:52:40 crc kubenswrapper[4763]: I1205 11:52:40.815183 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d19b7aa7-7fea-483f-93ba-2278dd8c9ee8-var-lock" (OuterVolumeSpecName: "var-lock") pod "d19b7aa7-7fea-483f-93ba-2278dd8c9ee8" (UID: "d19b7aa7-7fea-483f-93ba-2278dd8c9ee8"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 11:52:40 crc kubenswrapper[4763]: I1205 11:52:40.815418 4763 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d19b7aa7-7fea-483f-93ba-2278dd8c9ee8-var-lock\") on node \"crc\" DevicePath \"\"" Dec 05 11:52:40 crc kubenswrapper[4763]: I1205 11:52:40.815435 4763 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d19b7aa7-7fea-483f-93ba-2278dd8c9ee8-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 11:52:40 crc kubenswrapper[4763]: I1205 11:52:40.821894 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d19b7aa7-7fea-483f-93ba-2278dd8c9ee8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d19b7aa7-7fea-483f-93ba-2278dd8c9ee8" (UID: "d19b7aa7-7fea-483f-93ba-2278dd8c9ee8"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:52:40 crc kubenswrapper[4763]: I1205 11:52:40.916354 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d19b7aa7-7fea-483f-93ba-2278dd8c9ee8-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.345113 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.346493 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.347124 4763 status_manager.go:851] "Failed to get status for pod" podUID="b600b871-2ca7-4ca9-ab49-82a77bf73b6a" pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-wndr8\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.347808 4763 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.348209 4763 status_manager.go:851] "Failed to get status for pod" podUID="fe555113-6ec1-4ccb-a29c-7a46458bc380" pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-66456c6bb-hcc9n\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.348569 4763 status_manager.go:851] "Failed to get status for pod" podUID="d19b7aa7-7fea-483f-93ba-2278dd8c9ee8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.467361 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.468115 4763 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4" exitCode=0 Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.468226 4763 scope.go:117] "RemoveContainer" containerID="dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.468235 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.470368 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d19b7aa7-7fea-483f-93ba-2278dd8c9ee8","Type":"ContainerDied","Data":"2458a502e0317d9c2ed927f2a2644211b12ba5c40132072aa3ed38989d91d6c4"} Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.470444 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2458a502e0317d9c2ed927f2a2644211b12ba5c40132072aa3ed38989d91d6c4" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.470399 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.471928 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-66456c6bb-hcc9n_fe555113-6ec1-4ccb-a29c-7a46458bc380/oauth-openshift/1.log" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.472371 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-66456c6bb-hcc9n_fe555113-6ec1-4ccb-a29c-7a46458bc380/oauth-openshift/0.log" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.472406 4763 generic.go:334] "Generic (PLEG): container finished" podID="fe555113-6ec1-4ccb-a29c-7a46458bc380" containerID="edf23fa1645c796ea90c151ac3c11513005a1f20a2b3909cbc5f0f399f6ca547" exitCode=255 Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.473679 4763 scope.go:117] "RemoveContainer" containerID="edf23fa1645c796ea90c151ac3c11513005a1f20a2b3909cbc5f0f399f6ca547" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.473981 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" event={"ID":"fe555113-6ec1-4ccb-a29c-7a46458bc380","Type":"ContainerDied","Data":"edf23fa1645c796ea90c151ac3c11513005a1f20a2b3909cbc5f0f399f6ca547"} Dec 05 11:52:41 crc kubenswrapper[4763]: E1205 11:52:41.473994 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-66456c6bb-hcc9n_openshift-authentication(fe555113-6ec1-4ccb-a29c-7a46458bc380)\"" pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" podUID="fe555113-6ec1-4ccb-a29c-7a46458bc380" Dec 05 11:52:41 crc kubenswrapper[4763]: E1205 11:52:41.474083 4763 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.136:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 
11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.474461 4763 status_manager.go:851] "Failed to get status for pod" podUID="fe555113-6ec1-4ccb-a29c-7a46458bc380" pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-66456c6bb-hcc9n\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.474817 4763 status_manager.go:851] "Failed to get status for pod" podUID="d19b7aa7-7fea-483f-93ba-2278dd8c9ee8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.475187 4763 status_manager.go:851] "Failed to get status for pod" podUID="b600b871-2ca7-4ca9-ab49-82a77bf73b6a" pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-wndr8\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.475442 4763 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.487001 4763 status_manager.go:851] "Failed to get status for pod" podUID="d19b7aa7-7fea-483f-93ba-2278dd8c9ee8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.487309 4763 status_manager.go:851] "Failed to get status for pod" podUID="b600b871-2ca7-4ca9-ab49-82a77bf73b6a" pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-wndr8\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.487671 4763 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.487885 4763 status_manager.go:851] "Failed to get status for pod" podUID="fe555113-6ec1-4ccb-a29c-7a46458bc380" pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-66456c6bb-hcc9n\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.493724 4763 scope.go:117] "RemoveContainer" containerID="0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.505467 4763 scope.go:117] "RemoveContainer" containerID="c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7" 
Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.521054 4763 scope.go:117] "RemoveContainer" containerID="4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.525165 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.525205 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.525306 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.525293 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.525325 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.525349 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.525571 4763 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.525589 4763 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.525600 4763 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.534239 4763 scope.go:117] "RemoveContainer" containerID="ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.549417 4763 scope.go:117] "RemoveContainer" containerID="7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.563626 4763 scope.go:117] "RemoveContainer" containerID="dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4" Dec 05 11:52:41 crc kubenswrapper[4763]: E1205 11:52:41.564063 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4\": container with ID starting with dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4 not found: ID does not exist" containerID="dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.564116 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4"} err="failed to get container status \"dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4\": rpc error: code = NotFound desc = could not find container \"dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4\": container with ID starting with dd8e1ff33f1454598ac75df24a5aa349aa5cc3bcdb144c4fcc1d06d3f71344d4 not found: ID does not exist" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.564148 4763 scope.go:117] "RemoveContainer" containerID="0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c" Dec 05 11:52:41 crc kubenswrapper[4763]: E1205 11:52:41.564459 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\": container with ID starting with 0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c not found: ID does not exist" containerID="0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.564496 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c"} err="failed to get container status \"0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\": rpc error: code = NotFound desc = could not find container \"0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c\": container with ID starting 
with 0da535b302151bac5f7cbc614fe2b2ab4d5dcfaad63311d2d42f5f16b84c895c not found: ID does not exist" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.564523 4763 scope.go:117] "RemoveContainer" containerID="c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7" Dec 05 11:52:41 crc kubenswrapper[4763]: E1205 11:52:41.564817 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\": container with ID starting with c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7 not found: ID does not exist" containerID="c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.564853 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7"} err="failed to get container status \"c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\": rpc error: code = NotFound desc = could not find container \"c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7\": container with ID starting with c72abf1b628e3940b23fd0f68b012f674a5ee4d46c8ecb338d11f2ae6c2228d7 not found: ID does not exist" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.564874 4763 scope.go:117] "RemoveContainer" containerID="4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3" Dec 05 11:52:41 crc kubenswrapper[4763]: E1205 11:52:41.565162 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\": container with ID starting with 4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3 not found: ID does not exist" containerID="4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.565191 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3"} err="failed to get container status \"4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\": rpc error: code = NotFound desc = could not find container \"4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3\": container with ID starting with 4ae90365e915e6563135a0463867ada06763bfe73821860248b83cae6c3591e3 not found: ID does not exist" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.565207 4763 scope.go:117] "RemoveContainer" containerID="ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4" Dec 05 11:52:41 crc kubenswrapper[4763]: E1205 11:52:41.565457 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\": container with ID starting with ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4 not found: ID does not exist" containerID="ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.565486 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4"} err="failed to get container status \"ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\": 
rpc error: code = NotFound desc = could not find container \"ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4\": container with ID starting with ac1f9ede2cc2cde36d62ea4b0701d8a27c8f716efa652674b8cb2774872f53c4 not found: ID does not exist" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.565502 4763 scope.go:117] "RemoveContainer" containerID="7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec" Dec 05 11:52:41 crc kubenswrapper[4763]: E1205 11:52:41.565735 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\": container with ID starting with 7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec not found: ID does not exist" containerID="7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.565786 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec"} err="failed to get container status \"7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\": rpc error: code = NotFound desc = could not find container \"7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec\": container with ID starting with 7b25648bdc0c88da6b547a986a451b705ed3b1ed91da37b45f669728a082caec not found: ID does not exist" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.565807 4763 scope.go:117] "RemoveContainer" containerID="0fe425f369ee4a292103b16f42dec404c10d96e46b92e762e821ca01e5df2c75" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.784979 4763 status_manager.go:851] "Failed to get status for pod" podUID="fe555113-6ec1-4ccb-a29c-7a46458bc380" pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-66456c6bb-hcc9n\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.785369 4763 status_manager.go:851] "Failed to get status for pod" podUID="d19b7aa7-7fea-483f-93ba-2278dd8c9ee8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.786068 4763 status_manager.go:851] "Failed to get status for pod" podUID="b600b871-2ca7-4ca9-ab49-82a77bf73b6a" pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-wndr8\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.786717 4763 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:41 crc kubenswrapper[4763]: I1205 11:52:41.792903 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 05 11:52:41 crc 
kubenswrapper[4763]: E1205 11:52:41.937338 4763 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/events\": dial tcp 38.102.83.136:6443: connect: connection refused" event="&Event{ObjectMeta:{oauth-openshift-66456c6bb-hcc9n.187e4f8a388a681e openshift-authentication 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-authentication,Name:oauth-openshift-66456c6bb-hcc9n,UID:fe555113-6ec1-4ccb-a29c-7a46458bc380,APIVersion:v1,ResourceVersion:29338,FieldPath:spec.containers{oauth-openshift},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 11:52:39.04848899 +0000 UTC m=+243.541203713,LastTimestamp:2025-12-05 11:52:39.04848899 +0000 UTC m=+243.541203713,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 05 11:52:42 crc kubenswrapper[4763]: E1205 11:52:42.024681 4763 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:42 crc kubenswrapper[4763]: E1205 11:52:42.025581 4763 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:42 crc kubenswrapper[4763]: E1205 11:52:42.026334 4763 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:42 crc kubenswrapper[4763]: E1205 11:52:42.027503 4763 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:42 crc kubenswrapper[4763]: E1205 11:52:42.027899 4763 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:42 crc kubenswrapper[4763]: I1205 11:52:42.027954 4763 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 05 11:52:42 crc kubenswrapper[4763]: E1205 11:52:42.028278 4763 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="200ms" Dec 05 11:52:42 crc kubenswrapper[4763]: E1205 11:52:42.229064 4763 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="400ms" Dec 05 11:52:42 crc kubenswrapper[4763]: 
I1205 11:52:42.486856 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-66456c6bb-hcc9n_fe555113-6ec1-4ccb-a29c-7a46458bc380/oauth-openshift/1.log" Dec 05 11:52:42 crc kubenswrapper[4763]: I1205 11:52:42.487602 4763 scope.go:117] "RemoveContainer" containerID="edf23fa1645c796ea90c151ac3c11513005a1f20a2b3909cbc5f0f399f6ca547" Dec 05 11:52:42 crc kubenswrapper[4763]: I1205 11:52:42.487667 4763 status_manager.go:851] "Failed to get status for pod" podUID="b600b871-2ca7-4ca9-ab49-82a77bf73b6a" pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-wndr8\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:42 crc kubenswrapper[4763]: I1205 11:52:42.487894 4763 status_manager.go:851] "Failed to get status for pod" podUID="fe555113-6ec1-4ccb-a29c-7a46458bc380" pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-66456c6bb-hcc9n\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:42 crc kubenswrapper[4763]: I1205 11:52:42.488217 4763 status_manager.go:851] "Failed to get status for pod" podUID="d19b7aa7-7fea-483f-93ba-2278dd8c9ee8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:42 crc kubenswrapper[4763]: E1205 11:52:42.488958 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-66456c6bb-hcc9n_openshift-authentication(fe555113-6ec1-4ccb-a29c-7a46458bc380)\"" pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" podUID="fe555113-6ec1-4ccb-a29c-7a46458bc380" Dec 05 11:52:42 crc kubenswrapper[4763]: E1205 11:52:42.631043 4763 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="800ms" Dec 05 11:52:43 crc kubenswrapper[4763]: E1205 11:52:43.381344 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:52:43Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:52:43Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:52:43Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T11:52:43Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch 
\"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:43 crc kubenswrapper[4763]: E1205 11:52:43.381802 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:43 crc kubenswrapper[4763]: E1205 11:52:43.382135 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:43 crc kubenswrapper[4763]: E1205 11:52:43.382398 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:43 crc kubenswrapper[4763]: E1205 11:52:43.382679 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:43 crc kubenswrapper[4763]: E1205 11:52:43.382706 4763 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 11:52:43 crc kubenswrapper[4763]: E1205 11:52:43.432645 4763 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="1.6s" Dec 05 11:52:45 crc kubenswrapper[4763]: E1205 11:52:45.033973 4763 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="3.2s" Dec 05 11:52:45 crc kubenswrapper[4763]: I1205 11:52:45.787120 4763 status_manager.go:851] "Failed to get status for pod" podUID="fe555113-6ec1-4ccb-a29c-7a46458bc380" pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-66456c6bb-hcc9n\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:45 crc kubenswrapper[4763]: I1205 11:52:45.787514 4763 status_manager.go:851] "Failed to get status for pod" podUID="d19b7aa7-7fea-483f-93ba-2278dd8c9ee8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:45 crc kubenswrapper[4763]: I1205 11:52:45.788405 4763 status_manager.go:851] "Failed to get status for pod" podUID="b600b871-2ca7-4ca9-ab49-82a77bf73b6a" pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-wndr8\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:48 crc kubenswrapper[4763]: E1205 11:52:48.234930 4763 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="6.4s" Dec 05 11:52:48 crc kubenswrapper[4763]: I1205 11:52:48.755876 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:48 crc kubenswrapper[4763]: I1205 11:52:48.755943 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:52:48 crc kubenswrapper[4763]: I1205 11:52:48.756722 4763 scope.go:117] "RemoveContainer" containerID="edf23fa1645c796ea90c151ac3c11513005a1f20a2b3909cbc5f0f399f6ca547" Dec 05 11:52:48 crc kubenswrapper[4763]: E1205 11:52:48.757054 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-66456c6bb-hcc9n_openshift-authentication(fe555113-6ec1-4ccb-a29c-7a46458bc380)\"" pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" podUID="fe555113-6ec1-4ccb-a29c-7a46458bc380" Dec 05 11:52:48 crc kubenswrapper[4763]: E1205 11:52:48.837264 4763 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.136:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" volumeName="registry-storage" Dec 05 11:52:50 crc kubenswrapper[4763]: I1205 11:52:50.783676 4763 util.go:30] "No sandbox for pod can be found. 
Dec 05 11:52:50 crc kubenswrapper[4763]: I1205 11:52:50.783676 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 11:52:50 crc kubenswrapper[4763]: I1205 11:52:50.784745 4763 status_manager.go:851] "Failed to get status for pod" podUID="d19b7aa7-7fea-483f-93ba-2278dd8c9ee8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:50 crc kubenswrapper[4763]: I1205 11:52:50.785230 4763 status_manager.go:851] "Failed to get status for pod" podUID="b600b871-2ca7-4ca9-ab49-82a77bf73b6a" pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-wndr8\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:50 crc kubenswrapper[4763]: I1205 11:52:50.785726 4763 status_manager.go:851] "Failed to get status for pod" podUID="fe555113-6ec1-4ccb-a29c-7a46458bc380" pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-66456c6bb-hcc9n\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:50 crc kubenswrapper[4763]: I1205 11:52:50.797564 4763 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7" Dec 05 11:52:50 crc kubenswrapper[4763]: I1205 11:52:50.797598 4763 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7" Dec 05 11:52:50 crc kubenswrapper[4763]: E1205 11:52:50.797974 4763 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 11:52:50 crc kubenswrapper[4763]: I1205 11:52:50.798435 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 11:52:51 crc kubenswrapper[4763]: I1205 11:52:51.535936 4763 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="49e0eb721e5230898ec1729f6de7f345f7f39068c9dd3bf8e982d4a56cc762fe" exitCode=0 Dec 05 11:52:51 crc kubenswrapper[4763]: I1205 11:52:51.536143 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"49e0eb721e5230898ec1729f6de7f345f7f39068c9dd3bf8e982d4a56cc762fe"} Dec 05 11:52:51 crc kubenswrapper[4763]: I1205 11:52:51.536314 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5d6732f0319e62d651afc95643b94faf302b528361972520861d9fd24352e746"} Dec 05 11:52:51 crc kubenswrapper[4763]: I1205 11:52:51.536600 4763 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7" Dec 05 11:52:51 crc kubenswrapper[4763]: I1205 11:52:51.536618 4763 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7" Dec 05 11:52:51 crc kubenswrapper[4763]: E1205 11:52:51.537143 4763 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 11:52:51 crc kubenswrapper[4763]: I1205 11:52:51.537546 4763 status_manager.go:851] "Failed to get status for pod" podUID="b600b871-2ca7-4ca9-ab49-82a77bf73b6a" pod="openshift-authentication/oauth-openshift-558db77b4-wndr8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-wndr8\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:51 crc kubenswrapper[4763]: I1205 11:52:51.537944 4763 status_manager.go:851] "Failed to get status for pod" podUID="fe555113-6ec1-4ccb-a29c-7a46458bc380" pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-66456c6bb-hcc9n\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:51 crc kubenswrapper[4763]: I1205 11:52:51.538575 4763 status_manager.go:851] "Failed to get status for pod" podUID="d19b7aa7-7fea-483f-93ba-2278dd8c9ee8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Dec 05 11:52:52 crc kubenswrapper[4763]: I1205 11:52:52.545224 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d9f8065941d14694bd3d00b72cd5df3ffa33c2b8b68c673dc8014dc1cc287df2"} Dec 05 11:52:52 crc kubenswrapper[4763]: I1205 11:52:52.545265 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f393d8a6fbb654a18dc02061d3f21d11d691c0405dc2d5c5d2c378685879ea00"} Dec 05 11:52:52 crc kubenswrapper[4763]: I1205 11:52:52.545275 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"af17167db305ed1e07670cb09d0038ec109586440d0e17f60130dfcafe770717"} Dec 05 11:52:52 crc kubenswrapper[4763]: I1205 11:52:52.545284 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a12d86369103492b3e10c7addaec9895c296ec2ee570ac9abdd8ad35c981778b"} Dec 05 11:52:53 crc kubenswrapper[4763]: I1205 11:52:53.555792 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 05 11:52:53 crc kubenswrapper[4763]: I1205 11:52:53.555860 4763 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="0370cc7bfd0ce95636924667b4e82dfffd0eee7f07b78d683b386bfee5dc47b3" exitCode=1 Dec 05 11:52:53 crc kubenswrapper[4763]: I1205 11:52:53.556006 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"0370cc7bfd0ce95636924667b4e82dfffd0eee7f07b78d683b386bfee5dc47b3"} Dec 05 11:52:53 crc kubenswrapper[4763]: I1205 11:52:53.556670 4763 scope.go:117] "RemoveContainer" containerID="0370cc7bfd0ce95636924667b4e82dfffd0eee7f07b78d683b386bfee5dc47b3" Dec 05 11:52:53 crc kubenswrapper[4763]: I1205 11:52:53.561405 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"19e683a75f2e2671fa7596b6971d3873a42f469af98ed948a5cf76a0e35d0d30"} Dec 05 11:52:53 crc kubenswrapper[4763]: I1205 11:52:53.561586 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 11:52:53 crc kubenswrapper[4763]: I1205 11:52:53.561734 4763 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7" Dec 05 11:52:53 crc kubenswrapper[4763]: I1205 11:52:53.561786 4763 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7" Dec 05 11:52:54 crc kubenswrapper[4763]: I1205 11:52:54.569894 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 05 11:52:54 crc kubenswrapper[4763]: I1205 11:52:54.570203 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1665918a7177c8832e3ec85a5953ae68224eedb3d15205a90dcb78794078c366"} Dec 05 11:52:55 crc kubenswrapper[4763]: I1205 11:52:55.799329 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 11:52:55 crc kubenswrapper[4763]: I1205 11:52:55.799369 4763 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 11:52:55 crc kubenswrapper[4763]: I1205 11:52:55.810323 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 11:52:57 crc kubenswrapper[4763]: I1205 11:52:57.037616 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 11:52:58 crc kubenswrapper[4763]: I1205 11:52:58.580825 4763 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 11:52:58 crc kubenswrapper[4763]: I1205 11:52:58.672431 4763 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="8e8a30ec-98f6-4120-8460-df22a8df42c5" Dec 05 11:52:59 crc kubenswrapper[4763]: I1205 11:52:59.601976 4763 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7" Dec 05 11:52:59 crc kubenswrapper[4763]: I1205 11:52:59.602337 4763 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7" Dec 05 11:52:59 crc kubenswrapper[4763]: I1205 11:52:59.604098 4763 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="8e8a30ec-98f6-4120-8460-df22a8df42c5" Dec 05 11:52:59 crc kubenswrapper[4763]: I1205 11:52:59.607433 4763 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://a12d86369103492b3e10c7addaec9895c296ec2ee570ac9abdd8ad35c981778b" Dec 05 11:52:59 crc kubenswrapper[4763]: I1205 11:52:59.607460 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 11:53:00 crc kubenswrapper[4763]: I1205 11:53:00.609459 4763 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7" Dec 05 11:53:00 crc kubenswrapper[4763]: I1205 11:53:00.609510 4763 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5d7a37e8-1873-49fb-a9cf-4c7a4f98d2c7" Dec 05 11:53:00 crc kubenswrapper[4763]: I1205 11:53:00.614094 4763 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="8e8a30ec-98f6-4120-8460-df22a8df42c5" Dec 05 11:53:00 crc kubenswrapper[4763]: I1205 11:53:00.784004 4763 scope.go:117] "RemoveContainer" containerID="edf23fa1645c796ea90c151ac3c11513005a1f20a2b3909cbc5f0f399f6ca547" Dec 05 11:53:00 crc kubenswrapper[4763]: I1205 11:53:00.864434 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 11:53:00 crc kubenswrapper[4763]: I1205 11:53:00.864805 4763 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 
192.168.126.11:10257: connect: connection refused" start-of-body= Dec 05 11:53:00 crc kubenswrapper[4763]: I1205 11:53:00.864909 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 05 11:53:01 crc kubenswrapper[4763]: I1205 11:53:01.622721 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-66456c6bb-hcc9n_fe555113-6ec1-4ccb-a29c-7a46458bc380/oauth-openshift/1.log" Dec 05 11:53:01 crc kubenswrapper[4763]: I1205 11:53:01.623075 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" event={"ID":"fe555113-6ec1-4ccb-a29c-7a46458bc380","Type":"ContainerStarted","Data":"63c76ecfe8465c0989c74817f8404c0baffb6b7eaf91935bced99bd35734b03c"} Dec 05 11:53:01 crc kubenswrapper[4763]: I1205 11:53:01.623467 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:53:01 crc kubenswrapper[4763]: I1205 11:53:01.818341 4763 patch_prober.go:28] interesting pod/oauth-openshift-66456c6bb-hcc9n container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.56:6443/healthz\": read tcp 10.217.0.2:60936->10.217.0.56:6443: read: connection reset by peer" start-of-body= Dec 05 11:53:01 crc kubenswrapper[4763]: I1205 11:53:01.818416 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" podUID="fe555113-6ec1-4ccb-a29c-7a46458bc380" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.56:6443/healthz\": read tcp 10.217.0.2:60936->10.217.0.56:6443: read: connection reset by peer" Dec 05 11:53:02 crc kubenswrapper[4763]: I1205 11:53:02.634413 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-66456c6bb-hcc9n_fe555113-6ec1-4ccb-a29c-7a46458bc380/oauth-openshift/2.log" Dec 05 11:53:02 crc kubenswrapper[4763]: I1205 11:53:02.635573 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-66456c6bb-hcc9n_fe555113-6ec1-4ccb-a29c-7a46458bc380/oauth-openshift/1.log" Dec 05 11:53:02 crc kubenswrapper[4763]: I1205 11:53:02.635615 4763 generic.go:334] "Generic (PLEG): container finished" podID="fe555113-6ec1-4ccb-a29c-7a46458bc380" containerID="63c76ecfe8465c0989c74817f8404c0baffb6b7eaf91935bced99bd35734b03c" exitCode=255 Dec 05 11:53:02 crc kubenswrapper[4763]: I1205 11:53:02.635654 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" event={"ID":"fe555113-6ec1-4ccb-a29c-7a46458bc380","Type":"ContainerDied","Data":"63c76ecfe8465c0989c74817f8404c0baffb6b7eaf91935bced99bd35734b03c"} Dec 05 11:53:02 crc kubenswrapper[4763]: I1205 11:53:02.635693 4763 scope.go:117] "RemoveContainer" containerID="edf23fa1645c796ea90c151ac3c11513005a1f20a2b3909cbc5f0f399f6ca547" Dec 05 11:53:02 crc kubenswrapper[4763]: I1205 11:53:02.636288 4763 scope.go:117] "RemoveContainer" containerID="63c76ecfe8465c0989c74817f8404c0baffb6b7eaf91935bced99bd35734b03c" Dec 05 11:53:02 crc kubenswrapper[4763]: E1205 11:53:02.636563 4763 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 20s restarting failed container=oauth-openshift pod=oauth-openshift-66456c6bb-hcc9n_openshift-authentication(fe555113-6ec1-4ccb-a29c-7a46458bc380)\"" pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" podUID="fe555113-6ec1-4ccb-a29c-7a46458bc380" Dec 05 11:53:03 crc kubenswrapper[4763]: I1205 11:53:03.644055 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-66456c6bb-hcc9n_fe555113-6ec1-4ccb-a29c-7a46458bc380/oauth-openshift/2.log" Dec 05 11:53:03 crc kubenswrapper[4763]: I1205 11:53:03.644954 4763 scope.go:117] "RemoveContainer" containerID="63c76ecfe8465c0989c74817f8404c0baffb6b7eaf91935bced99bd35734b03c" Dec 05 11:53:03 crc kubenswrapper[4763]: E1205 11:53:03.645329 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 20s restarting failed container=oauth-openshift pod=oauth-openshift-66456c6bb-hcc9n_openshift-authentication(fe555113-6ec1-4ccb-a29c-7a46458bc380)\"" pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" podUID="fe555113-6ec1-4ccb-a29c-7a46458bc380" Dec 05 11:53:08 crc kubenswrapper[4763]: I1205 11:53:08.755922 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:53:08 crc kubenswrapper[4763]: I1205 11:53:08.757308 4763 scope.go:117] "RemoveContainer" containerID="63c76ecfe8465c0989c74817f8404c0baffb6b7eaf91935bced99bd35734b03c" Dec 05 11:53:08 crc kubenswrapper[4763]: E1205 11:53:08.757720 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 20s restarting failed container=oauth-openshift pod=oauth-openshift-66456c6bb-hcc9n_openshift-authentication(fe555113-6ec1-4ccb-a29c-7a46458bc380)\"" pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" podUID="fe555113-6ec1-4ccb-a29c-7a46458bc380" Dec 05 11:53:08 crc kubenswrapper[4763]: I1205 11:53:08.854333 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 05 11:53:09 crc kubenswrapper[4763]: I1205 11:53:09.382970 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 05 11:53:09 crc kubenswrapper[4763]: I1205 11:53:09.607198 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 05 11:53:10 crc kubenswrapper[4763]: I1205 11:53:10.186947 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 05 11:53:10 crc kubenswrapper[4763]: I1205 11:53:10.453841 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 05 11:53:10 crc kubenswrapper[4763]: I1205 11:53:10.695922 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 05 11:53:10 crc kubenswrapper[4763]: I1205 11:53:10.699860 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 05 11:53:10 crc kubenswrapper[4763]: I1205 11:53:10.774033 4763 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"config" Dec 05 11:53:10 crc kubenswrapper[4763]: I1205 11:53:10.863259 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 05 11:53:10 crc kubenswrapper[4763]: I1205 11:53:10.864656 4763 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 05 11:53:10 crc kubenswrapper[4763]: I1205 11:53:10.864708 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 05 11:53:10 crc kubenswrapper[4763]: I1205 11:53:10.925040 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 05 11:53:10 crc kubenswrapper[4763]: I1205 11:53:10.936171 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 05 11:53:11 crc kubenswrapper[4763]: I1205 11:53:11.675415 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 05 11:53:11 crc kubenswrapper[4763]: I1205 11:53:11.874658 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 05 11:53:11 crc kubenswrapper[4763]: I1205 11:53:11.985740 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 05 11:53:11 crc kubenswrapper[4763]: I1205 11:53:11.995920 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 05 11:53:12 crc kubenswrapper[4763]: I1205 11:53:12.066992 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 05 11:53:12 crc kubenswrapper[4763]: I1205 11:53:12.195211 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 05 11:53:12 crc kubenswrapper[4763]: I1205 11:53:12.380749 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 05 11:53:12 crc kubenswrapper[4763]: I1205 11:53:12.503570 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 05 11:53:12 crc kubenswrapper[4763]: I1205 11:53:12.607577 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 05 11:53:12 crc kubenswrapper[4763]: I1205 11:53:12.807345 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 05 11:53:12 crc kubenswrapper[4763]: I1205 11:53:12.894546 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 05 11:53:12 crc kubenswrapper[4763]: I1205 11:53:12.985373 4763 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 05 11:53:13 crc kubenswrapper[4763]: I1205 11:53:13.015287 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 05 11:53:13 crc kubenswrapper[4763]: I1205 11:53:13.026708 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 05 11:53:13 crc kubenswrapper[4763]: I1205 11:53:13.179610 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 05 11:53:13 crc kubenswrapper[4763]: I1205 11:53:13.183331 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 05 11:53:13 crc kubenswrapper[4763]: I1205 11:53:13.221707 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 05 11:53:13 crc kubenswrapper[4763]: I1205 11:53:13.247197 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 05 11:53:13 crc kubenswrapper[4763]: I1205 11:53:13.270614 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 05 11:53:13 crc kubenswrapper[4763]: I1205 11:53:13.345505 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 05 11:53:13 crc kubenswrapper[4763]: I1205 11:53:13.351262 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 05 11:53:13 crc kubenswrapper[4763]: I1205 11:53:13.378870 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 05 11:53:13 crc kubenswrapper[4763]: I1205 11:53:13.575512 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 05 11:53:13 crc kubenswrapper[4763]: I1205 11:53:13.582395 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 05 11:53:13 crc kubenswrapper[4763]: I1205 11:53:13.777694 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 05 11:53:13 crc kubenswrapper[4763]: I1205 11:53:13.832344 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 05 11:53:13 crc kubenswrapper[4763]: I1205 11:53:13.849630 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 05 11:53:13 crc kubenswrapper[4763]: I1205 11:53:13.928661 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 05 11:53:14 crc kubenswrapper[4763]: I1205 11:53:14.005741 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 05 11:53:14 crc kubenswrapper[4763]: I1205 11:53:14.133611 4763 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 05 11:53:14 crc 
kubenswrapper[4763]: I1205 11:53:14.137728 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-wndr8"] Dec 05 11:53:14 crc kubenswrapper[4763]: I1205 11:53:14.137806 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 11:53:14 crc kubenswrapper[4763]: I1205 11:53:14.142610 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 11:53:14 crc kubenswrapper[4763]: I1205 11:53:14.154246 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=16.154229419 podStartE2EDuration="16.154229419s" podCreationTimestamp="2025-12-05 11:52:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:53:14.152906677 +0000 UTC m=+278.645621420" watchObservedRunningTime="2025-12-05 11:53:14.154229419 +0000 UTC m=+278.646944142" Dec 05 11:53:14 crc kubenswrapper[4763]: I1205 11:53:14.198548 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 05 11:53:14 crc kubenswrapper[4763]: I1205 11:53:14.229114 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 05 11:53:14 crc kubenswrapper[4763]: I1205 11:53:14.245595 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 05 11:53:14 crc kubenswrapper[4763]: I1205 11:53:14.258343 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 05 11:53:14 crc kubenswrapper[4763]: I1205 11:53:14.280722 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 05 11:53:14 crc kubenswrapper[4763]: I1205 11:53:14.281986 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 05 11:53:14 crc kubenswrapper[4763]: I1205 11:53:14.292112 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 05 11:53:14 crc kubenswrapper[4763]: I1205 11:53:14.462245 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 05 11:53:14 crc kubenswrapper[4763]: I1205 11:53:14.467226 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 05 11:53:14 crc kubenswrapper[4763]: I1205 11:53:14.528014 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 05 11:53:14 crc kubenswrapper[4763]: I1205 11:53:14.630986 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 05 11:53:14 crc kubenswrapper[4763]: I1205 11:53:14.654829 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 05 11:53:14 crc kubenswrapper[4763]: I1205 11:53:14.665009 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 05 
11:53:14 crc kubenswrapper[4763]: I1205 11:53:14.797691 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 05 11:53:14 crc kubenswrapper[4763]: I1205 11:53:14.811041 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 05 11:53:14 crc kubenswrapper[4763]: I1205 11:53:14.817290 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 05 11:53:14 crc kubenswrapper[4763]: I1205 11:53:14.860781 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 05 11:53:14 crc kubenswrapper[4763]: I1205 11:53:14.860812 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 05 11:53:14 crc kubenswrapper[4763]: I1205 11:53:14.957332 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 05 11:53:15 crc kubenswrapper[4763]: I1205 11:53:15.093092 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 05 11:53:15 crc kubenswrapper[4763]: I1205 11:53:15.172698 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 05 11:53:15 crc kubenswrapper[4763]: I1205 11:53:15.217502 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 05 11:53:15 crc kubenswrapper[4763]: I1205 11:53:15.234196 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 05 11:53:15 crc kubenswrapper[4763]: I1205 11:53:15.343935 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 05 11:53:15 crc kubenswrapper[4763]: I1205 11:53:15.385505 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 05 11:53:15 crc kubenswrapper[4763]: I1205 11:53:15.433754 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 05 11:53:15 crc kubenswrapper[4763]: I1205 11:53:15.581701 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 05 11:53:15 crc kubenswrapper[4763]: I1205 11:53:15.622142 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 05 11:53:15 crc kubenswrapper[4763]: I1205 11:53:15.691549 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 05 11:53:15 crc kubenswrapper[4763]: I1205 11:53:15.733569 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 05 11:53:15 crc kubenswrapper[4763]: I1205 11:53:15.733692 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 05 11:53:15 crc kubenswrapper[4763]: I1205 
11:53:15.750217 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 05 11:53:15 crc kubenswrapper[4763]: I1205 11:53:15.773654 4763 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 05 11:53:15 crc kubenswrapper[4763]: I1205 11:53:15.798503 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b600b871-2ca7-4ca9-ab49-82a77bf73b6a" path="/var/lib/kubelet/pods/b600b871-2ca7-4ca9-ab49-82a77bf73b6a/volumes" Dec 05 11:53:15 crc kubenswrapper[4763]: I1205 11:53:15.838817 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 05 11:53:15 crc kubenswrapper[4763]: I1205 11:53:15.995168 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 05 11:53:16 crc kubenswrapper[4763]: I1205 11:53:16.024805 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 05 11:53:16 crc kubenswrapper[4763]: I1205 11:53:16.139512 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 05 11:53:16 crc kubenswrapper[4763]: I1205 11:53:16.165243 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 05 11:53:16 crc kubenswrapper[4763]: I1205 11:53:16.249436 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 05 11:53:16 crc kubenswrapper[4763]: I1205 11:53:16.423790 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 05 11:53:16 crc kubenswrapper[4763]: I1205 11:53:16.506425 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 05 11:53:16 crc kubenswrapper[4763]: I1205 11:53:16.515081 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 05 11:53:16 crc kubenswrapper[4763]: I1205 11:53:16.520975 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 05 11:53:16 crc kubenswrapper[4763]: I1205 11:53:16.543966 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 05 11:53:16 crc kubenswrapper[4763]: I1205 11:53:16.616399 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 05 11:53:16 crc kubenswrapper[4763]: I1205 11:53:16.622186 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 05 11:53:16 crc kubenswrapper[4763]: I1205 11:53:16.622858 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 05 11:53:16 crc kubenswrapper[4763]: I1205 11:53:16.636097 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 05 11:53:16 crc kubenswrapper[4763]: I1205 11:53:16.646064 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 05 11:53:16 crc kubenswrapper[4763]: I1205 11:53:16.695110 4763 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 05 11:53:16 crc kubenswrapper[4763]: I1205 11:53:16.695940 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 05 11:53:16 crc kubenswrapper[4763]: I1205 11:53:16.861073 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 05 11:53:16 crc kubenswrapper[4763]: I1205 11:53:16.868402 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 05 11:53:16 crc kubenswrapper[4763]: I1205 11:53:16.895866 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 05 11:53:16 crc kubenswrapper[4763]: I1205 11:53:16.950234 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 05 11:53:16 crc kubenswrapper[4763]: I1205 11:53:16.951149 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 05 11:53:16 crc kubenswrapper[4763]: I1205 11:53:16.951276 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 05 11:53:17 crc kubenswrapper[4763]: I1205 11:53:17.032243 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 05 11:53:17 crc kubenswrapper[4763]: I1205 11:53:17.076838 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 05 11:53:17 crc kubenswrapper[4763]: I1205 11:53:17.091255 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 05 11:53:17 crc kubenswrapper[4763]: I1205 11:53:17.098706 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 05 11:53:17 crc kubenswrapper[4763]: I1205 11:53:17.138338 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 05 11:53:17 crc kubenswrapper[4763]: I1205 11:53:17.162631 4763 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 05 11:53:17 crc kubenswrapper[4763]: I1205 11:53:17.178878 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 05 11:53:17 crc kubenswrapper[4763]: I1205 11:53:17.199171 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 05 11:53:17 crc kubenswrapper[4763]: I1205 11:53:17.226978 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 05 11:53:17 crc kubenswrapper[4763]: I1205 11:53:17.318304 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 05 11:53:17 crc kubenswrapper[4763]: I1205 11:53:17.320472 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 05 11:53:17 crc 
kubenswrapper[4763]: I1205 11:53:17.349291 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 05 11:53:17 crc kubenswrapper[4763]: I1205 11:53:17.429173 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 05 11:53:17 crc kubenswrapper[4763]: I1205 11:53:17.511212 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 05 11:53:17 crc kubenswrapper[4763]: I1205 11:53:17.536811 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 05 11:53:17 crc kubenswrapper[4763]: I1205 11:53:17.599644 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 05 11:53:17 crc kubenswrapper[4763]: I1205 11:53:17.600331 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 05 11:53:17 crc kubenswrapper[4763]: I1205 11:53:17.631218 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 05 11:53:17 crc kubenswrapper[4763]: I1205 11:53:17.638503 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 05 11:53:17 crc kubenswrapper[4763]: I1205 11:53:17.653057 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 05 11:53:17 crc kubenswrapper[4763]: I1205 11:53:17.687320 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 05 11:53:17 crc kubenswrapper[4763]: I1205 11:53:17.695290 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 05 11:53:17 crc kubenswrapper[4763]: I1205 11:53:17.804034 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 05 11:53:17 crc kubenswrapper[4763]: I1205 11:53:17.843481 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:17.875913 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:18.130839 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:18.219177 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:18.252139 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:18.252139 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:18.308097 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 05 11:53:20 crc 
kubenswrapper[4763]: I1205 11:53:18.308218 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:18.317528 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:18.448342 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:18.667041 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:18.693013 4763 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:18.706884 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:18.835955 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:18.889327 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:18.929425 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:19.032198 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:19.069261 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:19.107874 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:19.114830 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:19.171481 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:19.280450 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:19.301663 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:19.303734 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:19.334948 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:19.426611 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:19.479191 4763 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-network-console"/"networking-console-plugin-cert" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:19.559115 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:19.566897 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:19.576029 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:19.581356 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:19.598783 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:19.645972 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:19.748328 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:19.935537 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:19.985674 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:20.047229 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:20.106312 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:20.179990 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:20.183227 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:20.239603 4763 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:20.257648 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:20.276489 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:20.344192 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:20.408088 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:20.473786 4763 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:20.506878 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:20.579980 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:20.693561 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:20.699334 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:20.770801 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:20.784176 4763 scope.go:117] "RemoveContainer" containerID="63c76ecfe8465c0989c74817f8404c0baffb6b7eaf91935bced99bd35734b03c" Dec 05 11:53:20 crc kubenswrapper[4763]: E1205 11:53:20.784374 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 20s restarting failed container=oauth-openshift pod=oauth-openshift-66456c6bb-hcc9n_openshift-authentication(fe555113-6ec1-4ccb-a29c-7a46458bc380)\"" pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" podUID="fe555113-6ec1-4ccb-a29c-7a46458bc380" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:20.817935 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:20.843429 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:20.849430 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:20.864664 4763 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:20.864724 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:20.865059 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:20.865565 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"1665918a7177c8832e3ec85a5953ae68224eedb3d15205a90dcb78794078c366"} 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:20.865662 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://1665918a7177c8832e3ec85a5953ae68224eedb3d15205a90dcb78794078c366" gracePeriod=30 Dec 05 11:53:20 crc kubenswrapper[4763]: I1205 11:53:20.971071 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 05 11:53:21 crc kubenswrapper[4763]: I1205 11:53:21.000009 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 05 11:53:21 crc kubenswrapper[4763]: I1205 11:53:21.032153 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 05 11:53:21 crc kubenswrapper[4763]: I1205 11:53:21.081896 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 05 11:53:21 crc kubenswrapper[4763]: I1205 11:53:21.124549 4763 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 05 11:53:21 crc kubenswrapper[4763]: I1205 11:53:21.124891 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://94fbc399237786f767b20ed3d30f2b2451e4c0992b5dee4baea970eab6822fb5" gracePeriod=5 Dec 05 11:53:21 crc kubenswrapper[4763]: I1205 11:53:21.144478 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 05 11:53:21 crc kubenswrapper[4763]: I1205 11:53:21.176959 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 05 11:53:21 crc kubenswrapper[4763]: I1205 11:53:21.215694 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 05 11:53:21 crc kubenswrapper[4763]: I1205 11:53:21.274603 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 05 11:53:21 crc kubenswrapper[4763]: I1205 11:53:21.287807 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 05 11:53:21 crc kubenswrapper[4763]: I1205 11:53:21.338929 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 05 11:53:21 crc kubenswrapper[4763]: I1205 11:53:21.356351 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 05 11:53:21 crc kubenswrapper[4763]: I1205 11:53:21.378068 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 05 11:53:21 crc kubenswrapper[4763]: I1205 11:53:21.402590 4763 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 05 11:53:21 crc kubenswrapper[4763]: I1205 11:53:21.425795 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 05 11:53:21 crc kubenswrapper[4763]: I1205 11:53:21.452157 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 05 11:53:21 crc kubenswrapper[4763]: I1205 11:53:21.466079 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 05 11:53:21 crc kubenswrapper[4763]: I1205 11:53:21.475825 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 05 11:53:21 crc kubenswrapper[4763]: I1205 11:53:21.530637 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 05 11:53:21 crc kubenswrapper[4763]: I1205 11:53:21.572237 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 05 11:53:21 crc kubenswrapper[4763]: I1205 11:53:21.676992 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 05 11:53:21 crc kubenswrapper[4763]: I1205 11:53:21.697477 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 05 11:53:21 crc kubenswrapper[4763]: I1205 11:53:21.713751 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 05 11:53:21 crc kubenswrapper[4763]: I1205 11:53:21.734734 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 05 11:53:21 crc kubenswrapper[4763]: I1205 11:53:21.869728 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 05 11:53:21 crc kubenswrapper[4763]: I1205 11:53:21.902014 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 05 11:53:22 crc kubenswrapper[4763]: I1205 11:53:22.008835 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 05 11:53:22 crc kubenswrapper[4763]: I1205 11:53:22.083592 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 05 11:53:22 crc kubenswrapper[4763]: I1205 11:53:22.171003 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 05 11:53:22 crc kubenswrapper[4763]: I1205 11:53:22.172951 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 05 11:53:22 crc kubenswrapper[4763]: I1205 11:53:22.226796 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 05 11:53:22 crc kubenswrapper[4763]: I1205 11:53:22.245252 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 05 
11:53:22 crc kubenswrapper[4763]: I1205 11:53:22.258055 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 05 11:53:22 crc kubenswrapper[4763]: I1205 11:53:22.321841 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 05 11:53:22 crc kubenswrapper[4763]: I1205 11:53:22.387877 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 05 11:53:22 crc kubenswrapper[4763]: I1205 11:53:22.527192 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 05 11:53:22 crc kubenswrapper[4763]: I1205 11:53:22.550877 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 05 11:53:22 crc kubenswrapper[4763]: I1205 11:53:22.718319 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 05 11:53:22 crc kubenswrapper[4763]: I1205 11:53:22.930021 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 05 11:53:22 crc kubenswrapper[4763]: I1205 11:53:22.950604 4763 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 05 11:53:22 crc kubenswrapper[4763]: I1205 11:53:22.969450 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 05 11:53:23 crc kubenswrapper[4763]: I1205 11:53:23.016492 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 05 11:53:23 crc kubenswrapper[4763]: I1205 11:53:23.297566 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 05 11:53:23 crc kubenswrapper[4763]: I1205 11:53:23.313789 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 05 11:53:23 crc kubenswrapper[4763]: I1205 11:53:23.425812 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 05 11:53:23 crc kubenswrapper[4763]: I1205 11:53:23.481691 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 05 11:53:23 crc kubenswrapper[4763]: I1205 11:53:23.532885 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 05 11:53:23 crc kubenswrapper[4763]: I1205 11:53:23.546750 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 05 11:53:23 crc kubenswrapper[4763]: I1205 11:53:23.588622 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 05 11:53:23 crc kubenswrapper[4763]: I1205 11:53:23.647941 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 05 11:53:23 crc kubenswrapper[4763]: I1205 11:53:23.809304 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 05 11:53:23 crc kubenswrapper[4763]: I1205 11:53:23.865884 4763 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 05 11:53:23 crc kubenswrapper[4763]: I1205 11:53:23.927528 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 05 11:53:24 crc kubenswrapper[4763]: I1205 11:53:24.001062 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 05 11:53:24 crc kubenswrapper[4763]: I1205 11:53:24.099774 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 05 11:53:24 crc kubenswrapper[4763]: I1205 11:53:24.100345 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 05 11:53:24 crc kubenswrapper[4763]: I1205 11:53:24.107435 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 05 11:53:24 crc kubenswrapper[4763]: I1205 11:53:24.141154 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 05 11:53:24 crc kubenswrapper[4763]: I1205 11:53:24.183947 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 05 11:53:24 crc kubenswrapper[4763]: I1205 11:53:24.240949 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 05 11:53:24 crc kubenswrapper[4763]: I1205 11:53:24.267239 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 05 11:53:24 crc kubenswrapper[4763]: I1205 11:53:24.381126 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 05 11:53:24 crc kubenswrapper[4763]: I1205 11:53:24.452263 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 05 11:53:24 crc kubenswrapper[4763]: I1205 11:53:24.515303 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 05 11:53:24 crc kubenswrapper[4763]: I1205 11:53:24.587666 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 05 11:53:24 crc kubenswrapper[4763]: I1205 11:53:24.874225 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 05 11:53:24 crc kubenswrapper[4763]: I1205 11:53:24.912188 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 05 11:53:25 crc kubenswrapper[4763]: I1205 11:53:25.216232 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 05 11:53:25 crc kubenswrapper[4763]: I1205 11:53:25.610726 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 05 11:53:26 crc kubenswrapper[4763]: I1205 11:53:26.126744 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 05 11:53:26 crc kubenswrapper[4763]: I1205 11:53:26.209607 4763 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 05 11:53:26 crc kubenswrapper[4763]: I1205 11:53:26.422362 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 05 11:53:26 crc kubenswrapper[4763]: I1205 11:53:26.495683 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 05 11:53:26 crc kubenswrapper[4763]: I1205 11:53:26.702287 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 05 11:53:26 crc kubenswrapper[4763]: I1205 11:53:26.702739 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 11:53:26 crc kubenswrapper[4763]: I1205 11:53:26.749138 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 05 11:53:26 crc kubenswrapper[4763]: I1205 11:53:26.785538 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 05 11:53:26 crc kubenswrapper[4763]: I1205 11:53:26.785604 4763 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="94fbc399237786f767b20ed3d30f2b2451e4c0992b5dee4baea970eab6822fb5" exitCode=137 Dec 05 11:53:26 crc kubenswrapper[4763]: I1205 11:53:26.785651 4763 scope.go:117] "RemoveContainer" containerID="94fbc399237786f767b20ed3d30f2b2451e4c0992b5dee4baea970eab6822fb5" Dec 05 11:53:26 crc kubenswrapper[4763]: I1205 11:53:26.785652 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 11:53:26 crc kubenswrapper[4763]: I1205 11:53:26.804458 4763 scope.go:117] "RemoveContainer" containerID="94fbc399237786f767b20ed3d30f2b2451e4c0992b5dee4baea970eab6822fb5" Dec 05 11:53:26 crc kubenswrapper[4763]: E1205 11:53:26.804992 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94fbc399237786f767b20ed3d30f2b2451e4c0992b5dee4baea970eab6822fb5\": container with ID starting with 94fbc399237786f767b20ed3d30f2b2451e4c0992b5dee4baea970eab6822fb5 not found: ID does not exist" containerID="94fbc399237786f767b20ed3d30f2b2451e4c0992b5dee4baea970eab6822fb5" Dec 05 11:53:26 crc kubenswrapper[4763]: I1205 11:53:26.805053 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94fbc399237786f767b20ed3d30f2b2451e4c0992b5dee4baea970eab6822fb5"} err="failed to get container status \"94fbc399237786f767b20ed3d30f2b2451e4c0992b5dee4baea970eab6822fb5\": rpc error: code = NotFound desc = could not find container \"94fbc399237786f767b20ed3d30f2b2451e4c0992b5dee4baea970eab6822fb5\": container with ID starting with 94fbc399237786f767b20ed3d30f2b2451e4c0992b5dee4baea970eab6822fb5 not found: ID does not exist" Dec 05 11:53:26 crc kubenswrapper[4763]: I1205 11:53:26.824667 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 11:53:26 crc kubenswrapper[4763]: I1205 11:53:26.824728 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 11:53:26 crc kubenswrapper[4763]: I1205 11:53:26.824748 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 11:53:26 crc kubenswrapper[4763]: I1205 11:53:26.824797 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 11:53:26 crc kubenswrapper[4763]: I1205 11:53:26.824883 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 11:53:26 crc kubenswrapper[4763]: I1205 11:53:26.824939 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 11:53:26 crc kubenswrapper[4763]: I1205 11:53:26.824987 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 11:53:26 crc kubenswrapper[4763]: I1205 11:53:26.825006 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 11:53:26 crc kubenswrapper[4763]: I1205 11:53:26.825063 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 11:53:26 crc kubenswrapper[4763]: I1205 11:53:26.825284 4763 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 05 11:53:26 crc kubenswrapper[4763]: I1205 11:53:26.825299 4763 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 05 11:53:26 crc kubenswrapper[4763]: I1205 11:53:26.825310 4763 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 05 11:53:26 crc kubenswrapper[4763]: I1205 11:53:26.825318 4763 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 05 11:53:26 crc kubenswrapper[4763]: I1205 11:53:26.835269 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 11:53:26 crc kubenswrapper[4763]: I1205 11:53:26.926330 4763 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 05 11:53:26 crc kubenswrapper[4763]: I1205 11:53:26.931276 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 05 11:53:27 crc kubenswrapper[4763]: I1205 11:53:27.016968 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 05 11:53:27 crc kubenswrapper[4763]: I1205 11:53:27.438999 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 05 11:53:27 crc kubenswrapper[4763]: I1205 11:53:27.515897 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 05 11:53:27 crc kubenswrapper[4763]: I1205 11:53:27.795564 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 05 11:53:35 crc kubenswrapper[4763]: I1205 11:53:35.791697 4763 scope.go:117] "RemoveContainer" containerID="63c76ecfe8465c0989c74817f8404c0baffb6b7eaf91935bced99bd35734b03c" Dec 05 11:53:36 crc kubenswrapper[4763]: I1205 11:53:36.844967 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-66456c6bb-hcc9n_fe555113-6ec1-4ccb-a29c-7a46458bc380/oauth-openshift/2.log" Dec 05 11:53:36 crc kubenswrapper[4763]: I1205 11:53:36.845348 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" event={"ID":"fe555113-6ec1-4ccb-a29c-7a46458bc380","Type":"ContainerStarted","Data":"ced77e9a46f5c7d268d840c51c07f67e7526e9c383e553f439c1b59b143d53bf"} Dec 05 11:53:36 crc kubenswrapper[4763]: I1205 11:53:36.845791 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:53:36 crc kubenswrapper[4763]: I1205 11:53:36.851268 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" Dec 05 11:53:36 crc kubenswrapper[4763]: I1205 11:53:36.876241 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-66456c6bb-hcc9n" podStartSLOduration=83.876218745 podStartE2EDuration="1m23.876218745s" podCreationTimestamp="2025-12-05 11:52:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:53:01.647328598 +0000 UTC m=+266.140043341" watchObservedRunningTime="2025-12-05 11:53:36.876218745 +0000 UTC m=+301.368933508" Dec 05 11:53:39 crc kubenswrapper[4763]: I1205 11:53:39.867751 4763 generic.go:334] "Generic (PLEG): container finished" podID="585f1b1b-1d55-4c5d-be08-5af770eec641" containerID="89d680a3a26455a407ccf6650510bff4201a6544c9a577cc6bbfcbfdb05f6af3" exitCode=0 Dec 05 11:53:39 crc kubenswrapper[4763]: I1205 11:53:39.868152 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hjvs8" 
event={"ID":"585f1b1b-1d55-4c5d-be08-5af770eec641","Type":"ContainerDied","Data":"89d680a3a26455a407ccf6650510bff4201a6544c9a577cc6bbfcbfdb05f6af3"} Dec 05 11:53:39 crc kubenswrapper[4763]: I1205 11:53:39.869016 4763 scope.go:117] "RemoveContainer" containerID="89d680a3a26455a407ccf6650510bff4201a6544c9a577cc6bbfcbfdb05f6af3" Dec 05 11:53:40 crc kubenswrapper[4763]: I1205 11:53:40.876446 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hjvs8" event={"ID":"585f1b1b-1d55-4c5d-be08-5af770eec641","Type":"ContainerStarted","Data":"c1e69145d090fdc2ba457564c49913ef9b0bf94df797ab7c83bfe7fdbe226cfa"} Dec 05 11:53:40 crc kubenswrapper[4763]: I1205 11:53:40.877156 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-hjvs8" Dec 05 11:53:40 crc kubenswrapper[4763]: I1205 11:53:40.881778 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-hjvs8" Dec 05 11:53:51 crc kubenswrapper[4763]: I1205 11:53:51.937406 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 05 11:53:51 crc kubenswrapper[4763]: I1205 11:53:51.939426 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 05 11:53:51 crc kubenswrapper[4763]: I1205 11:53:51.939469 4763 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="1665918a7177c8832e3ec85a5953ae68224eedb3d15205a90dcb78794078c366" exitCode=137 Dec 05 11:53:51 crc kubenswrapper[4763]: I1205 11:53:51.939496 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"1665918a7177c8832e3ec85a5953ae68224eedb3d15205a90dcb78794078c366"} Dec 05 11:53:51 crc kubenswrapper[4763]: I1205 11:53:51.939528 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9f95eec7406c73ebc3970029d545c91c61276f39b622f50a0d469a5cc572744e"} Dec 05 11:53:51 crc kubenswrapper[4763]: I1205 11:53:51.939544 4763 scope.go:117] "RemoveContainer" containerID="0370cc7bfd0ce95636924667b4e82dfffd0eee7f07b78d683b386bfee5dc47b3" Dec 05 11:53:52 crc kubenswrapper[4763]: I1205 11:53:52.946839 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 05 11:53:57 crc kubenswrapper[4763]: I1205 11:53:57.037663 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 11:54:00 crc kubenswrapper[4763]: I1205 11:54:00.864825 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 11:54:00 crc kubenswrapper[4763]: I1205 11:54:00.868467 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 11:54:07 crc 
kubenswrapper[4763]: I1205 11:54:07.045681 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 11:54:07 crc kubenswrapper[4763]: I1205 11:54:07.544672 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 11:54:07 crc kubenswrapper[4763]: I1205 11:54:07.544781 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 11:54:08 crc kubenswrapper[4763]: I1205 11:54:08.852456 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7h6vz"] Dec 05 11:54:08 crc kubenswrapper[4763]: I1205 11:54:08.852845 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7h6vz" podUID="96fb350e-8c37-4e8d-8233-8c3ecfac9935" containerName="route-controller-manager" containerID="cri-o://011a3f65814b81db0569e60aa4b8cc8471e8a166f23aba7dd26795af9aa8bc32" gracePeriod=30 Dec 05 11:54:08 crc kubenswrapper[4763]: I1205 11:54:08.865426 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sptcn"] Dec 05 11:54:08 crc kubenswrapper[4763]: I1205 11:54:08.865671 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-sptcn" podUID="ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88" containerName="controller-manager" containerID="cri-o://528270db66d5934466aecbc226025f50d5c67f644e02ab402eb88c67b09d7130" gracePeriod=30 Dec 05 11:54:09 crc kubenswrapper[4763]: I1205 11:54:09.033610 4763 generic.go:334] "Generic (PLEG): container finished" podID="ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88" containerID="528270db66d5934466aecbc226025f50d5c67f644e02ab402eb88c67b09d7130" exitCode=0 Dec 05 11:54:09 crc kubenswrapper[4763]: I1205 11:54:09.033810 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sptcn" event={"ID":"ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88","Type":"ContainerDied","Data":"528270db66d5934466aecbc226025f50d5c67f644e02ab402eb88c67b09d7130"} Dec 05 11:54:09 crc kubenswrapper[4763]: I1205 11:54:09.036300 4763 generic.go:334] "Generic (PLEG): container finished" podID="96fb350e-8c37-4e8d-8233-8c3ecfac9935" containerID="011a3f65814b81db0569e60aa4b8cc8471e8a166f23aba7dd26795af9aa8bc32" exitCode=0 Dec 05 11:54:09 crc kubenswrapper[4763]: I1205 11:54:09.036361 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7h6vz" event={"ID":"96fb350e-8c37-4e8d-8233-8c3ecfac9935","Type":"ContainerDied","Data":"011a3f65814b81db0569e60aa4b8cc8471e8a166f23aba7dd26795af9aa8bc32"} Dec 05 11:54:09 crc kubenswrapper[4763]: I1205 11:54:09.214626 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sptcn" Dec 05 11:54:09 crc kubenswrapper[4763]: I1205 11:54:09.282880 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88-serving-cert\") pod \"ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88\" (UID: \"ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88\") " Dec 05 11:54:09 crc kubenswrapper[4763]: I1205 11:54:09.283177 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88-proxy-ca-bundles\") pod \"ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88\" (UID: \"ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88\") " Dec 05 11:54:09 crc kubenswrapper[4763]: I1205 11:54:09.283203 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2c5\" (UniqueName: \"kubernetes.io/projected/ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88-kube-api-access-bf2c5\") pod \"ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88\" (UID: \"ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88\") " Dec 05 11:54:09 crc kubenswrapper[4763]: I1205 11:54:09.283240 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88-client-ca\") pod \"ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88\" (UID: \"ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88\") " Dec 05 11:54:09 crc kubenswrapper[4763]: I1205 11:54:09.283306 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88-config\") pod \"ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88\" (UID: \"ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88\") " Dec 05 11:54:09 crc kubenswrapper[4763]: I1205 11:54:09.284077 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88-config" (OuterVolumeSpecName: "config") pod "ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88" (UID: "ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:54:09 crc kubenswrapper[4763]: I1205 11:54:09.284480 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88-client-ca" (OuterVolumeSpecName: "client-ca") pod "ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88" (UID: "ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:54:09 crc kubenswrapper[4763]: I1205 11:54:09.284830 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88" (UID: "ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:54:09 crc kubenswrapper[4763]: I1205 11:54:09.289183 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88" (UID: "ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:54:09 crc kubenswrapper[4763]: I1205 11:54:09.289290 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88-kube-api-access-bf2c5" (OuterVolumeSpecName: "kube-api-access-bf2c5") pod "ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88" (UID: "ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88"). InnerVolumeSpecName "kube-api-access-bf2c5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:54:09 crc kubenswrapper[4763]: I1205 11:54:09.301684 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7h6vz" Dec 05 11:54:09 crc kubenswrapper[4763]: I1205 11:54:09.383838 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4hkr\" (UniqueName: \"kubernetes.io/projected/96fb350e-8c37-4e8d-8233-8c3ecfac9935-kube-api-access-k4hkr\") pod \"96fb350e-8c37-4e8d-8233-8c3ecfac9935\" (UID: \"96fb350e-8c37-4e8d-8233-8c3ecfac9935\") " Dec 05 11:54:09 crc kubenswrapper[4763]: I1205 11:54:09.383920 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96fb350e-8c37-4e8d-8233-8c3ecfac9935-serving-cert\") pod \"96fb350e-8c37-4e8d-8233-8c3ecfac9935\" (UID: \"96fb350e-8c37-4e8d-8233-8c3ecfac9935\") " Dec 05 11:54:09 crc kubenswrapper[4763]: I1205 11:54:09.383941 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96fb350e-8c37-4e8d-8233-8c3ecfac9935-config\") pod \"96fb350e-8c37-4e8d-8233-8c3ecfac9935\" (UID: \"96fb350e-8c37-4e8d-8233-8c3ecfac9935\") " Dec 05 11:54:09 crc kubenswrapper[4763]: I1205 11:54:09.383959 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/96fb350e-8c37-4e8d-8233-8c3ecfac9935-client-ca\") pod \"96fb350e-8c37-4e8d-8233-8c3ecfac9935\" (UID: \"96fb350e-8c37-4e8d-8233-8c3ecfac9935\") " Dec 05 11:54:09 crc kubenswrapper[4763]: I1205 11:54:09.384176 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 11:54:09 crc kubenswrapper[4763]: I1205 11:54:09.384191 4763 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 05 11:54:09 crc kubenswrapper[4763]: I1205 11:54:09.384205 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2c5\" (UniqueName: \"kubernetes.io/projected/ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88-kube-api-access-bf2c5\") on node \"crc\" DevicePath \"\"" Dec 05 11:54:09 crc kubenswrapper[4763]: I1205 11:54:09.384215 4763 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 11:54:09 crc kubenswrapper[4763]: I1205 11:54:09.384222 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88-config\") on node \"crc\" DevicePath \"\"" Dec 05 11:54:09 crc kubenswrapper[4763]: I1205 11:54:09.384945 4763 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96fb350e-8c37-4e8d-8233-8c3ecfac9935-client-ca" (OuterVolumeSpecName: "client-ca") pod "96fb350e-8c37-4e8d-8233-8c3ecfac9935" (UID: "96fb350e-8c37-4e8d-8233-8c3ecfac9935"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:54:09 crc kubenswrapper[4763]: I1205 11:54:09.385004 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96fb350e-8c37-4e8d-8233-8c3ecfac9935-config" (OuterVolumeSpecName: "config") pod "96fb350e-8c37-4e8d-8233-8c3ecfac9935" (UID: "96fb350e-8c37-4e8d-8233-8c3ecfac9935"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 11:54:09 crc kubenswrapper[4763]: I1205 11:54:09.387385 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96fb350e-8c37-4e8d-8233-8c3ecfac9935-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "96fb350e-8c37-4e8d-8233-8c3ecfac9935" (UID: "96fb350e-8c37-4e8d-8233-8c3ecfac9935"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 11:54:09 crc kubenswrapper[4763]: I1205 11:54:09.387471 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96fb350e-8c37-4e8d-8233-8c3ecfac9935-kube-api-access-k4hkr" (OuterVolumeSpecName: "kube-api-access-k4hkr") pod "96fb350e-8c37-4e8d-8233-8c3ecfac9935" (UID: "96fb350e-8c37-4e8d-8233-8c3ecfac9935"). InnerVolumeSpecName "kube-api-access-k4hkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 11:54:09 crc kubenswrapper[4763]: I1205 11:54:09.485438 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4hkr\" (UniqueName: \"kubernetes.io/projected/96fb350e-8c37-4e8d-8233-8c3ecfac9935-kube-api-access-k4hkr\") on node \"crc\" DevicePath \"\"" Dec 05 11:54:09 crc kubenswrapper[4763]: I1205 11:54:09.485475 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96fb350e-8c37-4e8d-8233-8c3ecfac9935-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 11:54:09 crc kubenswrapper[4763]: I1205 11:54:09.485488 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96fb350e-8c37-4e8d-8233-8c3ecfac9935-config\") on node \"crc\" DevicePath \"\"" Dec 05 11:54:09 crc kubenswrapper[4763]: I1205 11:54:09.485498 4763 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/96fb350e-8c37-4e8d-8233-8c3ecfac9935-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.046644 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sptcn" Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.046621 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sptcn" event={"ID":"ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88","Type":"ContainerDied","Data":"be5914023c699fd9ff536e4aa70785ecf257da92fd1ed9f218fa70f7f6631010"} Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.047866 4763 scope.go:117] "RemoveContainer" containerID="528270db66d5934466aecbc226025f50d5c67f644e02ab402eb88c67b09d7130" Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.048919 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7h6vz" event={"ID":"96fb350e-8c37-4e8d-8233-8c3ecfac9935","Type":"ContainerDied","Data":"1dd15b883f8ea4c7d315883fb5c3fea9cc2bef47b70d343d2e332e7d90b6452f"} Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.049003 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7h6vz" Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.066874 4763 scope.go:117] "RemoveContainer" containerID="011a3f65814b81db0569e60aa4b8cc8471e8a166f23aba7dd26795af9aa8bc32" Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.071409 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sptcn"] Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.077891 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sptcn"] Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.084890 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7h6vz"] Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.115241 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7h6vz"] Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.322251 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-95b665d9-4fsv4"] Dec 05 11:54:10 crc kubenswrapper[4763]: E1205 11:54:10.322646 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.322674 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 05 11:54:10 crc kubenswrapper[4763]: E1205 11:54:10.322696 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96fb350e-8c37-4e8d-8233-8c3ecfac9935" containerName="route-controller-manager" Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.322709 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="96fb350e-8c37-4e8d-8233-8c3ecfac9935" containerName="route-controller-manager" Dec 05 11:54:10 crc kubenswrapper[4763]: E1205 11:54:10.322727 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88" containerName="controller-manager" Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.322740 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88" containerName="controller-manager" Dec 05 
11:54:10 crc kubenswrapper[4763]: E1205 11:54:10.322785 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d19b7aa7-7fea-483f-93ba-2278dd8c9ee8" containerName="installer"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.322797 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d19b7aa7-7fea-483f-93ba-2278dd8c9ee8" containerName="installer"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.322996 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.323021 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d19b7aa7-7fea-483f-93ba-2278dd8c9ee8" containerName="installer"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.323048 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="96fb350e-8c37-4e8d-8233-8c3ecfac9935" containerName="route-controller-manager"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.323063 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88" containerName="controller-manager"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.323691 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-95b665d9-4fsv4"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.325782 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.326222 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.326287 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.326544 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.326578 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.326881 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.329341 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6ccdd5cd9-zlzvg"]
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.330134 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6ccdd5cd9-zlzvg"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.336371 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.339551 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-95b665d9-4fsv4"]
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.346168 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.346345 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.346377 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.346408 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.346607 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.346616 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.359166 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6ccdd5cd9-zlzvg"]
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.396580 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98bc7969-bc68-4339-b02b-75c295a815a9-serving-cert\") pod \"route-controller-manager-6ccdd5cd9-zlzvg\" (UID: \"98bc7969-bc68-4339-b02b-75c295a815a9\") " pod="openshift-route-controller-manager/route-controller-manager-6ccdd5cd9-zlzvg"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.396636 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98bc7969-bc68-4339-b02b-75c295a815a9-config\") pod \"route-controller-manager-6ccdd5cd9-zlzvg\" (UID: \"98bc7969-bc68-4339-b02b-75c295a815a9\") " pod="openshift-route-controller-manager/route-controller-manager-6ccdd5cd9-zlzvg"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.396696 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvjkz\" (UniqueName: \"kubernetes.io/projected/98bc7969-bc68-4339-b02b-75c295a815a9-kube-api-access-fvjkz\") pod \"route-controller-manager-6ccdd5cd9-zlzvg\" (UID: \"98bc7969-bc68-4339-b02b-75c295a815a9\") " pod="openshift-route-controller-manager/route-controller-manager-6ccdd5cd9-zlzvg"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.396736 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98bc7969-bc68-4339-b02b-75c295a815a9-client-ca\") pod \"route-controller-manager-6ccdd5cd9-zlzvg\" (UID: \"98bc7969-bc68-4339-b02b-75c295a815a9\") " pod="openshift-route-controller-manager/route-controller-manager-6ccdd5cd9-zlzvg"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.396833 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b137ba1-1e02-4a02-aba9-aa87d720f889-client-ca\") pod \"controller-manager-95b665d9-4fsv4\" (UID: \"4b137ba1-1e02-4a02-aba9-aa87d720f889\") " pod="openshift-controller-manager/controller-manager-95b665d9-4fsv4"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.396864 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b137ba1-1e02-4a02-aba9-aa87d720f889-config\") pod \"controller-manager-95b665d9-4fsv4\" (UID: \"4b137ba1-1e02-4a02-aba9-aa87d720f889\") " pod="openshift-controller-manager/controller-manager-95b665d9-4fsv4"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.396893 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b137ba1-1e02-4a02-aba9-aa87d720f889-proxy-ca-bundles\") pod \"controller-manager-95b665d9-4fsv4\" (UID: \"4b137ba1-1e02-4a02-aba9-aa87d720f889\") " pod="openshift-controller-manager/controller-manager-95b665d9-4fsv4"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.396969 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn6dr\" (UniqueName: \"kubernetes.io/projected/4b137ba1-1e02-4a02-aba9-aa87d720f889-kube-api-access-cn6dr\") pod \"controller-manager-95b665d9-4fsv4\" (UID: \"4b137ba1-1e02-4a02-aba9-aa87d720f889\") " pod="openshift-controller-manager/controller-manager-95b665d9-4fsv4"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.397003 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b137ba1-1e02-4a02-aba9-aa87d720f889-serving-cert\") pod \"controller-manager-95b665d9-4fsv4\" (UID: \"4b137ba1-1e02-4a02-aba9-aa87d720f889\") " pod="openshift-controller-manager/controller-manager-95b665d9-4fsv4"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.497974 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98bc7969-bc68-4339-b02b-75c295a815a9-serving-cert\") pod \"route-controller-manager-6ccdd5cd9-zlzvg\" (UID: \"98bc7969-bc68-4339-b02b-75c295a815a9\") " pod="openshift-route-controller-manager/route-controller-manager-6ccdd5cd9-zlzvg"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.498030 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98bc7969-bc68-4339-b02b-75c295a815a9-config\") pod \"route-controller-manager-6ccdd5cd9-zlzvg\" (UID: \"98bc7969-bc68-4339-b02b-75c295a815a9\") " pod="openshift-route-controller-manager/route-controller-manager-6ccdd5cd9-zlzvg"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.498053 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvjkz\" (UniqueName: \"kubernetes.io/projected/98bc7969-bc68-4339-b02b-75c295a815a9-kube-api-access-fvjkz\") pod \"route-controller-manager-6ccdd5cd9-zlzvg\" (UID: \"98bc7969-bc68-4339-b02b-75c295a815a9\") " pod="openshift-route-controller-manager/route-controller-manager-6ccdd5cd9-zlzvg"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.498078 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b137ba1-1e02-4a02-aba9-aa87d720f889-client-ca\") pod \"controller-manager-95b665d9-4fsv4\" (UID: \"4b137ba1-1e02-4a02-aba9-aa87d720f889\") " pod="openshift-controller-manager/controller-manager-95b665d9-4fsv4"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.498095 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98bc7969-bc68-4339-b02b-75c295a815a9-client-ca\") pod \"route-controller-manager-6ccdd5cd9-zlzvg\" (UID: \"98bc7969-bc68-4339-b02b-75c295a815a9\") " pod="openshift-route-controller-manager/route-controller-manager-6ccdd5cd9-zlzvg"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.498116 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b137ba1-1e02-4a02-aba9-aa87d720f889-config\") pod \"controller-manager-95b665d9-4fsv4\" (UID: \"4b137ba1-1e02-4a02-aba9-aa87d720f889\") " pod="openshift-controller-manager/controller-manager-95b665d9-4fsv4"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.498147 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b137ba1-1e02-4a02-aba9-aa87d720f889-proxy-ca-bundles\") pod \"controller-manager-95b665d9-4fsv4\" (UID: \"4b137ba1-1e02-4a02-aba9-aa87d720f889\") " pod="openshift-controller-manager/controller-manager-95b665d9-4fsv4"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.498190 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cn6dr\" (UniqueName: \"kubernetes.io/projected/4b137ba1-1e02-4a02-aba9-aa87d720f889-kube-api-access-cn6dr\") pod \"controller-manager-95b665d9-4fsv4\" (UID: \"4b137ba1-1e02-4a02-aba9-aa87d720f889\") " pod="openshift-controller-manager/controller-manager-95b665d9-4fsv4"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.498216 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b137ba1-1e02-4a02-aba9-aa87d720f889-serving-cert\") pod \"controller-manager-95b665d9-4fsv4\" (UID: \"4b137ba1-1e02-4a02-aba9-aa87d720f889\") " pod="openshift-controller-manager/controller-manager-95b665d9-4fsv4"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.499080 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b137ba1-1e02-4a02-aba9-aa87d720f889-client-ca\") pod \"controller-manager-95b665d9-4fsv4\" (UID: \"4b137ba1-1e02-4a02-aba9-aa87d720f889\") " pod="openshift-controller-manager/controller-manager-95b665d9-4fsv4"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.499669 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98bc7969-bc68-4339-b02b-75c295a815a9-config\") pod \"route-controller-manager-6ccdd5cd9-zlzvg\" (UID: \"98bc7969-bc68-4339-b02b-75c295a815a9\") " pod="openshift-route-controller-manager/route-controller-manager-6ccdd5cd9-zlzvg"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.499922 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98bc7969-bc68-4339-b02b-75c295a815a9-client-ca\") pod \"route-controller-manager-6ccdd5cd9-zlzvg\" (UID: \"98bc7969-bc68-4339-b02b-75c295a815a9\") " pod="openshift-route-controller-manager/route-controller-manager-6ccdd5cd9-zlzvg"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.500112 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b137ba1-1e02-4a02-aba9-aa87d720f889-proxy-ca-bundles\") pod \"controller-manager-95b665d9-4fsv4\" (UID: \"4b137ba1-1e02-4a02-aba9-aa87d720f889\") " pod="openshift-controller-manager/controller-manager-95b665d9-4fsv4"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.501060 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b137ba1-1e02-4a02-aba9-aa87d720f889-config\") pod \"controller-manager-95b665d9-4fsv4\" (UID: \"4b137ba1-1e02-4a02-aba9-aa87d720f889\") " pod="openshift-controller-manager/controller-manager-95b665d9-4fsv4"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.503230 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98bc7969-bc68-4339-b02b-75c295a815a9-serving-cert\") pod \"route-controller-manager-6ccdd5cd9-zlzvg\" (UID: \"98bc7969-bc68-4339-b02b-75c295a815a9\") " pod="openshift-route-controller-manager/route-controller-manager-6ccdd5cd9-zlzvg"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.503295 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b137ba1-1e02-4a02-aba9-aa87d720f889-serving-cert\") pod \"controller-manager-95b665d9-4fsv4\" (UID: \"4b137ba1-1e02-4a02-aba9-aa87d720f889\") " pod="openshift-controller-manager/controller-manager-95b665d9-4fsv4"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.520772 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn6dr\" (UniqueName: \"kubernetes.io/projected/4b137ba1-1e02-4a02-aba9-aa87d720f889-kube-api-access-cn6dr\") pod \"controller-manager-95b665d9-4fsv4\" (UID: \"4b137ba1-1e02-4a02-aba9-aa87d720f889\") " pod="openshift-controller-manager/controller-manager-95b665d9-4fsv4"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.524588 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvjkz\" (UniqueName: \"kubernetes.io/projected/98bc7969-bc68-4339-b02b-75c295a815a9-kube-api-access-fvjkz\") pod \"route-controller-manager-6ccdd5cd9-zlzvg\" (UID: \"98bc7969-bc68-4339-b02b-75c295a815a9\") " pod="openshift-route-controller-manager/route-controller-manager-6ccdd5cd9-zlzvg"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.644329 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-95b665d9-4fsv4"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.658588 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6ccdd5cd9-zlzvg"
Dec 05 11:54:10 crc kubenswrapper[4763]: I1205 11:54:10.919251 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-95b665d9-4fsv4"]
Dec 05 11:54:11 crc kubenswrapper[4763]: I1205 11:54:11.019851 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6ccdd5cd9-zlzvg"]
Dec 05 11:54:11 crc kubenswrapper[4763]: I1205 11:54:11.060070 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-95b665d9-4fsv4" event={"ID":"4b137ba1-1e02-4a02-aba9-aa87d720f889","Type":"ContainerStarted","Data":"9930960c456483e679eb61104fc9f67385bc4f6f4138688d60b263fd24b6dae5"}
Dec 05 11:54:11 crc kubenswrapper[4763]: I1205 11:54:11.067068 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6ccdd5cd9-zlzvg" event={"ID":"98bc7969-bc68-4339-b02b-75c295a815a9","Type":"ContainerStarted","Data":"f682ee9fca9c5694a4358a9e75b9da5b306997d574ca18c8bbe9a850d886d187"}
Dec 05 11:54:11 crc kubenswrapper[4763]: I1205 11:54:11.319707 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6ccdd5cd9-zlzvg"]
Dec 05 11:54:11 crc kubenswrapper[4763]: I1205 11:54:11.790978 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96fb350e-8c37-4e8d-8233-8c3ecfac9935" path="/var/lib/kubelet/pods/96fb350e-8c37-4e8d-8233-8c3ecfac9935/volumes"
Dec 05 11:54:11 crc kubenswrapper[4763]: I1205 11:54:11.791752 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88" path="/var/lib/kubelet/pods/ee6a7a6d-7627-4ae7-a0bd-bdcdc660da88/volumes"
Dec 05 11:54:12 crc kubenswrapper[4763]: I1205 11:54:12.074595 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-95b665d9-4fsv4" event={"ID":"4b137ba1-1e02-4a02-aba9-aa87d720f889","Type":"ContainerStarted","Data":"6a3beb63abe99da83b84bd94c0ad349a8ee18c00644a656720787dba6a8d86d4"}
Dec 05 11:54:12 crc kubenswrapper[4763]: I1205 11:54:12.075427 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-95b665d9-4fsv4"
Dec 05 11:54:12 crc kubenswrapper[4763]: I1205 11:54:12.077963 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6ccdd5cd9-zlzvg" event={"ID":"98bc7969-bc68-4339-b02b-75c295a815a9","Type":"ContainerStarted","Data":"cb8ba1d2508d7edc4565dc0c03e9808207a93009b0fdb3d677386bc194a40a28"}
Dec 05 11:54:12 crc kubenswrapper[4763]: I1205 11:54:12.078815 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6ccdd5cd9-zlzvg"
Dec 05 11:54:12 crc kubenswrapper[4763]: I1205 11:54:12.080926 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-95b665d9-4fsv4"
Dec 05 11:54:12 crc kubenswrapper[4763]: I1205 11:54:12.084058 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6ccdd5cd9-zlzvg"
Dec 05 11:54:12 crc kubenswrapper[4763]: I1205 11:54:12.092934 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-95b665d9-4fsv4" podStartSLOduration=4.092912359 podStartE2EDuration="4.092912359s" podCreationTimestamp="2025-12-05 11:54:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:54:12.089303164 +0000 UTC m=+336.582017887" watchObservedRunningTime="2025-12-05 11:54:12.092912359 +0000 UTC m=+336.585627082"
Dec 05 11:54:12 crc kubenswrapper[4763]: I1205 11:54:12.143720 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6ccdd5cd9-zlzvg" podStartSLOduration=4.143700526 podStartE2EDuration="4.143700526s" podCreationTimestamp="2025-12-05 11:54:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:54:12.142486191 +0000 UTC m=+336.635200914" watchObservedRunningTime="2025-12-05 11:54:12.143700526 +0000 UTC m=+336.636415249"
Dec 05 11:54:13 crc kubenswrapper[4763]: I1205 11:54:13.082581 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6ccdd5cd9-zlzvg" podUID="98bc7969-bc68-4339-b02b-75c295a815a9" containerName="route-controller-manager" containerID="cri-o://cb8ba1d2508d7edc4565dc0c03e9808207a93009b0fdb3d677386bc194a40a28" gracePeriod=30
Dec 05 11:54:13 crc kubenswrapper[4763]: I1205 11:54:13.456618 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6ccdd5cd9-zlzvg"
Dec 05 11:54:13 crc kubenswrapper[4763]: I1205 11:54:13.489370 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64854f8b9-fz2hn"]
Dec 05 11:54:13 crc kubenswrapper[4763]: E1205 11:54:13.489572 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98bc7969-bc68-4339-b02b-75c295a815a9" containerName="route-controller-manager"
Dec 05 11:54:13 crc kubenswrapper[4763]: I1205 11:54:13.489584 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="98bc7969-bc68-4339-b02b-75c295a815a9" containerName="route-controller-manager"
Dec 05 11:54:13 crc kubenswrapper[4763]: I1205 11:54:13.489670 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="98bc7969-bc68-4339-b02b-75c295a815a9" containerName="route-controller-manager"
Dec 05 11:54:13 crc kubenswrapper[4763]: I1205 11:54:13.490053 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-64854f8b9-fz2hn"
Dec 05 11:54:13 crc kubenswrapper[4763]: I1205 11:54:13.509893 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64854f8b9-fz2hn"]
Dec 05 11:54:13 crc kubenswrapper[4763]: I1205 11:54:13.544110 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98bc7969-bc68-4339-b02b-75c295a815a9-config\") pod \"98bc7969-bc68-4339-b02b-75c295a815a9\" (UID: \"98bc7969-bc68-4339-b02b-75c295a815a9\") "
Dec 05 11:54:13 crc kubenswrapper[4763]: I1205 11:54:13.544192 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvjkz\" (UniqueName: \"kubernetes.io/projected/98bc7969-bc68-4339-b02b-75c295a815a9-kube-api-access-fvjkz\") pod \"98bc7969-bc68-4339-b02b-75c295a815a9\" (UID: \"98bc7969-bc68-4339-b02b-75c295a815a9\") "
Dec 05 11:54:13 crc kubenswrapper[4763]: I1205 11:54:13.544228 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98bc7969-bc68-4339-b02b-75c295a815a9-serving-cert\") pod \"98bc7969-bc68-4339-b02b-75c295a815a9\" (UID: \"98bc7969-bc68-4339-b02b-75c295a815a9\") "
Dec 05 11:54:13 crc kubenswrapper[4763]: I1205 11:54:13.544281 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98bc7969-bc68-4339-b02b-75c295a815a9-client-ca\") pod \"98bc7969-bc68-4339-b02b-75c295a815a9\" (UID: \"98bc7969-bc68-4339-b02b-75c295a815a9\") "
Dec 05 11:54:13 crc kubenswrapper[4763]: I1205 11:54:13.544513 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7hf5\" (UniqueName: \"kubernetes.io/projected/2bcd1905-80e7-4bf6-b8da-b353b1d27e20-kube-api-access-j7hf5\") pod \"route-controller-manager-64854f8b9-fz2hn\" (UID: \"2bcd1905-80e7-4bf6-b8da-b353b1d27e20\") " pod="openshift-route-controller-manager/route-controller-manager-64854f8b9-fz2hn"
Dec 05 11:54:13 crc kubenswrapper[4763]: I1205 11:54:13.544570 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bcd1905-80e7-4bf6-b8da-b353b1d27e20-config\") pod \"route-controller-manager-64854f8b9-fz2hn\" (UID: \"2bcd1905-80e7-4bf6-b8da-b353b1d27e20\") " pod="openshift-route-controller-manager/route-controller-manager-64854f8b9-fz2hn"
Dec 05 11:54:13 crc kubenswrapper[4763]: I1205 11:54:13.544615 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2bcd1905-80e7-4bf6-b8da-b353b1d27e20-client-ca\") pod \"route-controller-manager-64854f8b9-fz2hn\" (UID: \"2bcd1905-80e7-4bf6-b8da-b353b1d27e20\") " pod="openshift-route-controller-manager/route-controller-manager-64854f8b9-fz2hn"
Dec 05 11:54:13 crc kubenswrapper[4763]: I1205 11:54:13.544772 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2bcd1905-80e7-4bf6-b8da-b353b1d27e20-serving-cert\") pod \"route-controller-manager-64854f8b9-fz2hn\" (UID: \"2bcd1905-80e7-4bf6-b8da-b353b1d27e20\") " pod="openshift-route-controller-manager/route-controller-manager-64854f8b9-fz2hn"
Dec 05 11:54:13 crc kubenswrapper[4763]: I1205 11:54:13.545168 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98bc7969-bc68-4339-b02b-75c295a815a9-client-ca" (OuterVolumeSpecName: "client-ca") pod "98bc7969-bc68-4339-b02b-75c295a815a9" (UID: "98bc7969-bc68-4339-b02b-75c295a815a9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 11:54:13 crc kubenswrapper[4763]: I1205 11:54:13.545355 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98bc7969-bc68-4339-b02b-75c295a815a9-config" (OuterVolumeSpecName: "config") pod "98bc7969-bc68-4339-b02b-75c295a815a9" (UID: "98bc7969-bc68-4339-b02b-75c295a815a9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 11:54:13 crc kubenswrapper[4763]: I1205 11:54:13.549736 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98bc7969-bc68-4339-b02b-75c295a815a9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "98bc7969-bc68-4339-b02b-75c295a815a9" (UID: "98bc7969-bc68-4339-b02b-75c295a815a9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 11:54:13 crc kubenswrapper[4763]: I1205 11:54:13.550062 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98bc7969-bc68-4339-b02b-75c295a815a9-kube-api-access-fvjkz" (OuterVolumeSpecName: "kube-api-access-fvjkz") pod "98bc7969-bc68-4339-b02b-75c295a815a9" (UID: "98bc7969-bc68-4339-b02b-75c295a815a9"). InnerVolumeSpecName "kube-api-access-fvjkz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 11:54:13 crc kubenswrapper[4763]: I1205 11:54:13.646662 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2bcd1905-80e7-4bf6-b8da-b353b1d27e20-client-ca\") pod \"route-controller-manager-64854f8b9-fz2hn\" (UID: \"2bcd1905-80e7-4bf6-b8da-b353b1d27e20\") " pod="openshift-route-controller-manager/route-controller-manager-64854f8b9-fz2hn"
Dec 05 11:54:13 crc kubenswrapper[4763]: I1205 11:54:13.647436 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2bcd1905-80e7-4bf6-b8da-b353b1d27e20-serving-cert\") pod \"route-controller-manager-64854f8b9-fz2hn\" (UID: \"2bcd1905-80e7-4bf6-b8da-b353b1d27e20\") " pod="openshift-route-controller-manager/route-controller-manager-64854f8b9-fz2hn"
Dec 05 11:54:13 crc kubenswrapper[4763]: I1205 11:54:13.647522 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7hf5\" (UniqueName: \"kubernetes.io/projected/2bcd1905-80e7-4bf6-b8da-b353b1d27e20-kube-api-access-j7hf5\") pod \"route-controller-manager-64854f8b9-fz2hn\" (UID: \"2bcd1905-80e7-4bf6-b8da-b353b1d27e20\") " pod="openshift-route-controller-manager/route-controller-manager-64854f8b9-fz2hn"
Dec 05 11:54:13 crc kubenswrapper[4763]: I1205 11:54:13.647575 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bcd1905-80e7-4bf6-b8da-b353b1d27e20-config\") pod \"route-controller-manager-64854f8b9-fz2hn\" (UID: \"2bcd1905-80e7-4bf6-b8da-b353b1d27e20\") " pod="openshift-route-controller-manager/route-controller-manager-64854f8b9-fz2hn"
Dec 05 11:54:13 crc kubenswrapper[4763]: I1205 11:54:13.647657 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98bc7969-bc68-4339-b02b-75c295a815a9-config\") on node \"crc\" DevicePath \"\""
Dec 05 11:54:13 crc kubenswrapper[4763]: I1205 11:54:13.647675 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvjkz\" (UniqueName: \"kubernetes.io/projected/98bc7969-bc68-4339-b02b-75c295a815a9-kube-api-access-fvjkz\") on node \"crc\" DevicePath \"\""
Dec 05 11:54:13 crc kubenswrapper[4763]: I1205 11:54:13.647687 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98bc7969-bc68-4339-b02b-75c295a815a9-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 05 11:54:13 crc kubenswrapper[4763]: I1205 11:54:13.647699 4763 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98bc7969-bc68-4339-b02b-75c295a815a9-client-ca\") on node \"crc\" DevicePath \"\""
Dec 05 11:54:13 crc kubenswrapper[4763]: I1205 11:54:13.648501 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2bcd1905-80e7-4bf6-b8da-b353b1d27e20-client-ca\") pod \"route-controller-manager-64854f8b9-fz2hn\" (UID: \"2bcd1905-80e7-4bf6-b8da-b353b1d27e20\") " pod="openshift-route-controller-manager/route-controller-manager-64854f8b9-fz2hn"
Dec 05 11:54:13 crc kubenswrapper[4763]: I1205 11:54:13.648974 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bcd1905-80e7-4bf6-b8da-b353b1d27e20-config\") pod \"route-controller-manager-64854f8b9-fz2hn\" (UID: \"2bcd1905-80e7-4bf6-b8da-b353b1d27e20\") " pod="openshift-route-controller-manager/route-controller-manager-64854f8b9-fz2hn"
Dec 05 11:54:13 crc kubenswrapper[4763]: I1205 11:54:13.651810 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2bcd1905-80e7-4bf6-b8da-b353b1d27e20-serving-cert\") pod \"route-controller-manager-64854f8b9-fz2hn\" (UID: \"2bcd1905-80e7-4bf6-b8da-b353b1d27e20\") " pod="openshift-route-controller-manager/route-controller-manager-64854f8b9-fz2hn"
Dec 05 11:54:13 crc kubenswrapper[4763]: I1205 11:54:13.671047 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7hf5\" (UniqueName: \"kubernetes.io/projected/2bcd1905-80e7-4bf6-b8da-b353b1d27e20-kube-api-access-j7hf5\") pod \"route-controller-manager-64854f8b9-fz2hn\" (UID: \"2bcd1905-80e7-4bf6-b8da-b353b1d27e20\") " pod="openshift-route-controller-manager/route-controller-manager-64854f8b9-fz2hn"
Dec 05 11:54:13 crc kubenswrapper[4763]: I1205 11:54:13.819008 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-64854f8b9-fz2hn"
Dec 05 11:54:14 crc kubenswrapper[4763]: I1205 11:54:14.089610 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6ccdd5cd9-zlzvg"
Dec 05 11:54:14 crc kubenswrapper[4763]: I1205 11:54:14.089551 4763 generic.go:334] "Generic (PLEG): container finished" podID="98bc7969-bc68-4339-b02b-75c295a815a9" containerID="cb8ba1d2508d7edc4565dc0c03e9808207a93009b0fdb3d677386bc194a40a28" exitCode=0
Dec 05 11:54:14 crc kubenswrapper[4763]: I1205 11:54:14.089633 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6ccdd5cd9-zlzvg" event={"ID":"98bc7969-bc68-4339-b02b-75c295a815a9","Type":"ContainerDied","Data":"cb8ba1d2508d7edc4565dc0c03e9808207a93009b0fdb3d677386bc194a40a28"}
Dec 05 11:54:14 crc kubenswrapper[4763]: I1205 11:54:14.090107 4763 scope.go:117] "RemoveContainer" containerID="cb8ba1d2508d7edc4565dc0c03e9808207a93009b0fdb3d677386bc194a40a28"
Dec 05 11:54:14 crc kubenswrapper[4763]: I1205 11:54:14.090222 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6ccdd5cd9-zlzvg" event={"ID":"98bc7969-bc68-4339-b02b-75c295a815a9","Type":"ContainerDied","Data":"f682ee9fca9c5694a4358a9e75b9da5b306997d574ca18c8bbe9a850d886d187"}
Dec 05 11:54:14 crc kubenswrapper[4763]: I1205 11:54:14.112463 4763 scope.go:117] "RemoveContainer" containerID="cb8ba1d2508d7edc4565dc0c03e9808207a93009b0fdb3d677386bc194a40a28"
Dec 05 11:54:14 crc kubenswrapper[4763]: E1205 11:54:14.112916 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb8ba1d2508d7edc4565dc0c03e9808207a93009b0fdb3d677386bc194a40a28\": container with ID starting with cb8ba1d2508d7edc4565dc0c03e9808207a93009b0fdb3d677386bc194a40a28 not found: ID does not exist" containerID="cb8ba1d2508d7edc4565dc0c03e9808207a93009b0fdb3d677386bc194a40a28"
Dec 05 11:54:14 crc kubenswrapper[4763]: I1205 11:54:14.112977 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb8ba1d2508d7edc4565dc0c03e9808207a93009b0fdb3d677386bc194a40a28"} err="failed to get container status \"cb8ba1d2508d7edc4565dc0c03e9808207a93009b0fdb3d677386bc194a40a28\": rpc error: code = NotFound desc = could not find container \"cb8ba1d2508d7edc4565dc0c03e9808207a93009b0fdb3d677386bc194a40a28\": container with ID starting with cb8ba1d2508d7edc4565dc0c03e9808207a93009b0fdb3d677386bc194a40a28 not found: ID does not exist"
Dec 05 11:54:14 crc kubenswrapper[4763]: I1205 11:54:14.128695 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6ccdd5cd9-zlzvg"]
Dec 05 11:54:14 crc kubenswrapper[4763]: I1205 11:54:14.131460 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6ccdd5cd9-zlzvg"]
Dec 05 11:54:14 crc kubenswrapper[4763]: I1205 11:54:14.249974 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64854f8b9-fz2hn"]
Dec 05 11:54:15 crc kubenswrapper[4763]: I1205 11:54:15.100332 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-64854f8b9-fz2hn" event={"ID":"2bcd1905-80e7-4bf6-b8da-b353b1d27e20","Type":"ContainerStarted","Data":"7cbd8c1f50df021687281d3f9ad5ea1f0e888ab1251ee7f82158461a9c1bd7f9"}
Dec 05 11:54:15 crc kubenswrapper[4763]: I1205 11:54:15.100944 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-64854f8b9-fz2hn" event={"ID":"2bcd1905-80e7-4bf6-b8da-b353b1d27e20","Type":"ContainerStarted","Data":"925063450d59d745d1678d64bc7819f03867ed3721cff301592ee69d2880f8a6"}
Dec 05 11:54:15 crc kubenswrapper[4763]: I1205 11:54:15.104654 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-64854f8b9-fz2hn"
Dec 05 11:54:15 crc kubenswrapper[4763]: I1205 11:54:15.108971 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-64854f8b9-fz2hn"
Dec 05 11:54:15 crc kubenswrapper[4763]: I1205 11:54:15.134292 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-64854f8b9-fz2hn" podStartSLOduration=4.134261859 podStartE2EDuration="4.134261859s" podCreationTimestamp="2025-12-05 11:54:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:54:15.128156902 +0000 UTC m=+339.620871625" watchObservedRunningTime="2025-12-05 11:54:15.134261859 +0000 UTC m=+339.626976582"
Dec 05 11:54:15 crc kubenswrapper[4763]: I1205 11:54:15.790271 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98bc7969-bc68-4339-b02b-75c295a815a9" path="/var/lib/kubelet/pods/98bc7969-bc68-4339-b02b-75c295a815a9/volumes"
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.020850 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lpk7r"]
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.021888 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lpk7r" podUID="e37aacbd-2b65-4fe9-9874-38a7c585a300" containerName="registry-server" containerID="cri-o://430864b9a9a363dc8a1d754b411529fb351e404d3af3aec6937fd7e1023ba29a" gracePeriod=30
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.041855 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lzlbl"]
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.042202 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lzlbl" podUID="ea6370e4-7a14-43a3-8ab0-c966df3c3e74" containerName="registry-server" containerID="cri-o://fcaa1f2b8e088e42ec6dbace61f84ab7bddb5ff1af5b329d84d7c0846b2c6f81" gracePeriod=30
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.046435 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hjvs8"]
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.046665 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-hjvs8" podUID="585f1b1b-1d55-4c5d-be08-5af770eec641" containerName="marketplace-operator" containerID="cri-o://c1e69145d090fdc2ba457564c49913ef9b0bf94df797ab7c83bfe7fdbe226cfa" gracePeriod=30
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.061154 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rtp7g"]
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.061476 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rtp7g" podUID="84b15a6f-ad11-4681-be03-86c7a7f84320" containerName="registry-server" containerID="cri-o://e3b98e7c38caf55ad174a09c8d97ae3dd926cf621355f19059c1540c9dabd457" gracePeriod=30
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.070737 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nzfbn"]
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.071554 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nzfbn" podUID="61dde3bf-99ba-4d4f-bbbd-91ea145ac314" containerName="registry-server" containerID="cri-o://0294e4467f92722eac9e0129a2edcc0ee8b941472252d5cf9a77461483160b88" gracePeriod=30
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.103913 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cqpn4"]
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.105248 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cqpn4"
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.116222 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cqpn4"]
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.140413 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/certified-operators-lpk7r" podUID="e37aacbd-2b65-4fe9-9874-38a7c585a300" containerName="registry-server" probeResult="failure" output=""
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.147610 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/certified-operators-lpk7r" podUID="e37aacbd-2b65-4fe9-9874-38a7c585a300" containerName="registry-server" probeResult="failure" output=""
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.225023 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8fc5438b-109a-4bf8-97a6-d5c49edbc395-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cqpn4\" (UID: \"8fc5438b-109a-4bf8-97a6-d5c49edbc395\") " pod="openshift-marketplace/marketplace-operator-79b997595-cqpn4"
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.225548 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8fc5438b-109a-4bf8-97a6-d5c49edbc395-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cqpn4\" (UID: \"8fc5438b-109a-4bf8-97a6-d5c49edbc395\") " pod="openshift-marketplace/marketplace-operator-79b997595-cqpn4"
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.225587 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzbrb\" (UniqueName: \"kubernetes.io/projected/8fc5438b-109a-4bf8-97a6-d5c49edbc395-kube-api-access-xzbrb\") pod \"marketplace-operator-79b997595-cqpn4\" (UID: \"8fc5438b-109a-4bf8-97a6-d5c49edbc395\") " pod="openshift-marketplace/marketplace-operator-79b997595-cqpn4"
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.326711 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8fc5438b-109a-4bf8-97a6-d5c49edbc395-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cqpn4\" (UID: \"8fc5438b-109a-4bf8-97a6-d5c49edbc395\") " pod="openshift-marketplace/marketplace-operator-79b997595-cqpn4"
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.326794 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8fc5438b-109a-4bf8-97a6-d5c49edbc395-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cqpn4\" (UID: \"8fc5438b-109a-4bf8-97a6-d5c49edbc395\") " pod="openshift-marketplace/marketplace-operator-79b997595-cqpn4"
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.326823 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzbrb\" (UniqueName: \"kubernetes.io/projected/8fc5438b-109a-4bf8-97a6-d5c49edbc395-kube-api-access-xzbrb\") pod \"marketplace-operator-79b997595-cqpn4\" (UID: \"8fc5438b-109a-4bf8-97a6-d5c49edbc395\") " pod="openshift-marketplace/marketplace-operator-79b997595-cqpn4"
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.328375 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8fc5438b-109a-4bf8-97a6-d5c49edbc395-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cqpn4\" (UID: \"8fc5438b-109a-4bf8-97a6-d5c49edbc395\") " pod="openshift-marketplace/marketplace-operator-79b997595-cqpn4"
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.335274 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8fc5438b-109a-4bf8-97a6-d5c49edbc395-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cqpn4\" (UID: \"8fc5438b-109a-4bf8-97a6-d5c49edbc395\") " pod="openshift-marketplace/marketplace-operator-79b997595-cqpn4"
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.348111 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzbrb\" (UniqueName: \"kubernetes.io/projected/8fc5438b-109a-4bf8-97a6-d5c49edbc395-kube-api-access-xzbrb\") pod \"marketplace-operator-79b997595-cqpn4\" (UID: \"8fc5438b-109a-4bf8-97a6-d5c49edbc395\") " pod="openshift-marketplace/marketplace-operator-79b997595-cqpn4"
Dec 05 11:54:19 crc kubenswrapper[4763]: E1205 11:54:19.442288 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod585f1b1b_1d55_4c5d_be08_5af770eec641.slice/crio-c1e69145d090fdc2ba457564c49913ef9b0bf94df797ab7c83bfe7fdbe226cfa.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea6370e4_7a14_43a3_8ab0_c966df3c3e74.slice/crio-conmon-fcaa1f2b8e088e42ec6dbace61f84ab7bddb5ff1af5b329d84d7c0846b2c6f81.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod585f1b1b_1d55_4c5d_be08_5af770eec641.slice/crio-conmon-c1e69145d090fdc2ba457564c49913ef9b0bf94df797ab7c83bfe7fdbe226cfa.scope\": RecentStats: unable to find data in memory cache]"
Dec 05 11:54:19 crc kubenswrapper[4763]: E1205 11:54:19.515627 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fcaa1f2b8e088e42ec6dbace61f84ab7bddb5ff1af5b329d84d7c0846b2c6f81 is running failed: container process not found" containerID="fcaa1f2b8e088e42ec6dbace61f84ab7bddb5ff1af5b329d84d7c0846b2c6f81" cmd=["grpc_health_probe","-addr=:50051"]
Dec 05 11:54:19 crc kubenswrapper[4763]: E1205 11:54:19.516192 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fcaa1f2b8e088e42ec6dbace61f84ab7bddb5ff1af5b329d84d7c0846b2c6f81 is running failed: container process not found" containerID="fcaa1f2b8e088e42ec6dbace61f84ab7bddb5ff1af5b329d84d7c0846b2c6f81" cmd=["grpc_health_probe","-addr=:50051"]
Dec 05 11:54:19 crc kubenswrapper[4763]: E1205 11:54:19.516512 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fcaa1f2b8e088e42ec6dbace61f84ab7bddb5ff1af5b329d84d7c0846b2c6f81 is running failed: container process not found" containerID="fcaa1f2b8e088e42ec6dbace61f84ab7bddb5ff1af5b329d84d7c0846b2c6f81" cmd=["grpc_health_probe","-addr=:50051"]
Dec 05 11:54:19 crc kubenswrapper[4763]: E1205 11:54:19.516580 4763 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fcaa1f2b8e088e42ec6dbace61f84ab7bddb5ff1af5b329d84d7c0846b2c6f81 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-lzlbl" podUID="ea6370e4-7a14-43a3-8ab0-c966df3c3e74" containerName="registry-server"
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.534939 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cqpn4"
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.557063 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rtp7g"
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.640936 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84b15a6f-ad11-4681-be03-86c7a7f84320-utilities\") pod \"84b15a6f-ad11-4681-be03-86c7a7f84320\" (UID: \"84b15a6f-ad11-4681-be03-86c7a7f84320\") "
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.641337 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84b15a6f-ad11-4681-be03-86c7a7f84320-catalog-content\") pod \"84b15a6f-ad11-4681-be03-86c7a7f84320\" (UID: \"84b15a6f-ad11-4681-be03-86c7a7f84320\") "
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.641477 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hh56\" (UniqueName: \"kubernetes.io/projected/84b15a6f-ad11-4681-be03-86c7a7f84320-kube-api-access-4hh56\") pod \"84b15a6f-ad11-4681-be03-86c7a7f84320\" (UID: \"84b15a6f-ad11-4681-be03-86c7a7f84320\") "
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.643448 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84b15a6f-ad11-4681-be03-86c7a7f84320-utilities" (OuterVolumeSpecName: "utilities") pod "84b15a6f-ad11-4681-be03-86c7a7f84320" (UID: "84b15a6f-ad11-4681-be03-86c7a7f84320"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.647170 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84b15a6f-ad11-4681-be03-86c7a7f84320-kube-api-access-4hh56" (OuterVolumeSpecName: "kube-api-access-4hh56") pod "84b15a6f-ad11-4681-be03-86c7a7f84320" (UID: "84b15a6f-ad11-4681-be03-86c7a7f84320"). InnerVolumeSpecName "kube-api-access-4hh56". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.662733 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84b15a6f-ad11-4681-be03-86c7a7f84320-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "84b15a6f-ad11-4681-be03-86c7a7f84320" (UID: "84b15a6f-ad11-4681-be03-86c7a7f84320"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.744053 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84b15a6f-ad11-4681-be03-86c7a7f84320-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.744092 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hh56\" (UniqueName: \"kubernetes.io/projected/84b15a6f-ad11-4681-be03-86c7a7f84320-kube-api-access-4hh56\") on node \"crc\" DevicePath \"\""
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.744120 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84b15a6f-ad11-4681-be03-86c7a7f84320-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.775816 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hjvs8"
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.791995 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lzlbl"
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.811144 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nzfbn"
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.825697 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lpk7r"
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.844831 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ll8h\" (UniqueName: \"kubernetes.io/projected/61dde3bf-99ba-4d4f-bbbd-91ea145ac314-kube-api-access-7ll8h\") pod \"61dde3bf-99ba-4d4f-bbbd-91ea145ac314\" (UID: \"61dde3bf-99ba-4d4f-bbbd-91ea145ac314\") "
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.845017 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea6370e4-7a14-43a3-8ab0-c966df3c3e74-catalog-content\") pod \"ea6370e4-7a14-43a3-8ab0-c966df3c3e74\" (UID: \"ea6370e4-7a14-43a3-8ab0-c966df3c3e74\") "
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.845140 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/585f1b1b-1d55-4c5d-be08-5af770eec641-marketplace-operator-metrics\") pod \"585f1b1b-1d55-4c5d-be08-5af770eec641\" (UID: \"585f1b1b-1d55-4c5d-be08-5af770eec641\") "
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.845382 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea6370e4-7a14-43a3-8ab0-c966df3c3e74-utilities\") pod \"ea6370e4-7a14-43a3-8ab0-c966df3c3e74\" (UID: \"ea6370e4-7a14-43a3-8ab0-c966df3c3e74\") "
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.845496 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61dde3bf-99ba-4d4f-bbbd-91ea145ac314-catalog-content\") pod \"61dde3bf-99ba-4d4f-bbbd-91ea145ac314\" (UID: \"61dde3bf-99ba-4d4f-bbbd-91ea145ac314\") "
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.845602 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnvw9\" (UniqueName: \"kubernetes.io/projected/585f1b1b-1d55-4c5d-be08-5af770eec641-kube-api-access-jnvw9\") pod \"585f1b1b-1d55-4c5d-be08-5af770eec641\" (UID: \"585f1b1b-1d55-4c5d-be08-5af770eec641\") "
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.845720 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwgtm\" (UniqueName: \"kubernetes.io/projected/e37aacbd-2b65-4fe9-9874-38a7c585a300-kube-api-access-wwgtm\") pod \"e37aacbd-2b65-4fe9-9874-38a7c585a300\" (UID: \"e37aacbd-2b65-4fe9-9874-38a7c585a300\") "
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.845910 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e37aacbd-2b65-4fe9-9874-38a7c585a300-catalog-content\") pod \"e37aacbd-2b65-4fe9-9874-38a7c585a300\" (UID: \"e37aacbd-2b65-4fe9-9874-38a7c585a300\") "
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.846026 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/585f1b1b-1d55-4c5d-be08-5af770eec641-marketplace-trusted-ca\") pod \"585f1b1b-1d55-4c5d-be08-5af770eec641\" (UID: \"585f1b1b-1d55-4c5d-be08-5af770eec641\") "
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.846155 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws9t9\" (UniqueName: \"kubernetes.io/projected/ea6370e4-7a14-43a3-8ab0-c966df3c3e74-kube-api-access-ws9t9\") pod \"ea6370e4-7a14-43a3-8ab0-c966df3c3e74\" (UID: \"ea6370e4-7a14-43a3-8ab0-c966df3c3e74\") "
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.846285 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61dde3bf-99ba-4d4f-bbbd-91ea145ac314-utilities\") pod \"61dde3bf-99ba-4d4f-bbbd-91ea145ac314\" (UID: \"61dde3bf-99ba-4d4f-bbbd-91ea145ac314\") "
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.846401 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e37aacbd-2b65-4fe9-9874-38a7c585a300-utilities\") pod \"e37aacbd-2b65-4fe9-9874-38a7c585a300\" (UID: \"e37aacbd-2b65-4fe9-9874-38a7c585a300\") "
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.851951 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61dde3bf-99ba-4d4f-bbbd-91ea145ac314-kube-api-access-7ll8h" (OuterVolumeSpecName: "kube-api-access-7ll8h") pod "61dde3bf-99ba-4d4f-bbbd-91ea145ac314" (UID: "61dde3bf-99ba-4d4f-bbbd-91ea145ac314"). InnerVolumeSpecName "kube-api-access-7ll8h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.853261 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e37aacbd-2b65-4fe9-9874-38a7c585a300-utilities" (OuterVolumeSpecName: "utilities") pod "e37aacbd-2b65-4fe9-9874-38a7c585a300" (UID: "e37aacbd-2b65-4fe9-9874-38a7c585a300"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.854879 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/585f1b1b-1d55-4c5d-be08-5af770eec641-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "585f1b1b-1d55-4c5d-be08-5af770eec641" (UID: "585f1b1b-1d55-4c5d-be08-5af770eec641"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.855358 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/585f1b1b-1d55-4c5d-be08-5af770eec641-kube-api-access-jnvw9" (OuterVolumeSpecName: "kube-api-access-jnvw9") pod "585f1b1b-1d55-4c5d-be08-5af770eec641" (UID: "585f1b1b-1d55-4c5d-be08-5af770eec641"). InnerVolumeSpecName "kube-api-access-jnvw9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.856733 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea6370e4-7a14-43a3-8ab0-c966df3c3e74-utilities" (OuterVolumeSpecName: "utilities") pod "ea6370e4-7a14-43a3-8ab0-c966df3c3e74" (UID: "ea6370e4-7a14-43a3-8ab0-c966df3c3e74"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.858448 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/585f1b1b-1d55-4c5d-be08-5af770eec641-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "585f1b1b-1d55-4c5d-be08-5af770eec641" (UID: "585f1b1b-1d55-4c5d-be08-5af770eec641"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.859113 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e37aacbd-2b65-4fe9-9874-38a7c585a300-kube-api-access-wwgtm" (OuterVolumeSpecName: "kube-api-access-wwgtm") pod "e37aacbd-2b65-4fe9-9874-38a7c585a300" (UID: "e37aacbd-2b65-4fe9-9874-38a7c585a300"). InnerVolumeSpecName "kube-api-access-wwgtm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.859168 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61dde3bf-99ba-4d4f-bbbd-91ea145ac314-utilities" (OuterVolumeSpecName: "utilities") pod "61dde3bf-99ba-4d4f-bbbd-91ea145ac314" (UID: "61dde3bf-99ba-4d4f-bbbd-91ea145ac314"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.859984 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea6370e4-7a14-43a3-8ab0-c966df3c3e74-kube-api-access-ws9t9" (OuterVolumeSpecName: "kube-api-access-ws9t9") pod "ea6370e4-7a14-43a3-8ab0-c966df3c3e74" (UID: "ea6370e4-7a14-43a3-8ab0-c966df3c3e74"). InnerVolumeSpecName "kube-api-access-ws9t9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.925453 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea6370e4-7a14-43a3-8ab0-c966df3c3e74-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ea6370e4-7a14-43a3-8ab0-c966df3c3e74" (UID: "ea6370e4-7a14-43a3-8ab0-c966df3c3e74"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.947274 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e37aacbd-2b65-4fe9-9874-38a7c585a300-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e37aacbd-2b65-4fe9-9874-38a7c585a300" (UID: "e37aacbd-2b65-4fe9-9874-38a7c585a300"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.948541 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ws9t9\" (UniqueName: \"kubernetes.io/projected/ea6370e4-7a14-43a3-8ab0-c966df3c3e74-kube-api-access-ws9t9\") on node \"crc\" DevicePath \"\""
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.948587 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61dde3bf-99ba-4d4f-bbbd-91ea145ac314-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.948603 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e37aacbd-2b65-4fe9-9874-38a7c585a300-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.948616 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ll8h\" (UniqueName: \"kubernetes.io/projected/61dde3bf-99ba-4d4f-bbbd-91ea145ac314-kube-api-access-7ll8h\") on node \"crc\" DevicePath \"\""
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.948628 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea6370e4-7a14-43a3-8ab0-c966df3c3e74-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.948643 4763 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/585f1b1b-1d55-4c5d-be08-5af770eec641-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.948658 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea6370e4-7a14-43a3-8ab0-c966df3c3e74-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.948673 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnvw9\" (UniqueName: \"kubernetes.io/projected/585f1b1b-1d55-4c5d-be08-5af770eec641-kube-api-access-jnvw9\") on node \"crc\" DevicePath \"\""
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.948683 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwgtm\" (UniqueName: \"kubernetes.io/projected/e37aacbd-2b65-4fe9-9874-38a7c585a300-kube-api-access-wwgtm\") on node \"crc\" DevicePath \"\""
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.948693 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e37aacbd-2b65-4fe9-9874-38a7c585a300-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 11:54:19 crc kubenswrapper[4763]: I1205 11:54:19.948704 4763 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/585f1b1b-1d55-4c5d-be08-5af770eec641-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.024044 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61dde3bf-99ba-4d4f-bbbd-91ea145ac314-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "61dde3bf-99ba-4d4f-bbbd-91ea145ac314" (UID: "61dde3bf-99ba-4d4f-bbbd-91ea145ac314"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.049469 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61dde3bf-99ba-4d4f-bbbd-91ea145ac314-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.135687 4763 generic.go:334] "Generic (PLEG): container finished" podID="61dde3bf-99ba-4d4f-bbbd-91ea145ac314" containerID="0294e4467f92722eac9e0129a2edcc0ee8b941472252d5cf9a77461483160b88" exitCode=0
Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.135749 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzfbn" event={"ID":"61dde3bf-99ba-4d4f-bbbd-91ea145ac314","Type":"ContainerDied","Data":"0294e4467f92722eac9e0129a2edcc0ee8b941472252d5cf9a77461483160b88"}
Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.135812 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzfbn" event={"ID":"61dde3bf-99ba-4d4f-bbbd-91ea145ac314","Type":"ContainerDied","Data":"09c901db8d4892fbd4289922c0d1afca932f44651ee1f84b9ba9ded3883af877"}
Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.135834 4763 scope.go:117] "RemoveContainer" containerID="0294e4467f92722eac9e0129a2edcc0ee8b941472252d5cf9a77461483160b88"
Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.135937 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nzfbn"
Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.152693 4763 generic.go:334] "Generic (PLEG): container finished" podID="ea6370e4-7a14-43a3-8ab0-c966df3c3e74" containerID="fcaa1f2b8e088e42ec6dbace61f84ab7bddb5ff1af5b329d84d7c0846b2c6f81" exitCode=0
Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.152798 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzlbl" event={"ID":"ea6370e4-7a14-43a3-8ab0-c966df3c3e74","Type":"ContainerDied","Data":"fcaa1f2b8e088e42ec6dbace61f84ab7bddb5ff1af5b329d84d7c0846b2c6f81"}
Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.152833 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzlbl" event={"ID":"ea6370e4-7a14-43a3-8ab0-c966df3c3e74","Type":"ContainerDied","Data":"65ead58be9a2f688499f325c400cbca88a9585d3230377cc8067dc5888472a5f"}
Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.152923 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lzlbl"
Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.155682 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cqpn4"]
Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.158114 4763 generic.go:334] "Generic (PLEG): container finished" podID="e37aacbd-2b65-4fe9-9874-38a7c585a300" containerID="430864b9a9a363dc8a1d754b411529fb351e404d3af3aec6937fd7e1023ba29a" exitCode=0
Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.158177 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lpk7r" event={"ID":"e37aacbd-2b65-4fe9-9874-38a7c585a300","Type":"ContainerDied","Data":"430864b9a9a363dc8a1d754b411529fb351e404d3af3aec6937fd7e1023ba29a"}
Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.158207 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lpk7r" event={"ID":"e37aacbd-2b65-4fe9-9874-38a7c585a300","Type":"ContainerDied","Data":"a68a5139263f291fb28765d172801eb4bd1759dcf148f0ae7a665a9ad6801913"}
Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.158290 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lpk7r"
Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.161465 4763 generic.go:334] "Generic (PLEG): container finished" podID="84b15a6f-ad11-4681-be03-86c7a7f84320" containerID="e3b98e7c38caf55ad174a09c8d97ae3dd926cf621355f19059c1540c9dabd457" exitCode=0
Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.161516 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rtp7g" event={"ID":"84b15a6f-ad11-4681-be03-86c7a7f84320","Type":"ContainerDied","Data":"e3b98e7c38caf55ad174a09c8d97ae3dd926cf621355f19059c1540c9dabd457"}
Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.161537 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rtp7g" event={"ID":"84b15a6f-ad11-4681-be03-86c7a7f84320","Type":"ContainerDied","Data":"8248093c0c8ae667ce0a768e789dc96f9cf9a0dbdc72a0dea8e302f97804dc7e"}
Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.161620 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rtp7g"
Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.167459 4763 generic.go:334] "Generic (PLEG): container finished" podID="585f1b1b-1d55-4c5d-be08-5af770eec641" containerID="c1e69145d090fdc2ba457564c49913ef9b0bf94df797ab7c83bfe7fdbe226cfa" exitCode=0
Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.167502 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hjvs8" event={"ID":"585f1b1b-1d55-4c5d-be08-5af770eec641","Type":"ContainerDied","Data":"c1e69145d090fdc2ba457564c49913ef9b0bf94df797ab7c83bfe7fdbe226cfa"}
Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.167598 4763 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hjvs8" Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.167757 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hjvs8" event={"ID":"585f1b1b-1d55-4c5d-be08-5af770eec641","Type":"ContainerDied","Data":"49dc0a2e0d7d3417c0b2d05af702c20fb07512e38a5c10b3435cb10526866af3"} Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.168406 4763 scope.go:117] "RemoveContainer" containerID="5073d4f9cb77f5317dc89a412cc7d01e2d7a2cbca91415e3a97d19e62d591413" Dec 05 11:54:20 crc kubenswrapper[4763]: W1205 11:54:20.177261 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fc5438b_109a_4bf8_97a6_d5c49edbc395.slice/crio-ac2c95288dac425e7c0fbcef25011ba0acf022f41d6da505017c4a9b7e4f3bc2 WatchSource:0}: Error finding container ac2c95288dac425e7c0fbcef25011ba0acf022f41d6da505017c4a9b7e4f3bc2: Status 404 returned error can't find the container with id ac2c95288dac425e7c0fbcef25011ba0acf022f41d6da505017c4a9b7e4f3bc2 Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.247751 4763 scope.go:117] "RemoveContainer" containerID="55976a8ec5d7a78f7475bd940af79022e60b186be8d5d57e0c635532973b2821" Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.275232 4763 scope.go:117] "RemoveContainer" containerID="0294e4467f92722eac9e0129a2edcc0ee8b941472252d5cf9a77461483160b88" Dec 05 11:54:20 crc kubenswrapper[4763]: E1205 11:54:20.275948 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0294e4467f92722eac9e0129a2edcc0ee8b941472252d5cf9a77461483160b88\": container with ID starting with 0294e4467f92722eac9e0129a2edcc0ee8b941472252d5cf9a77461483160b88 not found: ID does not exist" containerID="0294e4467f92722eac9e0129a2edcc0ee8b941472252d5cf9a77461483160b88" Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.276010 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0294e4467f92722eac9e0129a2edcc0ee8b941472252d5cf9a77461483160b88"} err="failed to get container status \"0294e4467f92722eac9e0129a2edcc0ee8b941472252d5cf9a77461483160b88\": rpc error: code = NotFound desc = could not find container \"0294e4467f92722eac9e0129a2edcc0ee8b941472252d5cf9a77461483160b88\": container with ID starting with 0294e4467f92722eac9e0129a2edcc0ee8b941472252d5cf9a77461483160b88 not found: ID does not exist" Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.276051 4763 scope.go:117] "RemoveContainer" containerID="5073d4f9cb77f5317dc89a412cc7d01e2d7a2cbca91415e3a97d19e62d591413" Dec 05 11:54:20 crc kubenswrapper[4763]: E1205 11:54:20.276533 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5073d4f9cb77f5317dc89a412cc7d01e2d7a2cbca91415e3a97d19e62d591413\": container with ID starting with 5073d4f9cb77f5317dc89a412cc7d01e2d7a2cbca91415e3a97d19e62d591413 not found: ID does not exist" containerID="5073d4f9cb77f5317dc89a412cc7d01e2d7a2cbca91415e3a97d19e62d591413" Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.276720 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5073d4f9cb77f5317dc89a412cc7d01e2d7a2cbca91415e3a97d19e62d591413"} err="failed to get container status \"5073d4f9cb77f5317dc89a412cc7d01e2d7a2cbca91415e3a97d19e62d591413\": rpc error: code = 
NotFound desc = could not find container \"5073d4f9cb77f5317dc89a412cc7d01e2d7a2cbca91415e3a97d19e62d591413\": container with ID starting with 5073d4f9cb77f5317dc89a412cc7d01e2d7a2cbca91415e3a97d19e62d591413 not found: ID does not exist" Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.276887 4763 scope.go:117] "RemoveContainer" containerID="55976a8ec5d7a78f7475bd940af79022e60b186be8d5d57e0c635532973b2821" Dec 05 11:54:20 crc kubenswrapper[4763]: E1205 11:54:20.277419 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55976a8ec5d7a78f7475bd940af79022e60b186be8d5d57e0c635532973b2821\": container with ID starting with 55976a8ec5d7a78f7475bd940af79022e60b186be8d5d57e0c635532973b2821 not found: ID does not exist" containerID="55976a8ec5d7a78f7475bd940af79022e60b186be8d5d57e0c635532973b2821" Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.277531 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55976a8ec5d7a78f7475bd940af79022e60b186be8d5d57e0c635532973b2821"} err="failed to get container status \"55976a8ec5d7a78f7475bd940af79022e60b186be8d5d57e0c635532973b2821\": rpc error: code = NotFound desc = could not find container \"55976a8ec5d7a78f7475bd940af79022e60b186be8d5d57e0c635532973b2821\": container with ID starting with 55976a8ec5d7a78f7475bd940af79022e60b186be8d5d57e0c635532973b2821 not found: ID does not exist" Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.277634 4763 scope.go:117] "RemoveContainer" containerID="fcaa1f2b8e088e42ec6dbace61f84ab7bddb5ff1af5b329d84d7c0846b2c6f81" Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.311138 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hjvs8"] Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.315225 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hjvs8"] Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.317947 4763 scope.go:117] "RemoveContainer" containerID="25f71cd2999282de820c4979562a6eabc3117f5e80333178c4a28039705e0d28" Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.325076 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nzfbn"] Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.333974 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nzfbn"] Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.345921 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lpk7r"] Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.350375 4763 scope.go:117] "RemoveContainer" containerID="4c051a0ff5b54684369a7b6e8255a17d5c549d0af9fd29ce861694839986103f" Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.352365 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lpk7r"] Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.357062 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lzlbl"] Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.368521 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lzlbl"] Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.372988 4763 scope.go:117] "RemoveContainer" 
containerID="fcaa1f2b8e088e42ec6dbace61f84ab7bddb5ff1af5b329d84d7c0846b2c6f81" Dec 05 11:54:20 crc kubenswrapper[4763]: E1205 11:54:20.373466 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcaa1f2b8e088e42ec6dbace61f84ab7bddb5ff1af5b329d84d7c0846b2c6f81\": container with ID starting with fcaa1f2b8e088e42ec6dbace61f84ab7bddb5ff1af5b329d84d7c0846b2c6f81 not found: ID does not exist" containerID="fcaa1f2b8e088e42ec6dbace61f84ab7bddb5ff1af5b329d84d7c0846b2c6f81" Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.373595 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcaa1f2b8e088e42ec6dbace61f84ab7bddb5ff1af5b329d84d7c0846b2c6f81"} err="failed to get container status \"fcaa1f2b8e088e42ec6dbace61f84ab7bddb5ff1af5b329d84d7c0846b2c6f81\": rpc error: code = NotFound desc = could not find container \"fcaa1f2b8e088e42ec6dbace61f84ab7bddb5ff1af5b329d84d7c0846b2c6f81\": container with ID starting with fcaa1f2b8e088e42ec6dbace61f84ab7bddb5ff1af5b329d84d7c0846b2c6f81 not found: ID does not exist" Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.373694 4763 scope.go:117] "RemoveContainer" containerID="25f71cd2999282de820c4979562a6eabc3117f5e80333178c4a28039705e0d28" Dec 05 11:54:20 crc kubenswrapper[4763]: E1205 11:54:20.374214 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25f71cd2999282de820c4979562a6eabc3117f5e80333178c4a28039705e0d28\": container with ID starting with 25f71cd2999282de820c4979562a6eabc3117f5e80333178c4a28039705e0d28 not found: ID does not exist" containerID="25f71cd2999282de820c4979562a6eabc3117f5e80333178c4a28039705e0d28" Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.374275 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25f71cd2999282de820c4979562a6eabc3117f5e80333178c4a28039705e0d28"} err="failed to get container status \"25f71cd2999282de820c4979562a6eabc3117f5e80333178c4a28039705e0d28\": rpc error: code = NotFound desc = could not find container \"25f71cd2999282de820c4979562a6eabc3117f5e80333178c4a28039705e0d28\": container with ID starting with 25f71cd2999282de820c4979562a6eabc3117f5e80333178c4a28039705e0d28 not found: ID does not exist" Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.374313 4763 scope.go:117] "RemoveContainer" containerID="4c051a0ff5b54684369a7b6e8255a17d5c549d0af9fd29ce861694839986103f" Dec 05 11:54:20 crc kubenswrapper[4763]: E1205 11:54:20.374730 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c051a0ff5b54684369a7b6e8255a17d5c549d0af9fd29ce861694839986103f\": container with ID starting with 4c051a0ff5b54684369a7b6e8255a17d5c549d0af9fd29ce861694839986103f not found: ID does not exist" containerID="4c051a0ff5b54684369a7b6e8255a17d5c549d0af9fd29ce861694839986103f" Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.374802 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c051a0ff5b54684369a7b6e8255a17d5c549d0af9fd29ce861694839986103f"} err="failed to get container status \"4c051a0ff5b54684369a7b6e8255a17d5c549d0af9fd29ce861694839986103f\": rpc error: code = NotFound desc = could not find container \"4c051a0ff5b54684369a7b6e8255a17d5c549d0af9fd29ce861694839986103f\": container with ID starting with 
4c051a0ff5b54684369a7b6e8255a17d5c549d0af9fd29ce861694839986103f not found: ID does not exist" Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.374835 4763 scope.go:117] "RemoveContainer" containerID="430864b9a9a363dc8a1d754b411529fb351e404d3af3aec6937fd7e1023ba29a" Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.374845 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rtp7g"] Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.378038 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rtp7g"] Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.387407 4763 scope.go:117] "RemoveContainer" containerID="79c9038b98d6c03019543d007a6359986abad32d93e62209238d5006af095ffc" Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.403286 4763 scope.go:117] "RemoveContainer" containerID="f72f862bc5ed1aa2f76ab167b1f0deeabfb791453039b23541d066ceb7a55d7a" Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.416958 4763 scope.go:117] "RemoveContainer" containerID="430864b9a9a363dc8a1d754b411529fb351e404d3af3aec6937fd7e1023ba29a" Dec 05 11:54:20 crc kubenswrapper[4763]: E1205 11:54:20.417520 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"430864b9a9a363dc8a1d754b411529fb351e404d3af3aec6937fd7e1023ba29a\": container with ID starting with 430864b9a9a363dc8a1d754b411529fb351e404d3af3aec6937fd7e1023ba29a not found: ID does not exist" containerID="430864b9a9a363dc8a1d754b411529fb351e404d3af3aec6937fd7e1023ba29a" Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.417641 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"430864b9a9a363dc8a1d754b411529fb351e404d3af3aec6937fd7e1023ba29a"} err="failed to get container status \"430864b9a9a363dc8a1d754b411529fb351e404d3af3aec6937fd7e1023ba29a\": rpc error: code = NotFound desc = could not find container \"430864b9a9a363dc8a1d754b411529fb351e404d3af3aec6937fd7e1023ba29a\": container with ID starting with 430864b9a9a363dc8a1d754b411529fb351e404d3af3aec6937fd7e1023ba29a not found: ID does not exist" Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.417763 4763 scope.go:117] "RemoveContainer" containerID="79c9038b98d6c03019543d007a6359986abad32d93e62209238d5006af095ffc" Dec 05 11:54:20 crc kubenswrapper[4763]: E1205 11:54:20.418372 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79c9038b98d6c03019543d007a6359986abad32d93e62209238d5006af095ffc\": container with ID starting with 79c9038b98d6c03019543d007a6359986abad32d93e62209238d5006af095ffc not found: ID does not exist" containerID="79c9038b98d6c03019543d007a6359986abad32d93e62209238d5006af095ffc" Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.418413 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79c9038b98d6c03019543d007a6359986abad32d93e62209238d5006af095ffc"} err="failed to get container status \"79c9038b98d6c03019543d007a6359986abad32d93e62209238d5006af095ffc\": rpc error: code = NotFound desc = could not find container \"79c9038b98d6c03019543d007a6359986abad32d93e62209238d5006af095ffc\": container with ID starting with 79c9038b98d6c03019543d007a6359986abad32d93e62209238d5006af095ffc not found: ID does not exist" Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.418444 4763 scope.go:117] "RemoveContainer" 
containerID="f72f862bc5ed1aa2f76ab167b1f0deeabfb791453039b23541d066ceb7a55d7a" Dec 05 11:54:20 crc kubenswrapper[4763]: E1205 11:54:20.418987 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f72f862bc5ed1aa2f76ab167b1f0deeabfb791453039b23541d066ceb7a55d7a\": container with ID starting with f72f862bc5ed1aa2f76ab167b1f0deeabfb791453039b23541d066ceb7a55d7a not found: ID does not exist" containerID="f72f862bc5ed1aa2f76ab167b1f0deeabfb791453039b23541d066ceb7a55d7a" Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.419068 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f72f862bc5ed1aa2f76ab167b1f0deeabfb791453039b23541d066ceb7a55d7a"} err="failed to get container status \"f72f862bc5ed1aa2f76ab167b1f0deeabfb791453039b23541d066ceb7a55d7a\": rpc error: code = NotFound desc = could not find container \"f72f862bc5ed1aa2f76ab167b1f0deeabfb791453039b23541d066ceb7a55d7a\": container with ID starting with f72f862bc5ed1aa2f76ab167b1f0deeabfb791453039b23541d066ceb7a55d7a not found: ID does not exist" Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.419125 4763 scope.go:117] "RemoveContainer" containerID="e3b98e7c38caf55ad174a09c8d97ae3dd926cf621355f19059c1540c9dabd457" Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.437061 4763 scope.go:117] "RemoveContainer" containerID="deabe161e445d319a6e1c928c468a2e231eab6a941af31143ade0657dcde00d6" Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.459414 4763 scope.go:117] "RemoveContainer" containerID="40f2d0e472b1f5bb3b1c0e3665ba694ec796d9e26c57e58801b5a1a4ba7e8a76" Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.475988 4763 scope.go:117] "RemoveContainer" containerID="e3b98e7c38caf55ad174a09c8d97ae3dd926cf621355f19059c1540c9dabd457" Dec 05 11:54:20 crc kubenswrapper[4763]: E1205 11:54:20.476841 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3b98e7c38caf55ad174a09c8d97ae3dd926cf621355f19059c1540c9dabd457\": container with ID starting with e3b98e7c38caf55ad174a09c8d97ae3dd926cf621355f19059c1540c9dabd457 not found: ID does not exist" containerID="e3b98e7c38caf55ad174a09c8d97ae3dd926cf621355f19059c1540c9dabd457" Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.476882 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3b98e7c38caf55ad174a09c8d97ae3dd926cf621355f19059c1540c9dabd457"} err="failed to get container status \"e3b98e7c38caf55ad174a09c8d97ae3dd926cf621355f19059c1540c9dabd457\": rpc error: code = NotFound desc = could not find container \"e3b98e7c38caf55ad174a09c8d97ae3dd926cf621355f19059c1540c9dabd457\": container with ID starting with e3b98e7c38caf55ad174a09c8d97ae3dd926cf621355f19059c1540c9dabd457 not found: ID does not exist" Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.476911 4763 scope.go:117] "RemoveContainer" containerID="deabe161e445d319a6e1c928c468a2e231eab6a941af31143ade0657dcde00d6" Dec 05 11:54:20 crc kubenswrapper[4763]: E1205 11:54:20.477522 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deabe161e445d319a6e1c928c468a2e231eab6a941af31143ade0657dcde00d6\": container with ID starting with deabe161e445d319a6e1c928c468a2e231eab6a941af31143ade0657dcde00d6 not found: ID does not exist" containerID="deabe161e445d319a6e1c928c468a2e231eab6a941af31143ade0657dcde00d6" 
Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.477576 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deabe161e445d319a6e1c928c468a2e231eab6a941af31143ade0657dcde00d6"} err="failed to get container status \"deabe161e445d319a6e1c928c468a2e231eab6a941af31143ade0657dcde00d6\": rpc error: code = NotFound desc = could not find container \"deabe161e445d319a6e1c928c468a2e231eab6a941af31143ade0657dcde00d6\": container with ID starting with deabe161e445d319a6e1c928c468a2e231eab6a941af31143ade0657dcde00d6 not found: ID does not exist" Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.477608 4763 scope.go:117] "RemoveContainer" containerID="40f2d0e472b1f5bb3b1c0e3665ba694ec796d9e26c57e58801b5a1a4ba7e8a76" Dec 05 11:54:20 crc kubenswrapper[4763]: E1205 11:54:20.478185 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40f2d0e472b1f5bb3b1c0e3665ba694ec796d9e26c57e58801b5a1a4ba7e8a76\": container with ID starting with 40f2d0e472b1f5bb3b1c0e3665ba694ec796d9e26c57e58801b5a1a4ba7e8a76 not found: ID does not exist" containerID="40f2d0e472b1f5bb3b1c0e3665ba694ec796d9e26c57e58801b5a1a4ba7e8a76" Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.478213 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40f2d0e472b1f5bb3b1c0e3665ba694ec796d9e26c57e58801b5a1a4ba7e8a76"} err="failed to get container status \"40f2d0e472b1f5bb3b1c0e3665ba694ec796d9e26c57e58801b5a1a4ba7e8a76\": rpc error: code = NotFound desc = could not find container \"40f2d0e472b1f5bb3b1c0e3665ba694ec796d9e26c57e58801b5a1a4ba7e8a76\": container with ID starting with 40f2d0e472b1f5bb3b1c0e3665ba694ec796d9e26c57e58801b5a1a4ba7e8a76 not found: ID does not exist" Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.478230 4763 scope.go:117] "RemoveContainer" containerID="c1e69145d090fdc2ba457564c49913ef9b0bf94df797ab7c83bfe7fdbe226cfa" Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.494821 4763 scope.go:117] "RemoveContainer" containerID="89d680a3a26455a407ccf6650510bff4201a6544c9a577cc6bbfcbfdb05f6af3" Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.510942 4763 scope.go:117] "RemoveContainer" containerID="c1e69145d090fdc2ba457564c49913ef9b0bf94df797ab7c83bfe7fdbe226cfa" Dec 05 11:54:20 crc kubenswrapper[4763]: E1205 11:54:20.511642 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1e69145d090fdc2ba457564c49913ef9b0bf94df797ab7c83bfe7fdbe226cfa\": container with ID starting with c1e69145d090fdc2ba457564c49913ef9b0bf94df797ab7c83bfe7fdbe226cfa not found: ID does not exist" containerID="c1e69145d090fdc2ba457564c49913ef9b0bf94df797ab7c83bfe7fdbe226cfa" Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.511754 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1e69145d090fdc2ba457564c49913ef9b0bf94df797ab7c83bfe7fdbe226cfa"} err="failed to get container status \"c1e69145d090fdc2ba457564c49913ef9b0bf94df797ab7c83bfe7fdbe226cfa\": rpc error: code = NotFound desc = could not find container \"c1e69145d090fdc2ba457564c49913ef9b0bf94df797ab7c83bfe7fdbe226cfa\": container with ID starting with c1e69145d090fdc2ba457564c49913ef9b0bf94df797ab7c83bfe7fdbe226cfa not found: ID does not exist" Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.511833 4763 scope.go:117] "RemoveContainer" 
containerID="89d680a3a26455a407ccf6650510bff4201a6544c9a577cc6bbfcbfdb05f6af3" Dec 05 11:54:20 crc kubenswrapper[4763]: E1205 11:54:20.512492 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89d680a3a26455a407ccf6650510bff4201a6544c9a577cc6bbfcbfdb05f6af3\": container with ID starting with 89d680a3a26455a407ccf6650510bff4201a6544c9a577cc6bbfcbfdb05f6af3 not found: ID does not exist" containerID="89d680a3a26455a407ccf6650510bff4201a6544c9a577cc6bbfcbfdb05f6af3" Dec 05 11:54:20 crc kubenswrapper[4763]: I1205 11:54:20.512536 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89d680a3a26455a407ccf6650510bff4201a6544c9a577cc6bbfcbfdb05f6af3"} err="failed to get container status \"89d680a3a26455a407ccf6650510bff4201a6544c9a577cc6bbfcbfdb05f6af3\": rpc error: code = NotFound desc = could not find container \"89d680a3a26455a407ccf6650510bff4201a6544c9a577cc6bbfcbfdb05f6af3\": container with ID starting with 89d680a3a26455a407ccf6650510bff4201a6544c9a577cc6bbfcbfdb05f6af3 not found: ID does not exist" Dec 05 11:54:21 crc kubenswrapper[4763]: I1205 11:54:21.175489 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cqpn4" event={"ID":"8fc5438b-109a-4bf8-97a6-d5c49edbc395","Type":"ContainerStarted","Data":"647b4232fa0a3fa82ff08fe6c2eb97e9d27c22ad44755b75bdbddc7199e89058"} Dec 05 11:54:21 crc kubenswrapper[4763]: I1205 11:54:21.175563 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cqpn4" event={"ID":"8fc5438b-109a-4bf8-97a6-d5c49edbc395","Type":"ContainerStarted","Data":"ac2c95288dac425e7c0fbcef25011ba0acf022f41d6da505017c4a9b7e4f3bc2"} Dec 05 11:54:21 crc kubenswrapper[4763]: I1205 11:54:21.177188 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-cqpn4" Dec 05 11:54:21 crc kubenswrapper[4763]: I1205 11:54:21.182712 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-cqpn4" Dec 05 11:54:21 crc kubenswrapper[4763]: I1205 11:54:21.198571 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-cqpn4" podStartSLOduration=2.198533186 podStartE2EDuration="2.198533186s" podCreationTimestamp="2025-12-05 11:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:54:21.198083106 +0000 UTC m=+345.690797849" watchObservedRunningTime="2025-12-05 11:54:21.198533186 +0000 UTC m=+345.691247909" Dec 05 11:54:21 crc kubenswrapper[4763]: I1205 11:54:21.791207 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="585f1b1b-1d55-4c5d-be08-5af770eec641" path="/var/lib/kubelet/pods/585f1b1b-1d55-4c5d-be08-5af770eec641/volumes" Dec 05 11:54:21 crc kubenswrapper[4763]: I1205 11:54:21.791982 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61dde3bf-99ba-4d4f-bbbd-91ea145ac314" path="/var/lib/kubelet/pods/61dde3bf-99ba-4d4f-bbbd-91ea145ac314/volumes" Dec 05 11:54:21 crc kubenswrapper[4763]: I1205 11:54:21.792577 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84b15a6f-ad11-4681-be03-86c7a7f84320" path="/var/lib/kubelet/pods/84b15a6f-ad11-4681-be03-86c7a7f84320/volumes" Dec 05 
11:54:21 crc kubenswrapper[4763]: I1205 11:54:21.793655 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e37aacbd-2b65-4fe9-9874-38a7c585a300" path="/var/lib/kubelet/pods/e37aacbd-2b65-4fe9-9874-38a7c585a300/volumes" Dec 05 11:54:21 crc kubenswrapper[4763]: I1205 11:54:21.794208 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea6370e4-7a14-43a3-8ab0-c966df3c3e74" path="/var/lib/kubelet/pods/ea6370e4-7a14-43a3-8ab0-c966df3c3e74/volumes" Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.671093 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-n95pv"] Dec 05 11:54:27 crc kubenswrapper[4763]: E1205 11:54:27.673888 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84b15a6f-ad11-4681-be03-86c7a7f84320" containerName="extract-content" Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.673907 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="84b15a6f-ad11-4681-be03-86c7a7f84320" containerName="extract-content" Dec 05 11:54:27 crc kubenswrapper[4763]: E1205 11:54:27.673919 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e37aacbd-2b65-4fe9-9874-38a7c585a300" containerName="extract-utilities" Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.673926 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e37aacbd-2b65-4fe9-9874-38a7c585a300" containerName="extract-utilities" Dec 05 11:54:27 crc kubenswrapper[4763]: E1205 11:54:27.673941 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61dde3bf-99ba-4d4f-bbbd-91ea145ac314" containerName="registry-server" Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.673950 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="61dde3bf-99ba-4d4f-bbbd-91ea145ac314" containerName="registry-server" Dec 05 11:54:27 crc kubenswrapper[4763]: E1205 11:54:27.673960 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="585f1b1b-1d55-4c5d-be08-5af770eec641" containerName="marketplace-operator" Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.673967 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="585f1b1b-1d55-4c5d-be08-5af770eec641" containerName="marketplace-operator" Dec 05 11:54:27 crc kubenswrapper[4763]: E1205 11:54:27.673977 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea6370e4-7a14-43a3-8ab0-c966df3c3e74" containerName="registry-server" Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.673985 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea6370e4-7a14-43a3-8ab0-c966df3c3e74" containerName="registry-server" Dec 05 11:54:27 crc kubenswrapper[4763]: E1205 11:54:27.673997 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61dde3bf-99ba-4d4f-bbbd-91ea145ac314" containerName="extract-utilities" Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.674004 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="61dde3bf-99ba-4d4f-bbbd-91ea145ac314" containerName="extract-utilities" Dec 05 11:54:27 crc kubenswrapper[4763]: E1205 11:54:27.674013 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e37aacbd-2b65-4fe9-9874-38a7c585a300" containerName="registry-server" Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.674020 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e37aacbd-2b65-4fe9-9874-38a7c585a300" containerName="registry-server" Dec 05 11:54:27 crc kubenswrapper[4763]: E1205 11:54:27.674031 4763 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="ea6370e4-7a14-43a3-8ab0-c966df3c3e74" containerName="extract-content" Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.674038 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea6370e4-7a14-43a3-8ab0-c966df3c3e74" containerName="extract-content" Dec 05 11:54:27 crc kubenswrapper[4763]: E1205 11:54:27.674048 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea6370e4-7a14-43a3-8ab0-c966df3c3e74" containerName="extract-utilities" Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.674055 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea6370e4-7a14-43a3-8ab0-c966df3c3e74" containerName="extract-utilities" Dec 05 11:54:27 crc kubenswrapper[4763]: E1205 11:54:27.674067 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61dde3bf-99ba-4d4f-bbbd-91ea145ac314" containerName="extract-content" Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.674074 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="61dde3bf-99ba-4d4f-bbbd-91ea145ac314" containerName="extract-content" Dec 05 11:54:27 crc kubenswrapper[4763]: E1205 11:54:27.674081 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84b15a6f-ad11-4681-be03-86c7a7f84320" containerName="extract-utilities" Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.674089 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="84b15a6f-ad11-4681-be03-86c7a7f84320" containerName="extract-utilities" Dec 05 11:54:27 crc kubenswrapper[4763]: E1205 11:54:27.674100 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84b15a6f-ad11-4681-be03-86c7a7f84320" containerName="registry-server" Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.674107 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="84b15a6f-ad11-4681-be03-86c7a7f84320" containerName="registry-server" Dec 05 11:54:27 crc kubenswrapper[4763]: E1205 11:54:27.674115 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e37aacbd-2b65-4fe9-9874-38a7c585a300" containerName="extract-content" Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.674122 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e37aacbd-2b65-4fe9-9874-38a7c585a300" containerName="extract-content" Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.674232 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="585f1b1b-1d55-4c5d-be08-5af770eec641" containerName="marketplace-operator" Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.674244 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="585f1b1b-1d55-4c5d-be08-5af770eec641" containerName="marketplace-operator" Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.674255 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e37aacbd-2b65-4fe9-9874-38a7c585a300" containerName="registry-server" Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.674267 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="84b15a6f-ad11-4681-be03-86c7a7f84320" containerName="registry-server" Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.674280 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="61dde3bf-99ba-4d4f-bbbd-91ea145ac314" containerName="registry-server" Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.674292 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea6370e4-7a14-43a3-8ab0-c966df3c3e74" containerName="registry-server" Dec 05 11:54:27 crc 
kubenswrapper[4763]: E1205 11:54:27.674399 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="585f1b1b-1d55-4c5d-be08-5af770eec641" containerName="marketplace-operator" Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.674411 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="585f1b1b-1d55-4c5d-be08-5af770eec641" containerName="marketplace-operator" Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.676378 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n95pv" Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.676891 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n95pv"] Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.682372 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.761066 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b919307-32f9-4abf-807f-86ef3b67ff55-catalog-content\") pod \"certified-operators-n95pv\" (UID: \"9b919307-32f9-4abf-807f-86ef3b67ff55\") " pod="openshift-marketplace/certified-operators-n95pv" Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.761166 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh6x6\" (UniqueName: \"kubernetes.io/projected/9b919307-32f9-4abf-807f-86ef3b67ff55-kube-api-access-kh6x6\") pod \"certified-operators-n95pv\" (UID: \"9b919307-32f9-4abf-807f-86ef3b67ff55\") " pod="openshift-marketplace/certified-operators-n95pv" Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.761198 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b919307-32f9-4abf-807f-86ef3b67ff55-utilities\") pod \"certified-operators-n95pv\" (UID: \"9b919307-32f9-4abf-807f-86ef3b67ff55\") " pod="openshift-marketplace/certified-operators-n95pv" Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.862466 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh6x6\" (UniqueName: \"kubernetes.io/projected/9b919307-32f9-4abf-807f-86ef3b67ff55-kube-api-access-kh6x6\") pod \"certified-operators-n95pv\" (UID: \"9b919307-32f9-4abf-807f-86ef3b67ff55\") " pod="openshift-marketplace/certified-operators-n95pv" Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.862522 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b919307-32f9-4abf-807f-86ef3b67ff55-utilities\") pod \"certified-operators-n95pv\" (UID: \"9b919307-32f9-4abf-807f-86ef3b67ff55\") " pod="openshift-marketplace/certified-operators-n95pv" Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.862560 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b919307-32f9-4abf-807f-86ef3b67ff55-catalog-content\") pod \"certified-operators-n95pv\" (UID: \"9b919307-32f9-4abf-807f-86ef3b67ff55\") " pod="openshift-marketplace/certified-operators-n95pv" Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.863145 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/9b919307-32f9-4abf-807f-86ef3b67ff55-catalog-content\") pod \"certified-operators-n95pv\" (UID: \"9b919307-32f9-4abf-807f-86ef3b67ff55\") " pod="openshift-marketplace/certified-operators-n95pv" Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.863590 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b919307-32f9-4abf-807f-86ef3b67ff55-utilities\") pod \"certified-operators-n95pv\" (UID: \"9b919307-32f9-4abf-807f-86ef3b67ff55\") " pod="openshift-marketplace/certified-operators-n95pv" Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.869384 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m9dpb"] Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.870352 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m9dpb" Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.874127 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.877689 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m9dpb"] Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.890587 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh6x6\" (UniqueName: \"kubernetes.io/projected/9b919307-32f9-4abf-807f-86ef3b67ff55-kube-api-access-kh6x6\") pod \"certified-operators-n95pv\" (UID: \"9b919307-32f9-4abf-807f-86ef3b67ff55\") " pod="openshift-marketplace/certified-operators-n95pv" Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.963284 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdc48b79-aeea-4cb0-a97f-4d265bb401f6-catalog-content\") pod \"community-operators-m9dpb\" (UID: \"cdc48b79-aeea-4cb0-a97f-4d265bb401f6\") " pod="openshift-marketplace/community-operators-m9dpb" Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.963336 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdc48b79-aeea-4cb0-a97f-4d265bb401f6-utilities\") pod \"community-operators-m9dpb\" (UID: \"cdc48b79-aeea-4cb0-a97f-4d265bb401f6\") " pod="openshift-marketplace/community-operators-m9dpb" Dec 05 11:54:27 crc kubenswrapper[4763]: I1205 11:54:27.963365 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrj2m\" (UniqueName: \"kubernetes.io/projected/cdc48b79-aeea-4cb0-a97f-4d265bb401f6-kube-api-access-qrj2m\") pod \"community-operators-m9dpb\" (UID: \"cdc48b79-aeea-4cb0-a97f-4d265bb401f6\") " pod="openshift-marketplace/community-operators-m9dpb" Dec 05 11:54:28 crc kubenswrapper[4763]: I1205 11:54:28.064868 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdc48b79-aeea-4cb0-a97f-4d265bb401f6-catalog-content\") pod \"community-operators-m9dpb\" (UID: \"cdc48b79-aeea-4cb0-a97f-4d265bb401f6\") " pod="openshift-marketplace/community-operators-m9dpb" Dec 05 11:54:28 crc kubenswrapper[4763]: I1205 11:54:28.064948 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/cdc48b79-aeea-4cb0-a97f-4d265bb401f6-utilities\") pod \"community-operators-m9dpb\" (UID: \"cdc48b79-aeea-4cb0-a97f-4d265bb401f6\") " pod="openshift-marketplace/community-operators-m9dpb" Dec 05 11:54:28 crc kubenswrapper[4763]: I1205 11:54:28.064991 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrj2m\" (UniqueName: \"kubernetes.io/projected/cdc48b79-aeea-4cb0-a97f-4d265bb401f6-kube-api-access-qrj2m\") pod \"community-operators-m9dpb\" (UID: \"cdc48b79-aeea-4cb0-a97f-4d265bb401f6\") " pod="openshift-marketplace/community-operators-m9dpb" Dec 05 11:54:28 crc kubenswrapper[4763]: I1205 11:54:28.066388 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdc48b79-aeea-4cb0-a97f-4d265bb401f6-catalog-content\") pod \"community-operators-m9dpb\" (UID: \"cdc48b79-aeea-4cb0-a97f-4d265bb401f6\") " pod="openshift-marketplace/community-operators-m9dpb" Dec 05 11:54:28 crc kubenswrapper[4763]: I1205 11:54:28.067630 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdc48b79-aeea-4cb0-a97f-4d265bb401f6-utilities\") pod \"community-operators-m9dpb\" (UID: \"cdc48b79-aeea-4cb0-a97f-4d265bb401f6\") " pod="openshift-marketplace/community-operators-m9dpb" Dec 05 11:54:28 crc kubenswrapper[4763]: I1205 11:54:28.088022 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrj2m\" (UniqueName: \"kubernetes.io/projected/cdc48b79-aeea-4cb0-a97f-4d265bb401f6-kube-api-access-qrj2m\") pod \"community-operators-m9dpb\" (UID: \"cdc48b79-aeea-4cb0-a97f-4d265bb401f6\") " pod="openshift-marketplace/community-operators-m9dpb" Dec 05 11:54:28 crc kubenswrapper[4763]: I1205 11:54:28.255192 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n95pv" Dec 05 11:54:28 crc kubenswrapper[4763]: I1205 11:54:28.262888 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m9dpb" Dec 05 11:54:28 crc kubenswrapper[4763]: I1205 11:54:28.699263 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n95pv"] Dec 05 11:54:28 crc kubenswrapper[4763]: W1205 11:54:28.707506 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b919307_32f9_4abf_807f_86ef3b67ff55.slice/crio-82b0d3d6c4545914e578985172cc254924730bb2e76674569a3e81f4713833c7 WatchSource:0}: Error finding container 82b0d3d6c4545914e578985172cc254924730bb2e76674569a3e81f4713833c7: Status 404 returned error can't find the container with id 82b0d3d6c4545914e578985172cc254924730bb2e76674569a3e81f4713833c7 Dec 05 11:54:28 crc kubenswrapper[4763]: I1205 11:54:28.734460 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m9dpb"] Dec 05 11:54:28 crc kubenswrapper[4763]: W1205 11:54:28.742735 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdc48b79_aeea_4cb0_a97f_4d265bb401f6.slice/crio-0cb3a9bedd55e9163c48d670aa2297cc0290251c5ce0c66d5baf42096249bab5 WatchSource:0}: Error finding container 0cb3a9bedd55e9163c48d670aa2297cc0290251c5ce0c66d5baf42096249bab5: Status 404 returned error can't find the container with id 0cb3a9bedd55e9163c48d670aa2297cc0290251c5ce0c66d5baf42096249bab5 Dec 05 11:54:29 crc kubenswrapper[4763]: I1205 11:54:29.228676 4763 generic.go:334] "Generic (PLEG): container finished" podID="cdc48b79-aeea-4cb0-a97f-4d265bb401f6" containerID="dc2fa5905c13d9f7fffbe524ee7b441edba3612c58382188273629002146b57a" exitCode=0 Dec 05 11:54:29 crc kubenswrapper[4763]: I1205 11:54:29.228787 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9dpb" event={"ID":"cdc48b79-aeea-4cb0-a97f-4d265bb401f6","Type":"ContainerDied","Data":"dc2fa5905c13d9f7fffbe524ee7b441edba3612c58382188273629002146b57a"} Dec 05 11:54:29 crc kubenswrapper[4763]: I1205 11:54:29.228818 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9dpb" event={"ID":"cdc48b79-aeea-4cb0-a97f-4d265bb401f6","Type":"ContainerStarted","Data":"0cb3a9bedd55e9163c48d670aa2297cc0290251c5ce0c66d5baf42096249bab5"} Dec 05 11:54:29 crc kubenswrapper[4763]: I1205 11:54:29.230995 4763 generic.go:334] "Generic (PLEG): container finished" podID="9b919307-32f9-4abf-807f-86ef3b67ff55" containerID="b7393624756287db2a69befc9e1957f55ef0950902847eab9f303c030aee3647" exitCode=0 Dec 05 11:54:29 crc kubenswrapper[4763]: I1205 11:54:29.231023 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n95pv" event={"ID":"9b919307-32f9-4abf-807f-86ef3b67ff55","Type":"ContainerDied","Data":"b7393624756287db2a69befc9e1957f55ef0950902847eab9f303c030aee3647"} Dec 05 11:54:29 crc kubenswrapper[4763]: I1205 11:54:29.231330 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n95pv" event={"ID":"9b919307-32f9-4abf-807f-86ef3b67ff55","Type":"ContainerStarted","Data":"82b0d3d6c4545914e578985172cc254924730bb2e76674569a3e81f4713833c7"} Dec 05 11:54:30 crc kubenswrapper[4763]: I1205 11:54:30.069067 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6nvlh"] Dec 05 11:54:30 crc kubenswrapper[4763]: I1205 11:54:30.070343 4763 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6nvlh" Dec 05 11:54:30 crc kubenswrapper[4763]: I1205 11:54:30.072236 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 05 11:54:30 crc kubenswrapper[4763]: I1205 11:54:30.088889 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6nvlh"] Dec 05 11:54:30 crc kubenswrapper[4763]: I1205 11:54:30.191880 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1131b919-ad3a-4a36-a38b-de089ae44458-catalog-content\") pod \"redhat-marketplace-6nvlh\" (UID: \"1131b919-ad3a-4a36-a38b-de089ae44458\") " pod="openshift-marketplace/redhat-marketplace-6nvlh" Dec 05 11:54:30 crc kubenswrapper[4763]: I1205 11:54:30.191953 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7dpt\" (UniqueName: \"kubernetes.io/projected/1131b919-ad3a-4a36-a38b-de089ae44458-kube-api-access-d7dpt\") pod \"redhat-marketplace-6nvlh\" (UID: \"1131b919-ad3a-4a36-a38b-de089ae44458\") " pod="openshift-marketplace/redhat-marketplace-6nvlh" Dec 05 11:54:30 crc kubenswrapper[4763]: I1205 11:54:30.191998 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1131b919-ad3a-4a36-a38b-de089ae44458-utilities\") pod \"redhat-marketplace-6nvlh\" (UID: \"1131b919-ad3a-4a36-a38b-de089ae44458\") " pod="openshift-marketplace/redhat-marketplace-6nvlh" Dec 05 11:54:30 crc kubenswrapper[4763]: I1205 11:54:30.236512 4763 generic.go:334] "Generic (PLEG): container finished" podID="cdc48b79-aeea-4cb0-a97f-4d265bb401f6" containerID="6790c00b6ecf7740c7759fac3e257cc12728833ec19443bbdbd9f257f96c5844" exitCode=0 Dec 05 11:54:30 crc kubenswrapper[4763]: I1205 11:54:30.236627 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9dpb" event={"ID":"cdc48b79-aeea-4cb0-a97f-4d265bb401f6","Type":"ContainerDied","Data":"6790c00b6ecf7740c7759fac3e257cc12728833ec19443bbdbd9f257f96c5844"} Dec 05 11:54:30 crc kubenswrapper[4763]: I1205 11:54:30.271038 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qr5ln"] Dec 05 11:54:30 crc kubenswrapper[4763]: I1205 11:54:30.272553 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qr5ln"
Dec 05 11:54:30 crc kubenswrapper[4763]: I1205 11:54:30.276196 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Dec 05 11:54:30 crc kubenswrapper[4763]: I1205 11:54:30.280894 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qr5ln"]
Dec 05 11:54:30 crc kubenswrapper[4763]: I1205 11:54:30.292708 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7dpt\" (UniqueName: \"kubernetes.io/projected/1131b919-ad3a-4a36-a38b-de089ae44458-kube-api-access-d7dpt\") pod \"redhat-marketplace-6nvlh\" (UID: \"1131b919-ad3a-4a36-a38b-de089ae44458\") " pod="openshift-marketplace/redhat-marketplace-6nvlh"
Dec 05 11:54:30 crc kubenswrapper[4763]: I1205 11:54:30.292850 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1131b919-ad3a-4a36-a38b-de089ae44458-utilities\") pod \"redhat-marketplace-6nvlh\" (UID: \"1131b919-ad3a-4a36-a38b-de089ae44458\") " pod="openshift-marketplace/redhat-marketplace-6nvlh"
Dec 05 11:54:30 crc kubenswrapper[4763]: I1205 11:54:30.292927 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1131b919-ad3a-4a36-a38b-de089ae44458-catalog-content\") pod \"redhat-marketplace-6nvlh\" (UID: \"1131b919-ad3a-4a36-a38b-de089ae44458\") " pod="openshift-marketplace/redhat-marketplace-6nvlh"
Dec 05 11:54:30 crc kubenswrapper[4763]: I1205 11:54:30.293903 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1131b919-ad3a-4a36-a38b-de089ae44458-catalog-content\") pod \"redhat-marketplace-6nvlh\" (UID: \"1131b919-ad3a-4a36-a38b-de089ae44458\") " pod="openshift-marketplace/redhat-marketplace-6nvlh"
Dec 05 11:54:30 crc kubenswrapper[4763]: I1205 11:54:30.293943 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1131b919-ad3a-4a36-a38b-de089ae44458-utilities\") pod \"redhat-marketplace-6nvlh\" (UID: \"1131b919-ad3a-4a36-a38b-de089ae44458\") " pod="openshift-marketplace/redhat-marketplace-6nvlh"
Dec 05 11:54:30 crc kubenswrapper[4763]: I1205 11:54:30.313752 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7dpt\" (UniqueName: \"kubernetes.io/projected/1131b919-ad3a-4a36-a38b-de089ae44458-kube-api-access-d7dpt\") pod \"redhat-marketplace-6nvlh\" (UID: \"1131b919-ad3a-4a36-a38b-de089ae44458\") " pod="openshift-marketplace/redhat-marketplace-6nvlh"
Dec 05 11:54:30 crc kubenswrapper[4763]: I1205 11:54:30.386887 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6nvlh"
Dec 05 11:54:30 crc kubenswrapper[4763]: I1205 11:54:30.397116 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91b8e8ef-dab2-4d38-aeaa-7659945ef17e-utilities\") pod \"redhat-operators-qr5ln\" (UID: \"91b8e8ef-dab2-4d38-aeaa-7659945ef17e\") " pod="openshift-marketplace/redhat-operators-qr5ln"
Dec 05 11:54:30 crc kubenswrapper[4763]: I1205 11:54:30.397248 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zrm4\" (UniqueName: \"kubernetes.io/projected/91b8e8ef-dab2-4d38-aeaa-7659945ef17e-kube-api-access-9zrm4\") pod \"redhat-operators-qr5ln\" (UID: \"91b8e8ef-dab2-4d38-aeaa-7659945ef17e\") " pod="openshift-marketplace/redhat-operators-qr5ln"
Dec 05 11:54:30 crc kubenswrapper[4763]: I1205 11:54:30.397366 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91b8e8ef-dab2-4d38-aeaa-7659945ef17e-catalog-content\") pod \"redhat-operators-qr5ln\" (UID: \"91b8e8ef-dab2-4d38-aeaa-7659945ef17e\") " pod="openshift-marketplace/redhat-operators-qr5ln"
Dec 05 11:54:30 crc kubenswrapper[4763]: I1205 11:54:30.498300 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91b8e8ef-dab2-4d38-aeaa-7659945ef17e-catalog-content\") pod \"redhat-operators-qr5ln\" (UID: \"91b8e8ef-dab2-4d38-aeaa-7659945ef17e\") " pod="openshift-marketplace/redhat-operators-qr5ln"
Dec 05 11:54:30 crc kubenswrapper[4763]: I1205 11:54:30.498836 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91b8e8ef-dab2-4d38-aeaa-7659945ef17e-utilities\") pod \"redhat-operators-qr5ln\" (UID: \"91b8e8ef-dab2-4d38-aeaa-7659945ef17e\") " pod="openshift-marketplace/redhat-operators-qr5ln"
Dec 05 11:54:30 crc kubenswrapper[4763]: I1205 11:54:30.498869 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zrm4\" (UniqueName: \"kubernetes.io/projected/91b8e8ef-dab2-4d38-aeaa-7659945ef17e-kube-api-access-9zrm4\") pod \"redhat-operators-qr5ln\" (UID: \"91b8e8ef-dab2-4d38-aeaa-7659945ef17e\") " pod="openshift-marketplace/redhat-operators-qr5ln"
Dec 05 11:54:30 crc kubenswrapper[4763]: I1205 11:54:30.499234 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91b8e8ef-dab2-4d38-aeaa-7659945ef17e-catalog-content\") pod \"redhat-operators-qr5ln\" (UID: \"91b8e8ef-dab2-4d38-aeaa-7659945ef17e\") " pod="openshift-marketplace/redhat-operators-qr5ln"
Dec 05 11:54:30 crc kubenswrapper[4763]: I1205 11:54:30.499287 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91b8e8ef-dab2-4d38-aeaa-7659945ef17e-utilities\") pod \"redhat-operators-qr5ln\" (UID: \"91b8e8ef-dab2-4d38-aeaa-7659945ef17e\") " pod="openshift-marketplace/redhat-operators-qr5ln"
Dec 05 11:54:30 crc kubenswrapper[4763]: I1205 11:54:30.522991 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zrm4\" (UniqueName: \"kubernetes.io/projected/91b8e8ef-dab2-4d38-aeaa-7659945ef17e-kube-api-access-9zrm4\") pod \"redhat-operators-qr5ln\" (UID: \"91b8e8ef-dab2-4d38-aeaa-7659945ef17e\") " pod="openshift-marketplace/redhat-operators-qr5ln"
Dec 05 11:54:30 crc kubenswrapper[4763]: I1205 11:54:30.591236 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qr5ln"
Dec 05 11:54:30 crc kubenswrapper[4763]: I1205 11:54:30.793330 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6nvlh"]
Dec 05 11:54:30 crc kubenswrapper[4763]: W1205 11:54:30.798916 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1131b919_ad3a_4a36_a38b_de089ae44458.slice/crio-944a9ad3db12734cbb850b409d71fd32e893065290e8311c220fc3f62b5df376 WatchSource:0}: Error finding container 944a9ad3db12734cbb850b409d71fd32e893065290e8311c220fc3f62b5df376: Status 404 returned error can't find the container with id 944a9ad3db12734cbb850b409d71fd32e893065290e8311c220fc3f62b5df376
Dec 05 11:54:31 crc kubenswrapper[4763]: I1205 11:54:31.003889 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qr5ln"]
Dec 05 11:54:31 crc kubenswrapper[4763]: W1205 11:54:31.008158 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91b8e8ef_dab2_4d38_aeaa_7659945ef17e.slice/crio-7489cda0a8d3aaea02c87f702271d73bc094ab7ea66a1a0dd3ff418b2ec920a1 WatchSource:0}: Error finding container 7489cda0a8d3aaea02c87f702271d73bc094ab7ea66a1a0dd3ff418b2ec920a1: Status 404 returned error can't find the container with id 7489cda0a8d3aaea02c87f702271d73bc094ab7ea66a1a0dd3ff418b2ec920a1
Dec 05 11:54:31 crc kubenswrapper[4763]: I1205 11:54:31.042498 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-qpwch"]
Dec 05 11:54:31 crc kubenswrapper[4763]: I1205 11:54:31.043652 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-qpwch"
Dec 05 11:54:31 crc kubenswrapper[4763]: I1205 11:54:31.057209 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-qpwch"]
Dec 05 11:54:31 crc kubenswrapper[4763]: I1205 11:54:31.105178 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4334d073-fdfd-43ba-a9a8-fafcf47f3c74-bound-sa-token\") pod \"image-registry-66df7c8f76-qpwch\" (UID: \"4334d073-fdfd-43ba-a9a8-fafcf47f3c74\") " pod="openshift-image-registry/image-registry-66df7c8f76-qpwch"
Dec 05 11:54:31 crc kubenswrapper[4763]: I1205 11:54:31.105234 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l2cv\" (UniqueName: \"kubernetes.io/projected/4334d073-fdfd-43ba-a9a8-fafcf47f3c74-kube-api-access-4l2cv\") pod \"image-registry-66df7c8f76-qpwch\" (UID: \"4334d073-fdfd-43ba-a9a8-fafcf47f3c74\") " pod="openshift-image-registry/image-registry-66df7c8f76-qpwch"
Dec 05 11:54:31 crc kubenswrapper[4763]: I1205 11:54:31.105283 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4334d073-fdfd-43ba-a9a8-fafcf47f3c74-trusted-ca\") pod \"image-registry-66df7c8f76-qpwch\" (UID: \"4334d073-fdfd-43ba-a9a8-fafcf47f3c74\") " pod="openshift-image-registry/image-registry-66df7c8f76-qpwch"
Dec 05 11:54:31 crc kubenswrapper[4763]: I1205 11:54:31.105329 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4334d073-fdfd-43ba-a9a8-fafcf47f3c74-registry-certificates\") pod \"image-registry-66df7c8f76-qpwch\" (UID: \"4334d073-fdfd-43ba-a9a8-fafcf47f3c74\") " pod="openshift-image-registry/image-registry-66df7c8f76-qpwch"
Dec 05 11:54:31 crc kubenswrapper[4763]: I1205 11:54:31.105378 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-qpwch\" (UID: \"4334d073-fdfd-43ba-a9a8-fafcf47f3c74\") " pod="openshift-image-registry/image-registry-66df7c8f76-qpwch"
Dec 05 11:54:31 crc kubenswrapper[4763]: I1205 11:54:31.105433 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4334d073-fdfd-43ba-a9a8-fafcf47f3c74-ca-trust-extracted\") pod \"image-registry-66df7c8f76-qpwch\" (UID: \"4334d073-fdfd-43ba-a9a8-fafcf47f3c74\") " pod="openshift-image-registry/image-registry-66df7c8f76-qpwch"
Dec 05 11:54:31 crc kubenswrapper[4763]: I1205 11:54:31.105456 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4334d073-fdfd-43ba-a9a8-fafcf47f3c74-installation-pull-secrets\") pod \"image-registry-66df7c8f76-qpwch\" (UID: \"4334d073-fdfd-43ba-a9a8-fafcf47f3c74\") " pod="openshift-image-registry/image-registry-66df7c8f76-qpwch"
Dec 05 11:54:31 crc kubenswrapper[4763]: I1205 11:54:31.105492 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4334d073-fdfd-43ba-a9a8-fafcf47f3c74-registry-tls\") pod \"image-registry-66df7c8f76-qpwch\" (UID: \"4334d073-fdfd-43ba-a9a8-fafcf47f3c74\") " pod="openshift-image-registry/image-registry-66df7c8f76-qpwch"
Dec 05 11:54:31 crc kubenswrapper[4763]: I1205 11:54:31.143203 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-qpwch\" (UID: \"4334d073-fdfd-43ba-a9a8-fafcf47f3c74\") " pod="openshift-image-registry/image-registry-66df7c8f76-qpwch"
Dec 05 11:54:31 crc kubenswrapper[4763]: I1205 11:54:31.206375 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4334d073-fdfd-43ba-a9a8-fafcf47f3c74-registry-certificates\") pod \"image-registry-66df7c8f76-qpwch\" (UID: \"4334d073-fdfd-43ba-a9a8-fafcf47f3c74\") " pod="openshift-image-registry/image-registry-66df7c8f76-qpwch"
Dec 05 11:54:31 crc kubenswrapper[4763]: I1205 11:54:31.206434 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4334d073-fdfd-43ba-a9a8-fafcf47f3c74-ca-trust-extracted\") pod \"image-registry-66df7c8f76-qpwch\" (UID: \"4334d073-fdfd-43ba-a9a8-fafcf47f3c74\") " pod="openshift-image-registry/image-registry-66df7c8f76-qpwch"
Dec 05 11:54:31 crc kubenswrapper[4763]: I1205 11:54:31.206456 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4334d073-fdfd-43ba-a9a8-fafcf47f3c74-installation-pull-secrets\") pod \"image-registry-66df7c8f76-qpwch\" (UID: \"4334d073-fdfd-43ba-a9a8-fafcf47f3c74\") " pod="openshift-image-registry/image-registry-66df7c8f76-qpwch"
Dec 05 11:54:31 crc kubenswrapper[4763]: I1205 11:54:31.206487 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4334d073-fdfd-43ba-a9a8-fafcf47f3c74-registry-tls\") pod \"image-registry-66df7c8f76-qpwch\" (UID: \"4334d073-fdfd-43ba-a9a8-fafcf47f3c74\") " pod="openshift-image-registry/image-registry-66df7c8f76-qpwch"
Dec 05 11:54:31 crc kubenswrapper[4763]: I1205 11:54:31.206519 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4334d073-fdfd-43ba-a9a8-fafcf47f3c74-bound-sa-token\") pod \"image-registry-66df7c8f76-qpwch\" (UID: \"4334d073-fdfd-43ba-a9a8-fafcf47f3c74\") " pod="openshift-image-registry/image-registry-66df7c8f76-qpwch"
Dec 05 11:54:31 crc kubenswrapper[4763]: I1205 11:54:31.206534 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l2cv\" (UniqueName: \"kubernetes.io/projected/4334d073-fdfd-43ba-a9a8-fafcf47f3c74-kube-api-access-4l2cv\") pod \"image-registry-66df7c8f76-qpwch\" (UID: \"4334d073-fdfd-43ba-a9a8-fafcf47f3c74\") " pod="openshift-image-registry/image-registry-66df7c8f76-qpwch"
Dec 05 11:54:31 crc kubenswrapper[4763]: I1205 11:54:31.206558 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4334d073-fdfd-43ba-a9a8-fafcf47f3c74-trusted-ca\") pod \"image-registry-66df7c8f76-qpwch\" (UID: \"4334d073-fdfd-43ba-a9a8-fafcf47f3c74\") " pod="openshift-image-registry/image-registry-66df7c8f76-qpwch"
Dec 05 11:54:31 crc kubenswrapper[4763]: I1205 11:54:31.208265 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4334d073-fdfd-43ba-a9a8-fafcf47f3c74-trusted-ca\") pod \"image-registry-66df7c8f76-qpwch\" (UID: \"4334d073-fdfd-43ba-a9a8-fafcf47f3c74\") " pod="openshift-image-registry/image-registry-66df7c8f76-qpwch"
Dec 05 11:54:31 crc kubenswrapper[4763]: I1205 11:54:31.209623 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4334d073-fdfd-43ba-a9a8-fafcf47f3c74-registry-certificates\") pod \"image-registry-66df7c8f76-qpwch\" (UID: \"4334d073-fdfd-43ba-a9a8-fafcf47f3c74\") " pod="openshift-image-registry/image-registry-66df7c8f76-qpwch"
Dec 05 11:54:31 crc kubenswrapper[4763]: I1205 11:54:31.209621 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4334d073-fdfd-43ba-a9a8-fafcf47f3c74-ca-trust-extracted\") pod \"image-registry-66df7c8f76-qpwch\" (UID: \"4334d073-fdfd-43ba-a9a8-fafcf47f3c74\") " pod="openshift-image-registry/image-registry-66df7c8f76-qpwch"
Dec 05 11:54:31 crc kubenswrapper[4763]: I1205 11:54:31.214713 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4334d073-fdfd-43ba-a9a8-fafcf47f3c74-registry-tls\") pod \"image-registry-66df7c8f76-qpwch\" (UID: \"4334d073-fdfd-43ba-a9a8-fafcf47f3c74\") " pod="openshift-image-registry/image-registry-66df7c8f76-qpwch"
Dec 05 11:54:31 crc kubenswrapper[4763]: I1205 11:54:31.222416 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4334d073-fdfd-43ba-a9a8-fafcf47f3c74-installation-pull-secrets\") pod \"image-registry-66df7c8f76-qpwch\" (UID: \"4334d073-fdfd-43ba-a9a8-fafcf47f3c74\") " pod="openshift-image-registry/image-registry-66df7c8f76-qpwch"
Dec 05 11:54:31 crc kubenswrapper[4763]: I1205 11:54:31.224798 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4334d073-fdfd-43ba-a9a8-fafcf47f3c74-bound-sa-token\") pod \"image-registry-66df7c8f76-qpwch\" (UID: \"4334d073-fdfd-43ba-a9a8-fafcf47f3c74\") " pod="openshift-image-registry/image-registry-66df7c8f76-qpwch"
Dec 05 11:54:31 crc kubenswrapper[4763]: I1205 11:54:31.227185 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l2cv\" (UniqueName: \"kubernetes.io/projected/4334d073-fdfd-43ba-a9a8-fafcf47f3c74-kube-api-access-4l2cv\") pod \"image-registry-66df7c8f76-qpwch\" (UID: \"4334d073-fdfd-43ba-a9a8-fafcf47f3c74\") " pod="openshift-image-registry/image-registry-66df7c8f76-qpwch"
Dec 05 11:54:31 crc kubenswrapper[4763]: I1205 11:54:31.250850 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qr5ln" event={"ID":"91b8e8ef-dab2-4d38-aeaa-7659945ef17e","Type":"ContainerStarted","Data":"7489cda0a8d3aaea02c87f702271d73bc094ab7ea66a1a0dd3ff418b2ec920a1"}
Dec 05 11:54:31 crc kubenswrapper[4763]: I1205 11:54:31.253318 4763 generic.go:334] "Generic (PLEG): container finished" podID="9b919307-32f9-4abf-807f-86ef3b67ff55" containerID="5d7f7493f6fae7705f7f41e7f0572ed06763e0c1bf36cbbacb3dd32d774c7c37" exitCode=0
Dec 05 11:54:31 crc kubenswrapper[4763]: I1205 11:54:31.253388 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n95pv" event={"ID":"9b919307-32f9-4abf-807f-86ef3b67ff55","Type":"ContainerDied","Data":"5d7f7493f6fae7705f7f41e7f0572ed06763e0c1bf36cbbacb3dd32d774c7c37"}
Dec 05 11:54:31 crc kubenswrapper[4763]: I1205 11:54:31.254674 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6nvlh" event={"ID":"1131b919-ad3a-4a36-a38b-de089ae44458","Type":"ContainerStarted","Data":"944a9ad3db12734cbb850b409d71fd32e893065290e8311c220fc3f62b5df376"}
Dec 05 11:54:31 crc kubenswrapper[4763]: I1205 11:54:31.386786 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-qpwch"
Dec 05 11:54:31 crc kubenswrapper[4763]: I1205 11:54:31.764858 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-qpwch"]
Dec 05 11:54:32 crc kubenswrapper[4763]: I1205 11:54:32.262435 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9dpb" event={"ID":"cdc48b79-aeea-4cb0-a97f-4d265bb401f6","Type":"ContainerStarted","Data":"5f37c65f58c4f87170a6a88ea7dcbf99925345e47f048112f93fbd3d95f79450"}
Dec 05 11:54:32 crc kubenswrapper[4763]: I1205 11:54:32.264482 4763 generic.go:334] "Generic (PLEG): container finished" podID="1131b919-ad3a-4a36-a38b-de089ae44458" containerID="c9ce413f6cc1cbf29d91ab09f67992d367ac9cd9afa609ba4cfdd09f5ed8077d" exitCode=0
Dec 05 11:54:32 crc kubenswrapper[4763]: I1205 11:54:32.264722 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6nvlh" event={"ID":"1131b919-ad3a-4a36-a38b-de089ae44458","Type":"ContainerDied","Data":"c9ce413f6cc1cbf29d91ab09f67992d367ac9cd9afa609ba4cfdd09f5ed8077d"}
Dec 05 11:54:32 crc kubenswrapper[4763]: I1205 11:54:32.266517 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-qpwch" event={"ID":"4334d073-fdfd-43ba-a9a8-fafcf47f3c74","Type":"ContainerStarted","Data":"abf41494ed5289a6ab50dd5825c0c02bf0e8c4b2f8c688ba0b040d75041d88f3"}
Dec 05 11:54:32 crc kubenswrapper[4763]: I1205 11:54:32.266680 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-qpwch" event={"ID":"4334d073-fdfd-43ba-a9a8-fafcf47f3c74","Type":"ContainerStarted","Data":"d4634657f2c25e20f20d1ca09a0796ec69c281064b162e0c56fe1e89d256bed2"}
Dec 05 11:54:32 crc kubenswrapper[4763]: I1205 11:54:32.267384 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-qpwch"
Dec 05 11:54:32 crc kubenswrapper[4763]: I1205 11:54:32.269723 4763 generic.go:334] "Generic (PLEG): container finished" podID="91b8e8ef-dab2-4d38-aeaa-7659945ef17e" containerID="9ae1dd59be4a1b5e5fead0f3f22d7209824e6f63298ad6ca68a29422d15a6635" exitCode=0
Dec 05 11:54:32 crc kubenswrapper[4763]: I1205 11:54:32.269776 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qr5ln" event={"ID":"91b8e8ef-dab2-4d38-aeaa-7659945ef17e","Type":"ContainerDied","Data":"9ae1dd59be4a1b5e5fead0f3f22d7209824e6f63298ad6ca68a29422d15a6635"}
Dec 05 11:54:32 crc kubenswrapper[4763]: I1205 11:54:32.282122 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m9dpb" podStartSLOduration=2.505025256 podStartE2EDuration="5.282103307s" podCreationTimestamp="2025-12-05 11:54:27 +0000 UTC" firstStartedPulling="2025-12-05 11:54:29.231118627 +0000 UTC m=+353.723833340" lastFinishedPulling="2025-12-05 11:54:32.008196668 +0000 UTC m=+356.500911391" observedRunningTime="2025-12-05 11:54:32.280656847 +0000 UTC m=+356.773371570" watchObservedRunningTime="2025-12-05 11:54:32.282103307 +0000 UTC m=+356.774818030"
Dec 05 11:54:32 crc kubenswrapper[4763]: I1205 11:54:32.299353 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-qpwch" podStartSLOduration=1.299333226 podStartE2EDuration="1.299333226s" podCreationTimestamp="2025-12-05 11:54:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:54:32.296147779 +0000 UTC m=+356.788862532" watchObservedRunningTime="2025-12-05 11:54:32.299333226 +0000 UTC m=+356.792047969"
Dec 05 11:54:33 crc kubenswrapper[4763]: I1205 11:54:33.284380 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n95pv" event={"ID":"9b919307-32f9-4abf-807f-86ef3b67ff55","Type":"ContainerStarted","Data":"910b09100644c658f7f64869620fc6901731e41a27fd78c57c6b0294f4ca5519"}
Dec 05 11:54:33 crc kubenswrapper[4763]: I1205 11:54:33.313951 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-n95pv" podStartSLOduration=3.381009562 podStartE2EDuration="6.313933326s" podCreationTimestamp="2025-12-05 11:54:27 +0000 UTC" firstStartedPulling="2025-12-05 11:54:29.232202089 +0000 UTC m=+353.724916842" lastFinishedPulling="2025-12-05 11:54:32.165125883 +0000 UTC m=+356.657840606" observedRunningTime="2025-12-05 11:54:33.309385952 +0000 UTC m=+357.802100665" watchObservedRunningTime="2025-12-05 11:54:33.313933326 +0000 UTC m=+357.806648049"
Dec 05 11:54:35 crc kubenswrapper[4763]: I1205 11:54:35.298944 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qr5ln" event={"ID":"91b8e8ef-dab2-4d38-aeaa-7659945ef17e","Type":"ContainerStarted","Data":"218e9bdb10eec64a9fbc32ca6cd1d0126216dc985b3fac001c40e1cc1167f7b9"}
Dec 05 11:54:35 crc kubenswrapper[4763]: I1205 11:54:35.302677 4763 generic.go:334] "Generic (PLEG): container finished" podID="1131b919-ad3a-4a36-a38b-de089ae44458" containerID="951c217756e8f1132c6a152d86dca31d7f5c8723ddd428fa54e33125432486b8" exitCode=0
Dec 05 11:54:35 crc kubenswrapper[4763]: I1205 11:54:35.302719 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6nvlh" event={"ID":"1131b919-ad3a-4a36-a38b-de089ae44458","Type":"ContainerDied","Data":"951c217756e8f1132c6a152d86dca31d7f5c8723ddd428fa54e33125432486b8"}
Dec 05 11:54:36 crc kubenswrapper[4763]: I1205 11:54:36.309797 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6nvlh" event={"ID":"1131b919-ad3a-4a36-a38b-de089ae44458","Type":"ContainerStarted","Data":"4fb8481152a018eb57aae9792a162003bd522cccdcdc20eda478d30d807bd2b8"}
Dec 05 11:54:36 crc kubenswrapper[4763]: I1205 11:54:36.312843 4763 generic.go:334] "Generic (PLEG): container finished" podID="91b8e8ef-dab2-4d38-aeaa-7659945ef17e" containerID="218e9bdb10eec64a9fbc32ca6cd1d0126216dc985b3fac001c40e1cc1167f7b9" exitCode=0
Dec 05 11:54:36 crc kubenswrapper[4763]: I1205 11:54:36.312890 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qr5ln" event={"ID":"91b8e8ef-dab2-4d38-aeaa-7659945ef17e","Type":"ContainerDied","Data":"218e9bdb10eec64a9fbc32ca6cd1d0126216dc985b3fac001c40e1cc1167f7b9"}
Dec 05 11:54:36 crc kubenswrapper[4763]: I1205 11:54:36.329260 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6nvlh" podStartSLOduration=2.57492839 podStartE2EDuration="6.329244862s" podCreationTimestamp="2025-12-05 11:54:30 +0000 UTC" firstStartedPulling="2025-12-05 11:54:32.265859929 +0000 UTC m=+356.758574652" lastFinishedPulling="2025-12-05 11:54:36.020176391 +0000 UTC m=+360.512891124" observedRunningTime="2025-12-05 11:54:36.327903135 +0000 UTC m=+360.820617868" watchObservedRunningTime="2025-12-05 11:54:36.329244862 +0000 UTC m=+360.821959585"
Dec 05 11:54:37 crc kubenswrapper[4763]: I1205 11:54:37.544487 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 11:54:37 crc kubenswrapper[4763]: I1205 11:54:37.544788 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 11:54:38 crc kubenswrapper[4763]: I1205 11:54:38.255824 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-n95pv"
Dec 05 11:54:38 crc kubenswrapper[4763]: I1205 11:54:38.255896 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-n95pv"
Dec 05 11:54:38 crc kubenswrapper[4763]: I1205 11:54:38.263683 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m9dpb"
Dec 05 11:54:38 crc kubenswrapper[4763]: I1205 11:54:38.263737 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m9dpb"
Dec 05 11:54:38 crc kubenswrapper[4763]: I1205 11:54:38.293338 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-n95pv"
Dec 05 11:54:38 crc kubenswrapper[4763]: I1205 11:54:38.304300 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m9dpb"
Dec 05 11:54:38 crc kubenswrapper[4763]: I1205 11:54:38.325397 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qr5ln" event={"ID":"91b8e8ef-dab2-4d38-aeaa-7659945ef17e","Type":"ContainerStarted","Data":"ed0e10085e184c8cc6bb74e0f3d7ac27402f17beeb8e0ef37166b33d3535dd49"}
Dec 05 11:54:38 crc kubenswrapper[4763]: I1205 11:54:38.369093 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-n95pv"
Dec 05 11:54:38 crc kubenswrapper[4763]: I1205 11:54:38.371301 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m9dpb"
Dec 05 11:54:38 crc kubenswrapper[4763]: I1205 11:54:38.385235 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qr5ln" podStartSLOduration=3.371268359 podStartE2EDuration="8.385218001s" podCreationTimestamp="2025-12-05 11:54:30 +0000 UTC" firstStartedPulling="2025-12-05 11:54:32.272292553 +0000 UTC m=+356.765007276" lastFinishedPulling="2025-12-05 11:54:37.286242205 +0000 UTC m=+361.778956918" observedRunningTime="2025-12-05 11:54:38.351955749 +0000 UTC m=+362.844670472" watchObservedRunningTime="2025-12-05 11:54:38.385218001 +0000 UTC m=+362.877932724"
Dec 05 11:54:40 crc kubenswrapper[4763]: I1205 11:54:40.387076 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6nvlh"
Dec 05 11:54:40 crc kubenswrapper[4763]: I1205 11:54:40.387961 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6nvlh"
Dec 05 11:54:40 crc kubenswrapper[4763]: I1205 11:54:40.428029 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6nvlh"
Dec 05 11:54:40 crc kubenswrapper[4763]: I1205 11:54:40.591543 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qr5ln"
Dec 05 11:54:40 crc kubenswrapper[4763]: I1205 11:54:40.591924 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qr5ln"
Dec 05 11:54:41 crc kubenswrapper[4763]: I1205 11:54:41.427966 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6nvlh"
Dec 05 11:54:41 crc kubenswrapper[4763]: I1205 11:54:41.627797 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qr5ln" podUID="91b8e8ef-dab2-4d38-aeaa-7659945ef17e" containerName="registry-server" probeResult="failure" output=<
Dec 05 11:54:41 crc kubenswrapper[4763]: timeout: failed to connect service ":50051" within 1s
Dec 05 11:54:41 crc kubenswrapper[4763]: >
Dec 05 11:54:50 crc kubenswrapper[4763]: I1205 11:54:50.627017 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qr5ln"
Dec 05 11:54:50 crc kubenswrapper[4763]: I1205 11:54:50.687488 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qr5ln"
Dec 05 11:54:51 crc kubenswrapper[4763]: I1205 11:54:51.326754 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-95b665d9-4fsv4"]
Dec 05 11:54:51 crc kubenswrapper[4763]: I1205 11:54:51.327420 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-95b665d9-4fsv4" podUID="4b137ba1-1e02-4a02-aba9-aa87d720f889" containerName="controller-manager" containerID="cri-o://6a3beb63abe99da83b84bd94c0ad349a8ee18c00644a656720787dba6a8d86d4" gracePeriod=30
Dec 05 11:54:51 crc kubenswrapper[4763]: I1205 11:54:51.394066 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-qpwch"
Dec 05 11:54:51 crc kubenswrapper[4763]: I1205 11:54:51.470533 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dpf4b"]
Dec 05 11:54:53 crc kubenswrapper[4763]: I1205 11:54:53.399354 4763 generic.go:334] "Generic (PLEG): container finished" podID="4b137ba1-1e02-4a02-aba9-aa87d720f889" containerID="6a3beb63abe99da83b84bd94c0ad349a8ee18c00644a656720787dba6a8d86d4" exitCode=0
Dec 05 11:54:53 crc kubenswrapper[4763]: I1205 11:54:53.399400 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-95b665d9-4fsv4" event={"ID":"4b137ba1-1e02-4a02-aba9-aa87d720f889","Type":"ContainerDied","Data":"6a3beb63abe99da83b84bd94c0ad349a8ee18c00644a656720787dba6a8d86d4"}
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.182056 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-95b665d9-4fsv4"
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.210847 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-864475d5f5-c24r8"]
Dec 05 11:54:54 crc kubenswrapper[4763]: E1205 11:54:54.211087 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b137ba1-1e02-4a02-aba9-aa87d720f889" containerName="controller-manager"
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.211099 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b137ba1-1e02-4a02-aba9-aa87d720f889" containerName="controller-manager"
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.211245 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b137ba1-1e02-4a02-aba9-aa87d720f889" containerName="controller-manager"
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.211725 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-864475d5f5-c24r8"
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.223579 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-864475d5f5-c24r8"]
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.342115 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b137ba1-1e02-4a02-aba9-aa87d720f889-client-ca\") pod \"4b137ba1-1e02-4a02-aba9-aa87d720f889\" (UID: \"4b137ba1-1e02-4a02-aba9-aa87d720f889\") "
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.342156 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b137ba1-1e02-4a02-aba9-aa87d720f889-serving-cert\") pod \"4b137ba1-1e02-4a02-aba9-aa87d720f889\" (UID: \"4b137ba1-1e02-4a02-aba9-aa87d720f889\") "
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.342209 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cn6dr\" (UniqueName: \"kubernetes.io/projected/4b137ba1-1e02-4a02-aba9-aa87d720f889-kube-api-access-cn6dr\") pod \"4b137ba1-1e02-4a02-aba9-aa87d720f889\" (UID: \"4b137ba1-1e02-4a02-aba9-aa87d720f889\") "
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.342229 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b137ba1-1e02-4a02-aba9-aa87d720f889-proxy-ca-bundles\") pod \"4b137ba1-1e02-4a02-aba9-aa87d720f889\" (UID: \"4b137ba1-1e02-4a02-aba9-aa87d720f889\") "
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.342275 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b137ba1-1e02-4a02-aba9-aa87d720f889-config\") pod \"4b137ba1-1e02-4a02-aba9-aa87d720f889\" (UID: \"4b137ba1-1e02-4a02-aba9-aa87d720f889\") "
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.342429 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/564686c3-8101-46de-b7ed-6a3991e671c8-config\") pod \"controller-manager-864475d5f5-c24r8\" (UID: \"564686c3-8101-46de-b7ed-6a3991e671c8\") " pod="openshift-controller-manager/controller-manager-864475d5f5-c24r8"
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.342477 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/564686c3-8101-46de-b7ed-6a3991e671c8-serving-cert\") pod \"controller-manager-864475d5f5-c24r8\" (UID: \"564686c3-8101-46de-b7ed-6a3991e671c8\") " pod="openshift-controller-manager/controller-manager-864475d5f5-c24r8"
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.342500 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/564686c3-8101-46de-b7ed-6a3991e671c8-client-ca\") pod \"controller-manager-864475d5f5-c24r8\" (UID: \"564686c3-8101-46de-b7ed-6a3991e671c8\") " pod="openshift-controller-manager/controller-manager-864475d5f5-c24r8"
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.342531 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/564686c3-8101-46de-b7ed-6a3991e671c8-proxy-ca-bundles\") pod \"controller-manager-864475d5f5-c24r8\" (UID: \"564686c3-8101-46de-b7ed-6a3991e671c8\") " pod="openshift-controller-manager/controller-manager-864475d5f5-c24r8"
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.342558 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw54m\" (UniqueName: \"kubernetes.io/projected/564686c3-8101-46de-b7ed-6a3991e671c8-kube-api-access-zw54m\") pod \"controller-manager-864475d5f5-c24r8\" (UID: \"564686c3-8101-46de-b7ed-6a3991e671c8\") " pod="openshift-controller-manager/controller-manager-864475d5f5-c24r8"
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.343397 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b137ba1-1e02-4a02-aba9-aa87d720f889-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4b137ba1-1e02-4a02-aba9-aa87d720f889" (UID: "4b137ba1-1e02-4a02-aba9-aa87d720f889"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.343410 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b137ba1-1e02-4a02-aba9-aa87d720f889-client-ca" (OuterVolumeSpecName: "client-ca") pod "4b137ba1-1e02-4a02-aba9-aa87d720f889" (UID: "4b137ba1-1e02-4a02-aba9-aa87d720f889"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.343416 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b137ba1-1e02-4a02-aba9-aa87d720f889-config" (OuterVolumeSpecName: "config") pod "4b137ba1-1e02-4a02-aba9-aa87d720f889" (UID: "4b137ba1-1e02-4a02-aba9-aa87d720f889"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.348526 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b137ba1-1e02-4a02-aba9-aa87d720f889-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4b137ba1-1e02-4a02-aba9-aa87d720f889" (UID: "4b137ba1-1e02-4a02-aba9-aa87d720f889"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.348653 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b137ba1-1e02-4a02-aba9-aa87d720f889-kube-api-access-cn6dr" (OuterVolumeSpecName: "kube-api-access-cn6dr") pod "4b137ba1-1e02-4a02-aba9-aa87d720f889" (UID: "4b137ba1-1e02-4a02-aba9-aa87d720f889"). InnerVolumeSpecName "kube-api-access-cn6dr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.405576 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-95b665d9-4fsv4" event={"ID":"4b137ba1-1e02-4a02-aba9-aa87d720f889","Type":"ContainerDied","Data":"9930960c456483e679eb61104fc9f67385bc4f6f4138688d60b263fd24b6dae5"}
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.405632 4763 scope.go:117] "RemoveContainer" containerID="6a3beb63abe99da83b84bd94c0ad349a8ee18c00644a656720787dba6a8d86d4"
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.405631 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-95b665d9-4fsv4"
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.437535 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-95b665d9-4fsv4"]
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.442129 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-95b665d9-4fsv4"]
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.444125 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/564686c3-8101-46de-b7ed-6a3991e671c8-serving-cert\") pod \"controller-manager-864475d5f5-c24r8\" (UID: \"564686c3-8101-46de-b7ed-6a3991e671c8\") " pod="openshift-controller-manager/controller-manager-864475d5f5-c24r8"
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.444177 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/564686c3-8101-46de-b7ed-6a3991e671c8-client-ca\") pod \"controller-manager-864475d5f5-c24r8\" (UID: \"564686c3-8101-46de-b7ed-6a3991e671c8\") " pod="openshift-controller-manager/controller-manager-864475d5f5-c24r8"
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.444218 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/564686c3-8101-46de-b7ed-6a3991e671c8-proxy-ca-bundles\") pod \"controller-manager-864475d5f5-c24r8\" (UID: \"564686c3-8101-46de-b7ed-6a3991e671c8\") " pod="openshift-controller-manager/controller-manager-864475d5f5-c24r8"
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.444245 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw54m\" (UniqueName: \"kubernetes.io/projected/564686c3-8101-46de-b7ed-6a3991e671c8-kube-api-access-zw54m\") pod \"controller-manager-864475d5f5-c24r8\" (UID: \"564686c3-8101-46de-b7ed-6a3991e671c8\") " pod="openshift-controller-manager/controller-manager-864475d5f5-c24r8"
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.444274 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/564686c3-8101-46de-b7ed-6a3991e671c8-config\") pod \"controller-manager-864475d5f5-c24r8\" (UID: \"564686c3-8101-46de-b7ed-6a3991e671c8\") " pod="openshift-controller-manager/controller-manager-864475d5f5-c24r8"
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.444307 4763 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b137ba1-1e02-4a02-aba9-aa87d720f889-client-ca\") on node \"crc\" DevicePath \"\""
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.444317 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b137ba1-1e02-4a02-aba9-aa87d720f889-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.444327 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cn6dr\" (UniqueName: \"kubernetes.io/projected/4b137ba1-1e02-4a02-aba9-aa87d720f889-kube-api-access-cn6dr\") on node \"crc\" DevicePath \"\""
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.444338 4763 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b137ba1-1e02-4a02-aba9-aa87d720f889-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.444347 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b137ba1-1e02-4a02-aba9-aa87d720f889-config\") on node \"crc\" DevicePath \"\""
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.445224 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/564686c3-8101-46de-b7ed-6a3991e671c8-client-ca\") pod \"controller-manager-864475d5f5-c24r8\" (UID: \"564686c3-8101-46de-b7ed-6a3991e671c8\") " pod="openshift-controller-manager/controller-manager-864475d5f5-c24r8"
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.445630 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/564686c3-8101-46de-b7ed-6a3991e671c8-proxy-ca-bundles\") pod \"controller-manager-864475d5f5-c24r8\" (UID: \"564686c3-8101-46de-b7ed-6a3991e671c8\") " pod="openshift-controller-manager/controller-manager-864475d5f5-c24r8"
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.445699 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/564686c3-8101-46de-b7ed-6a3991e671c8-config\") pod \"controller-manager-864475d5f5-c24r8\" (UID: \"564686c3-8101-46de-b7ed-6a3991e671c8\") " pod="openshift-controller-manager/controller-manager-864475d5f5-c24r8"
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.448849 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/564686c3-8101-46de-b7ed-6a3991e671c8-serving-cert\") pod \"controller-manager-864475d5f5-c24r8\" (UID: \"564686c3-8101-46de-b7ed-6a3991e671c8\") " pod="openshift-controller-manager/controller-manager-864475d5f5-c24r8"
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.470605 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw54m\" (UniqueName: \"kubernetes.io/projected/564686c3-8101-46de-b7ed-6a3991e671c8-kube-api-access-zw54m\") pod \"controller-manager-864475d5f5-c24r8\" (UID: \"564686c3-8101-46de-b7ed-6a3991e671c8\") " pod="openshift-controller-manager/controller-manager-864475d5f5-c24r8"
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.561506 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-864475d5f5-c24r8"
Dec 05 11:54:54 crc kubenswrapper[4763]: I1205 11:54:54.952535 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-864475d5f5-c24r8"]
Dec 05 11:54:54 crc kubenswrapper[4763]: W1205 11:54:54.958070 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod564686c3_8101_46de_b7ed_6a3991e671c8.slice/crio-0361110148c6c50041734e998e7e6e9bf7a26cbcb1c18d8f360b2ebf40f3e07c WatchSource:0}: Error finding container 0361110148c6c50041734e998e7e6e9bf7a26cbcb1c18d8f360b2ebf40f3e07c: Status 404 returned error can't find the container with id 0361110148c6c50041734e998e7e6e9bf7a26cbcb1c18d8f360b2ebf40f3e07c
Dec 05 11:54:55 crc kubenswrapper[4763]: I1205 11:54:55.411813 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-864475d5f5-c24r8" event={"ID":"564686c3-8101-46de-b7ed-6a3991e671c8","Type":"ContainerStarted","Data":"dd051bf705a03ae0641f9d47b19b1978179a1b1b8150c2d8606e56a47f431c33"}
Dec 05 11:54:55 crc kubenswrapper[4763]: I1205 11:54:55.412169 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-864475d5f5-c24r8" event={"ID":"564686c3-8101-46de-b7ed-6a3991e671c8","Type":"ContainerStarted","Data":"0361110148c6c50041734e998e7e6e9bf7a26cbcb1c18d8f360b2ebf40f3e07c"}
Dec 05 11:54:55 crc kubenswrapper[4763]: I1205 11:54:55.412192 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-864475d5f5-c24r8"
Dec 05 11:54:55 crc kubenswrapper[4763]: I1205 11:54:55.416137 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-864475d5f5-c24r8"
Dec 05 11:54:55 crc kubenswrapper[4763]: I1205 11:54:55.432725 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-864475d5f5-c24r8" podStartSLOduration=4.43270563 podStartE2EDuration="4.43270563s" podCreationTimestamp="2025-12-05 11:54:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 11:54:55.430383382 +0000 UTC m=+379.923098105" watchObservedRunningTime="2025-12-05 11:54:55.43270563 +0000 UTC m=+379.925420353"
Dec 05 11:54:55 crc kubenswrapper[4763]: I1205 11:54:55.790066 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b137ba1-1e02-4a02-aba9-aa87d720f889" path="/var/lib/kubelet/pods/4b137ba1-1e02-4a02-aba9-aa87d720f889/volumes"
Dec 05 11:55:07 crc kubenswrapper[4763]: I1205 11:55:07.543851 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 11:55:07 crc kubenswrapper[4763]: I1205 11:55:07.544451 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 11:55:07 crc kubenswrapper[4763]: I1205 11:55:07.544513 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpgln"
Dec 05 11:55:07 crc kubenswrapper[4763]: I1205 11:55:07.545155 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"534307a53a349d3e6f626a6d8dc4de67404cbc863e94b63e58ef318db5a175f6"} pod="openshift-machine-config-operator/machine-config-daemon-xpgln" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 05 11:55:07 crc kubenswrapper[4763]: I1205 11:55:07.545225 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" containerID="cri-o://534307a53a349d3e6f626a6d8dc4de67404cbc863e94b63e58ef318db5a175f6" gracePeriod=600
Dec 05 11:55:08 crc kubenswrapper[4763]: I1205 11:55:08.487157 4763 generic.go:334] "Generic (PLEG): container finished" podID="96338136-6831-49d0-9eb9-77d1205c6afb" containerID="534307a53a349d3e6f626a6d8dc4de67404cbc863e94b63e58ef318db5a175f6" exitCode=0
Dec 05 11:55:08 crc kubenswrapper[4763]: I1205 11:55:08.487252 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerDied","Data":"534307a53a349d3e6f626a6d8dc4de67404cbc863e94b63e58ef318db5a175f6"}
Dec 05 11:55:08 crc kubenswrapper[4763]: I1205 11:55:08.487693 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerStarted","Data":"c000a81bc5dd100f4b8e26b1306b8f7ef21c0b2a127a88c01cbb81f5e387628c"}
Dec 05 11:55:08 crc kubenswrapper[4763]: I1205 11:55:08.487717 4763 scope.go:117] "RemoveContainer" containerID="eb8a3a71da68a51eabe1aaa24fffd1b84c35cb01945642622f4100125ee26223"
Dec 05 11:55:16 crc kubenswrapper[4763]: I1205 11:55:16.506233 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" podUID="aa31a254-af8d-4f9f-b22d-1844d7d60382" containerName="registry" containerID="cri-o://5dbcca1c9e333b585d4c50250c07c4e202e2a44c67176b6df2ac066e40ac8017" gracePeriod=30
Dec 05 11:55:17 crc kubenswrapper[4763]: I1205 11:55:17.110850 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b"
Dec 05 11:55:17 crc kubenswrapper[4763]: I1205 11:55:17.172400 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aa31a254-af8d-4f9f-b22d-1844d7d60382-ca-trust-extracted\") pod \"aa31a254-af8d-4f9f-b22d-1844d7d60382\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") "
Dec 05 11:55:17 crc kubenswrapper[4763]: I1205 11:55:17.172564 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"aa31a254-af8d-4f9f-b22d-1844d7d60382\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") "
Dec 05 11:55:17 crc kubenswrapper[4763]: I1205 11:55:17.172598 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aa31a254-af8d-4f9f-b22d-1844d7d60382-bound-sa-token\") pod \"aa31a254-af8d-4f9f-b22d-1844d7d60382\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") "
Dec 05 11:55:17 crc kubenswrapper[4763]: I1205 11:55:17.172633 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p4jp\" (UniqueName: \"kubernetes.io/projected/aa31a254-af8d-4f9f-b22d-1844d7d60382-kube-api-access-6p4jp\") pod \"aa31a254-af8d-4f9f-b22d-1844d7d60382\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") "
Dec 05 11:55:17 crc kubenswrapper[4763]: I1205 11:55:17.172657 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aa31a254-af8d-4f9f-b22d-1844d7d60382-installation-pull-secrets\") pod \"aa31a254-af8d-4f9f-b22d-1844d7d60382\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") "
Dec 05 11:55:17 crc kubenswrapper[4763]: I1205 11:55:17.172681 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aa31a254-af8d-4f9f-b22d-1844d7d60382-registry-certificates\") pod \"aa31a254-af8d-4f9f-b22d-1844d7d60382\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") "
Dec 05 11:55:17 crc kubenswrapper[4763]: I1205 11:55:17.172710 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aa31a254-af8d-4f9f-b22d-1844d7d60382-trusted-ca\") pod \"aa31a254-af8d-4f9f-b22d-1844d7d60382\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") "
Dec 05 11:55:17 crc kubenswrapper[4763]: I1205 11:55:17.172727 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aa31a254-af8d-4f9f-b22d-1844d7d60382-registry-tls\") pod \"aa31a254-af8d-4f9f-b22d-1844d7d60382\" (UID: \"aa31a254-af8d-4f9f-b22d-1844d7d60382\") "
Dec 05 11:55:17 crc kubenswrapper[4763]: I1205 11:55:17.175442 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa31a254-af8d-4f9f-b22d-1844d7d60382-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "aa31a254-af8d-4f9f-b22d-1844d7d60382" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 11:55:17 crc kubenswrapper[4763]: I1205 11:55:17.176316 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa31a254-af8d-4f9f-b22d-1844d7d60382-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "aa31a254-af8d-4f9f-b22d-1844d7d60382" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 11:55:17 crc kubenswrapper[4763]: I1205 11:55:17.178965 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa31a254-af8d-4f9f-b22d-1844d7d60382-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "aa31a254-af8d-4f9f-b22d-1844d7d60382" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 11:55:17 crc kubenswrapper[4763]: I1205 11:55:17.179652 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa31a254-af8d-4f9f-b22d-1844d7d60382-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "aa31a254-af8d-4f9f-b22d-1844d7d60382" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 11:55:17 crc kubenswrapper[4763]: I1205 11:55:17.180746 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa31a254-af8d-4f9f-b22d-1844d7d60382-kube-api-access-6p4jp" (OuterVolumeSpecName: "kube-api-access-6p4jp") pod "aa31a254-af8d-4f9f-b22d-1844d7d60382" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382"). InnerVolumeSpecName "kube-api-access-6p4jp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 11:55:17 crc kubenswrapper[4763]: I1205 11:55:17.181607 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa31a254-af8d-4f9f-b22d-1844d7d60382-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "aa31a254-af8d-4f9f-b22d-1844d7d60382" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 11:55:17 crc kubenswrapper[4763]: I1205 11:55:17.188424 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "aa31a254-af8d-4f9f-b22d-1844d7d60382" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Dec 05 11:55:17 crc kubenswrapper[4763]: I1205 11:55:17.193180 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa31a254-af8d-4f9f-b22d-1844d7d60382-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "aa31a254-af8d-4f9f-b22d-1844d7d60382" (UID: "aa31a254-af8d-4f9f-b22d-1844d7d60382"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 11:55:17 crc kubenswrapper[4763]: I1205 11:55:17.273906 4763 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aa31a254-af8d-4f9f-b22d-1844d7d60382-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Dec 05 11:55:17 crc kubenswrapper[4763]: I1205 11:55:17.273944 4763 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aa31a254-af8d-4f9f-b22d-1844d7d60382-bound-sa-token\") on node \"crc\" DevicePath \"\""
Dec 05 11:55:17 crc kubenswrapper[4763]: I1205 11:55:17.273953 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6p4jp\" (UniqueName: \"kubernetes.io/projected/aa31a254-af8d-4f9f-b22d-1844d7d60382-kube-api-access-6p4jp\") on node \"crc\" DevicePath \"\""
Dec 05 11:55:17 crc kubenswrapper[4763]: I1205 11:55:17.273965 4763 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aa31a254-af8d-4f9f-b22d-1844d7d60382-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Dec 05 11:55:17 crc kubenswrapper[4763]: I1205 11:55:17.273975 4763 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aa31a254-af8d-4f9f-b22d-1844d7d60382-registry-certificates\") on node \"crc\" DevicePath \"\""
Dec 05 11:55:17 crc kubenswrapper[4763]: I1205 11:55:17.273983 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aa31a254-af8d-4f9f-b22d-1844d7d60382-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 05 11:55:17 crc kubenswrapper[4763]: I1205 11:55:17.273991 4763 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aa31a254-af8d-4f9f-b22d-1844d7d60382-registry-tls\") on node \"crc\" DevicePath \"\""
Dec 05 11:55:17 crc kubenswrapper[4763]: I1205 11:55:17.553571 4763 generic.go:334] "Generic (PLEG): container finished" podID="aa31a254-af8d-4f9f-b22d-1844d7d60382" containerID="5dbcca1c9e333b585d4c50250c07c4e202e2a44c67176b6df2ac066e40ac8017" exitCode=0
Dec 05 11:55:17 crc kubenswrapper[4763]: I1205 11:55:17.553611 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" event={"ID":"aa31a254-af8d-4f9f-b22d-1844d7d60382","Type":"ContainerDied","Data":"5dbcca1c9e333b585d4c50250c07c4e202e2a44c67176b6df2ac066e40ac8017"}
Dec 05 11:55:17 crc kubenswrapper[4763]: I1205 11:55:17.553635 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b" event={"ID":"aa31a254-af8d-4f9f-b22d-1844d7d60382","Type":"ContainerDied","Data":"ca2e55b218df9765469d149a47aa254308347bf936552dab4e3d577132d0d7c9"}
Dec 05 11:55:17 crc kubenswrapper[4763]: I1205 11:55:17.553650 4763 scope.go:117] "RemoveContainer" containerID="5dbcca1c9e333b585d4c50250c07c4e202e2a44c67176b6df2ac066e40ac8017"
Dec 05 11:55:17 crc kubenswrapper[4763]: I1205 11:55:17.553739 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-dpf4b"
Dec 05 11:55:17 crc kubenswrapper[4763]: I1205 11:55:17.581509 4763 scope.go:117] "RemoveContainer" containerID="5dbcca1c9e333b585d4c50250c07c4e202e2a44c67176b6df2ac066e40ac8017"
Dec 05 11:55:17 crc kubenswrapper[4763]: E1205 11:55:17.582207 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dbcca1c9e333b585d4c50250c07c4e202e2a44c67176b6df2ac066e40ac8017\": container with ID starting with 5dbcca1c9e333b585d4c50250c07c4e202e2a44c67176b6df2ac066e40ac8017 not found: ID does not exist" containerID="5dbcca1c9e333b585d4c50250c07c4e202e2a44c67176b6df2ac066e40ac8017"
Dec 05 11:55:17 crc kubenswrapper[4763]: I1205 11:55:17.582241 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dbcca1c9e333b585d4c50250c07c4e202e2a44c67176b6df2ac066e40ac8017"} err="failed to get container status \"5dbcca1c9e333b585d4c50250c07c4e202e2a44c67176b6df2ac066e40ac8017\": rpc error: code = NotFound desc = could not find container \"5dbcca1c9e333b585d4c50250c07c4e202e2a44c67176b6df2ac066e40ac8017\": container with ID starting with 5dbcca1c9e333b585d4c50250c07c4e202e2a44c67176b6df2ac066e40ac8017 not found: ID does not exist"
Dec 05 11:55:17 crc kubenswrapper[4763]: I1205 11:55:17.586731 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dpf4b"]
Dec 05 11:55:17 crc kubenswrapper[4763]: I1205 11:55:17.595376 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dpf4b"]
Dec 05 11:55:17 crc kubenswrapper[4763]: I1205 11:55:17.793828 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa31a254-af8d-4f9f-b22d-1844d7d60382" path="/var/lib/kubelet/pods/aa31a254-af8d-4f9f-b22d-1844d7d60382/volumes"
Dec 05 11:57:35 crc kubenswrapper[4763]: I1205 11:57:35.990428 4763 scope.go:117] "RemoveContainer" containerID="57a1fda41b6aca3107599caac8502a683367d23210617ad68d83342fc7eddc4c"
Dec 05 11:57:37 crc kubenswrapper[4763]: I1205 11:57:37.544032 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 11:57:37 crc kubenswrapper[4763]: I1205 11:57:37.544102 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 11:58:07 crc kubenswrapper[4763]: I1205 11:58:07.544165 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 11:58:07 crc kubenswrapper[4763]: I1205 11:58:07.544688 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 11:58:37 crc kubenswrapper[4763]: I1205 11:58:37.543908 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 11:58:37 crc kubenswrapper[4763]: I1205 11:58:37.544584 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 11:58:37 crc kubenswrapper[4763]: I1205 11:58:37.544656 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpgln"
Dec 05 11:58:37 crc kubenswrapper[4763]: I1205 11:58:37.545618 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c000a81bc5dd100f4b8e26b1306b8f7ef21c0b2a127a88c01cbb81f5e387628c"} pod="openshift-machine-config-operator/machine-config-daemon-xpgln" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 05 11:58:37 crc kubenswrapper[4763]: I1205 11:58:37.545728 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" containerID="cri-o://c000a81bc5dd100f4b8e26b1306b8f7ef21c0b2a127a88c01cbb81f5e387628c" gracePeriod=600
Dec 05 11:58:38 crc kubenswrapper[4763]: I1205 11:58:38.670520 4763 generic.go:334] "Generic (PLEG): container finished" podID="96338136-6831-49d0-9eb9-77d1205c6afb" containerID="c000a81bc5dd100f4b8e26b1306b8f7ef21c0b2a127a88c01cbb81f5e387628c" exitCode=0
Dec 05 11:58:38 crc kubenswrapper[4763]: I1205 11:58:38.670589 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerDied","Data":"c000a81bc5dd100f4b8e26b1306b8f7ef21c0b2a127a88c01cbb81f5e387628c"}
Dec 05 11:58:38 crc kubenswrapper[4763]: I1205 11:58:38.671198 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerStarted","Data":"0c4feb1d4ef447a6746967b0703364de0643a9adc93a83a07473a63b712b66d6"}
Dec 05 11:58:38 crc kubenswrapper[4763]: I1205 11:58:38.671250 4763 scope.go:117] "RemoveContainer" containerID="534307a53a349d3e6f626a6d8dc4de67404cbc863e94b63e58ef318db5a175f6"
Dec 05 12:00:00 crc kubenswrapper[4763]: I1205 12:00:00.174140 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415600-7v5hk"]
Dec 05 12:00:00 crc kubenswrapper[4763]: E1205 12:00:00.175217 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa31a254-af8d-4f9f-b22d-1844d7d60382" containerName="registry"
Dec 05 12:00:00 crc kubenswrapper[4763]: I1205 12:00:00.175233 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa31a254-af8d-4f9f-b22d-1844d7d60382" containerName="registry"
Dec
05 12:00:00 crc kubenswrapper[4763]: I1205 12:00:00.175324 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa31a254-af8d-4f9f-b22d-1844d7d60382" containerName="registry" Dec 05 12:00:00 crc kubenswrapper[4763]: I1205 12:00:00.175735 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415600-7v5hk" Dec 05 12:00:00 crc kubenswrapper[4763]: I1205 12:00:00.178170 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 12:00:00 crc kubenswrapper[4763]: I1205 12:00:00.178221 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 12:00:00 crc kubenswrapper[4763]: I1205 12:00:00.184873 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415600-7v5hk"] Dec 05 12:00:00 crc kubenswrapper[4763]: I1205 12:00:00.348787 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/383179f3-9d26-4bf0-ae07-1c96400ecf60-secret-volume\") pod \"collect-profiles-29415600-7v5hk\" (UID: \"383179f3-9d26-4bf0-ae07-1c96400ecf60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415600-7v5hk" Dec 05 12:00:00 crc kubenswrapper[4763]: I1205 12:00:00.348885 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/383179f3-9d26-4bf0-ae07-1c96400ecf60-config-volume\") pod \"collect-profiles-29415600-7v5hk\" (UID: \"383179f3-9d26-4bf0-ae07-1c96400ecf60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415600-7v5hk" Dec 05 12:00:00 crc kubenswrapper[4763]: I1205 12:00:00.348909 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5jmh\" (UniqueName: \"kubernetes.io/projected/383179f3-9d26-4bf0-ae07-1c96400ecf60-kube-api-access-b5jmh\") pod \"collect-profiles-29415600-7v5hk\" (UID: \"383179f3-9d26-4bf0-ae07-1c96400ecf60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415600-7v5hk" Dec 05 12:00:00 crc kubenswrapper[4763]: I1205 12:00:00.450377 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/383179f3-9d26-4bf0-ae07-1c96400ecf60-config-volume\") pod \"collect-profiles-29415600-7v5hk\" (UID: \"383179f3-9d26-4bf0-ae07-1c96400ecf60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415600-7v5hk" Dec 05 12:00:00 crc kubenswrapper[4763]: I1205 12:00:00.450433 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5jmh\" (UniqueName: \"kubernetes.io/projected/383179f3-9d26-4bf0-ae07-1c96400ecf60-kube-api-access-b5jmh\") pod \"collect-profiles-29415600-7v5hk\" (UID: \"383179f3-9d26-4bf0-ae07-1c96400ecf60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415600-7v5hk" Dec 05 12:00:00 crc kubenswrapper[4763]: I1205 12:00:00.450487 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/383179f3-9d26-4bf0-ae07-1c96400ecf60-secret-volume\") pod \"collect-profiles-29415600-7v5hk\" (UID: \"383179f3-9d26-4bf0-ae07-1c96400ecf60\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29415600-7v5hk" Dec 05 12:00:00 crc kubenswrapper[4763]: I1205 12:00:00.451358 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/383179f3-9d26-4bf0-ae07-1c96400ecf60-config-volume\") pod \"collect-profiles-29415600-7v5hk\" (UID: \"383179f3-9d26-4bf0-ae07-1c96400ecf60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415600-7v5hk" Dec 05 12:00:00 crc kubenswrapper[4763]: I1205 12:00:00.466867 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5jmh\" (UniqueName: \"kubernetes.io/projected/383179f3-9d26-4bf0-ae07-1c96400ecf60-kube-api-access-b5jmh\") pod \"collect-profiles-29415600-7v5hk\" (UID: \"383179f3-9d26-4bf0-ae07-1c96400ecf60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415600-7v5hk" Dec 05 12:00:00 crc kubenswrapper[4763]: I1205 12:00:00.470523 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/383179f3-9d26-4bf0-ae07-1c96400ecf60-secret-volume\") pod \"collect-profiles-29415600-7v5hk\" (UID: \"383179f3-9d26-4bf0-ae07-1c96400ecf60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415600-7v5hk" Dec 05 12:00:00 crc kubenswrapper[4763]: I1205 12:00:00.496141 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415600-7v5hk" Dec 05 12:00:00 crc kubenswrapper[4763]: I1205 12:00:00.682322 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415600-7v5hk"] Dec 05 12:00:01 crc kubenswrapper[4763]: I1205 12:00:01.095120 4763 generic.go:334] "Generic (PLEG): container finished" podID="383179f3-9d26-4bf0-ae07-1c96400ecf60" containerID="284b2ace5698f5ba314d5d1742d62f11ca2acf78da9116d53ff093f544f1857d" exitCode=0 Dec 05 12:00:01 crc kubenswrapper[4763]: I1205 12:00:01.095307 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415600-7v5hk" event={"ID":"383179f3-9d26-4bf0-ae07-1c96400ecf60","Type":"ContainerDied","Data":"284b2ace5698f5ba314d5d1742d62f11ca2acf78da9116d53ff093f544f1857d"} Dec 05 12:00:01 crc kubenswrapper[4763]: I1205 12:00:01.095474 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415600-7v5hk" event={"ID":"383179f3-9d26-4bf0-ae07-1c96400ecf60","Type":"ContainerStarted","Data":"e7a00de0daf07e5454884b4840798e65389c9179afb47e0e7c458202d5b68579"} Dec 05 12:00:02 crc kubenswrapper[4763]: I1205 12:00:02.296448 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415600-7v5hk" Dec 05 12:00:02 crc kubenswrapper[4763]: I1205 12:00:02.479949 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/383179f3-9d26-4bf0-ae07-1c96400ecf60-config-volume\") pod \"383179f3-9d26-4bf0-ae07-1c96400ecf60\" (UID: \"383179f3-9d26-4bf0-ae07-1c96400ecf60\") " Dec 05 12:00:02 crc kubenswrapper[4763]: I1205 12:00:02.480089 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/383179f3-9d26-4bf0-ae07-1c96400ecf60-secret-volume\") pod \"383179f3-9d26-4bf0-ae07-1c96400ecf60\" (UID: \"383179f3-9d26-4bf0-ae07-1c96400ecf60\") " Dec 05 12:00:02 crc kubenswrapper[4763]: I1205 12:00:02.480121 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5jmh\" (UniqueName: \"kubernetes.io/projected/383179f3-9d26-4bf0-ae07-1c96400ecf60-kube-api-access-b5jmh\") pod \"383179f3-9d26-4bf0-ae07-1c96400ecf60\" (UID: \"383179f3-9d26-4bf0-ae07-1c96400ecf60\") " Dec 05 12:00:02 crc kubenswrapper[4763]: I1205 12:00:02.481060 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/383179f3-9d26-4bf0-ae07-1c96400ecf60-config-volume" (OuterVolumeSpecName: "config-volume") pod "383179f3-9d26-4bf0-ae07-1c96400ecf60" (UID: "383179f3-9d26-4bf0-ae07-1c96400ecf60"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:00:02 crc kubenswrapper[4763]: I1205 12:00:02.485199 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/383179f3-9d26-4bf0-ae07-1c96400ecf60-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "383179f3-9d26-4bf0-ae07-1c96400ecf60" (UID: "383179f3-9d26-4bf0-ae07-1c96400ecf60"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:00:02 crc kubenswrapper[4763]: I1205 12:00:02.485472 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/383179f3-9d26-4bf0-ae07-1c96400ecf60-kube-api-access-b5jmh" (OuterVolumeSpecName: "kube-api-access-b5jmh") pod "383179f3-9d26-4bf0-ae07-1c96400ecf60" (UID: "383179f3-9d26-4bf0-ae07-1c96400ecf60"). InnerVolumeSpecName "kube-api-access-b5jmh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:00:02 crc kubenswrapper[4763]: I1205 12:00:02.581317 4763 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/383179f3-9d26-4bf0-ae07-1c96400ecf60-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 12:00:02 crc kubenswrapper[4763]: I1205 12:00:02.581367 4763 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/383179f3-9d26-4bf0-ae07-1c96400ecf60-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 12:00:02 crc kubenswrapper[4763]: I1205 12:00:02.581377 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5jmh\" (UniqueName: \"kubernetes.io/projected/383179f3-9d26-4bf0-ae07-1c96400ecf60-kube-api-access-b5jmh\") on node \"crc\" DevicePath \"\"" Dec 05 12:00:03 crc kubenswrapper[4763]: I1205 12:00:03.109184 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415600-7v5hk" event={"ID":"383179f3-9d26-4bf0-ae07-1c96400ecf60","Type":"ContainerDied","Data":"e7a00de0daf07e5454884b4840798e65389c9179afb47e0e7c458202d5b68579"} Dec 05 12:00:03 crc kubenswrapper[4763]: I1205 12:00:03.109240 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7a00de0daf07e5454884b4840798e65389c9179afb47e0e7c458202d5b68579" Dec 05 12:00:03 crc kubenswrapper[4763]: I1205 12:00:03.109313 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415600-7v5hk" Dec 05 12:00:27 crc kubenswrapper[4763]: I1205 12:00:27.681316 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-d5g4s"] Dec 05 12:00:27 crc kubenswrapper[4763]: E1205 12:00:27.682108 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="383179f3-9d26-4bf0-ae07-1c96400ecf60" containerName="collect-profiles" Dec 05 12:00:27 crc kubenswrapper[4763]: I1205 12:00:27.682123 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="383179f3-9d26-4bf0-ae07-1c96400ecf60" containerName="collect-profiles" Dec 05 12:00:27 crc kubenswrapper[4763]: I1205 12:00:27.682246 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="383179f3-9d26-4bf0-ae07-1c96400ecf60" containerName="collect-profiles" Dec 05 12:00:27 crc kubenswrapper[4763]: I1205 12:00:27.682699 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-d5g4s" Dec 05 12:00:27 crc kubenswrapper[4763]: I1205 12:00:27.685044 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-dcg55"] Dec 05 12:00:27 crc kubenswrapper[4763]: I1205 12:00:27.685672 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-dcg55" Dec 05 12:00:27 crc kubenswrapper[4763]: I1205 12:00:27.687169 4763 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-jbsg2" Dec 05 12:00:27 crc kubenswrapper[4763]: I1205 12:00:27.687501 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 05 12:00:27 crc kubenswrapper[4763]: I1205 12:00:27.687755 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 05 12:00:27 crc kubenswrapper[4763]: I1205 12:00:27.688328 4763 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-8bqwp" Dec 05 12:00:27 crc kubenswrapper[4763]: I1205 12:00:27.698685 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-d5g4s"] Dec 05 12:00:27 crc kubenswrapper[4763]: I1205 12:00:27.703133 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-dcg55"] Dec 05 12:00:27 crc kubenswrapper[4763]: I1205 12:00:27.710187 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-5x8rx"] Dec 05 12:00:27 crc kubenswrapper[4763]: I1205 12:00:27.711015 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-5x8rx" Dec 05 12:00:27 crc kubenswrapper[4763]: I1205 12:00:27.722169 4763 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-l6ck9" Dec 05 12:00:27 crc kubenswrapper[4763]: I1205 12:00:27.727457 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-5x8rx"] Dec 05 12:00:27 crc kubenswrapper[4763]: I1205 12:00:27.798560 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsxlc\" (UniqueName: \"kubernetes.io/projected/96780413-b18b-4d1d-a6c4-2bebb60c99c1-kube-api-access-xsxlc\") pod \"cert-manager-5b446d88c5-dcg55\" (UID: \"96780413-b18b-4d1d-a6c4-2bebb60c99c1\") " pod="cert-manager/cert-manager-5b446d88c5-dcg55" Dec 05 12:00:27 crc kubenswrapper[4763]: I1205 12:00:27.798627 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l49w4\" (UniqueName: \"kubernetes.io/projected/c47efa3a-fd06-4193-921d-11f8f5fb0eff-kube-api-access-l49w4\") pod \"cert-manager-cainjector-7f985d654d-d5g4s\" (UID: \"c47efa3a-fd06-4193-921d-11f8f5fb0eff\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-d5g4s" Dec 05 12:00:27 crc kubenswrapper[4763]: I1205 12:00:27.798726 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4c45\" (UniqueName: \"kubernetes.io/projected/6f795519-6cee-426c-8dda-7f96ef62a9a1-kube-api-access-p4c45\") pod \"cert-manager-webhook-5655c58dd6-5x8rx\" (UID: \"6f795519-6cee-426c-8dda-7f96ef62a9a1\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-5x8rx" Dec 05 12:00:27 crc kubenswrapper[4763]: I1205 12:00:27.899508 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsxlc\" (UniqueName: \"kubernetes.io/projected/96780413-b18b-4d1d-a6c4-2bebb60c99c1-kube-api-access-xsxlc\") pod \"cert-manager-5b446d88c5-dcg55\" (UID: \"96780413-b18b-4d1d-a6c4-2bebb60c99c1\") " 
pod="cert-manager/cert-manager-5b446d88c5-dcg55" Dec 05 12:00:27 crc kubenswrapper[4763]: I1205 12:00:27.899823 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l49w4\" (UniqueName: \"kubernetes.io/projected/c47efa3a-fd06-4193-921d-11f8f5fb0eff-kube-api-access-l49w4\") pod \"cert-manager-cainjector-7f985d654d-d5g4s\" (UID: \"c47efa3a-fd06-4193-921d-11f8f5fb0eff\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-d5g4s" Dec 05 12:00:27 crc kubenswrapper[4763]: I1205 12:00:27.899983 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4c45\" (UniqueName: \"kubernetes.io/projected/6f795519-6cee-426c-8dda-7f96ef62a9a1-kube-api-access-p4c45\") pod \"cert-manager-webhook-5655c58dd6-5x8rx\" (UID: \"6f795519-6cee-426c-8dda-7f96ef62a9a1\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-5x8rx" Dec 05 12:00:27 crc kubenswrapper[4763]: I1205 12:00:27.917274 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsxlc\" (UniqueName: \"kubernetes.io/projected/96780413-b18b-4d1d-a6c4-2bebb60c99c1-kube-api-access-xsxlc\") pod \"cert-manager-5b446d88c5-dcg55\" (UID: \"96780413-b18b-4d1d-a6c4-2bebb60c99c1\") " pod="cert-manager/cert-manager-5b446d88c5-dcg55" Dec 05 12:00:27 crc kubenswrapper[4763]: I1205 12:00:27.918488 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l49w4\" (UniqueName: \"kubernetes.io/projected/c47efa3a-fd06-4193-921d-11f8f5fb0eff-kube-api-access-l49w4\") pod \"cert-manager-cainjector-7f985d654d-d5g4s\" (UID: \"c47efa3a-fd06-4193-921d-11f8f5fb0eff\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-d5g4s" Dec 05 12:00:27 crc kubenswrapper[4763]: I1205 12:00:27.919173 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4c45\" (UniqueName: \"kubernetes.io/projected/6f795519-6cee-426c-8dda-7f96ef62a9a1-kube-api-access-p4c45\") pod \"cert-manager-webhook-5655c58dd6-5x8rx\" (UID: \"6f795519-6cee-426c-8dda-7f96ef62a9a1\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-5x8rx" Dec 05 12:00:28 crc kubenswrapper[4763]: I1205 12:00:28.007866 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-d5g4s" Dec 05 12:00:28 crc kubenswrapper[4763]: I1205 12:00:28.018169 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-dcg55" Dec 05 12:00:28 crc kubenswrapper[4763]: I1205 12:00:28.030184 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-5x8rx" Dec 05 12:00:28 crc kubenswrapper[4763]: I1205 12:00:28.437548 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-d5g4s"] Dec 05 12:00:28 crc kubenswrapper[4763]: I1205 12:00:28.444129 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 12:00:28 crc kubenswrapper[4763]: I1205 12:00:28.474091 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-dcg55"] Dec 05 12:00:28 crc kubenswrapper[4763]: W1205 12:00:28.476066 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96780413_b18b_4d1d_a6c4_2bebb60c99c1.slice/crio-826e4adeb28618e1c281b8d147c17a056b1d7765f84c0d5a578ebae89ec3b232 WatchSource:0}: Error finding container 826e4adeb28618e1c281b8d147c17a056b1d7765f84c0d5a578ebae89ec3b232: Status 404 returned error can't find the container with id 826e4adeb28618e1c281b8d147c17a056b1d7765f84c0d5a578ebae89ec3b232 Dec 05 12:00:28 crc kubenswrapper[4763]: I1205 12:00:28.524296 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-5x8rx"] Dec 05 12:00:28 crc kubenswrapper[4763]: W1205 12:00:28.529721 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f795519_6cee_426c_8dda_7f96ef62a9a1.slice/crio-89bb35c01f8ea5d71e5f79d225f3f9ef20fefa5f5b2458fb4f8ed0578ee3c681 WatchSource:0}: Error finding container 89bb35c01f8ea5d71e5f79d225f3f9ef20fefa5f5b2458fb4f8ed0578ee3c681: Status 404 returned error can't find the container with id 89bb35c01f8ea5d71e5f79d225f3f9ef20fefa5f5b2458fb4f8ed0578ee3c681 Dec 05 12:00:29 crc kubenswrapper[4763]: I1205 12:00:29.239247 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-d5g4s" event={"ID":"c47efa3a-fd06-4193-921d-11f8f5fb0eff","Type":"ContainerStarted","Data":"39c0302a6ee0ba191726e9a5f460319d28f7fac5700e63aa9cf118d81538c5fd"} Dec 05 12:00:29 crc kubenswrapper[4763]: I1205 12:00:29.241078 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-dcg55" event={"ID":"96780413-b18b-4d1d-a6c4-2bebb60c99c1","Type":"ContainerStarted","Data":"826e4adeb28618e1c281b8d147c17a056b1d7765f84c0d5a578ebae89ec3b232"} Dec 05 12:00:29 crc kubenswrapper[4763]: I1205 12:00:29.241960 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-5x8rx" event={"ID":"6f795519-6cee-426c-8dda-7f96ef62a9a1","Type":"ContainerStarted","Data":"89bb35c01f8ea5d71e5f79d225f3f9ef20fefa5f5b2458fb4f8ed0578ee3c681"} Dec 05 12:00:32 crc kubenswrapper[4763]: I1205 12:00:32.265316 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-dcg55" event={"ID":"96780413-b18b-4d1d-a6c4-2bebb60c99c1","Type":"ContainerStarted","Data":"3bb5c058fa7b1b91b33a1f21de90942d478f9ef212ea7844d551676d90a056df"} Dec 05 12:00:32 crc kubenswrapper[4763]: I1205 12:00:32.268306 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-5x8rx" event={"ID":"6f795519-6cee-426c-8dda-7f96ef62a9a1","Type":"ContainerStarted","Data":"d74a0d81c3bbd2f527e5370ae5eb0de5cee06e141360b04350169338a1b0dd10"} Dec 05 12:00:32 crc kubenswrapper[4763]: I1205 12:00:32.268448 4763 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-5x8rx" Dec 05 12:00:32 crc kubenswrapper[4763]: I1205 12:00:32.269806 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-d5g4s" event={"ID":"c47efa3a-fd06-4193-921d-11f8f5fb0eff","Type":"ContainerStarted","Data":"2eeaf6d27a68646f5a3b5c6d9261867d7edfa50876b80e5aea6780184a40dae2"} Dec 05 12:00:32 crc kubenswrapper[4763]: I1205 12:00:32.282178 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-dcg55" podStartSLOduration=2.074650986 podStartE2EDuration="5.282162045s" podCreationTimestamp="2025-12-05 12:00:27 +0000 UTC" firstStartedPulling="2025-12-05 12:00:28.478116253 +0000 UTC m=+712.970830976" lastFinishedPulling="2025-12-05 12:00:31.685627302 +0000 UTC m=+716.178342035" observedRunningTime="2025-12-05 12:00:32.280117834 +0000 UTC m=+716.772832557" watchObservedRunningTime="2025-12-05 12:00:32.282162045 +0000 UTC m=+716.774876768" Dec 05 12:00:32 crc kubenswrapper[4763]: I1205 12:00:32.310206 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-d5g4s" podStartSLOduration=2.07245964 podStartE2EDuration="5.310185167s" podCreationTimestamp="2025-12-05 12:00:27 +0000 UTC" firstStartedPulling="2025-12-05 12:00:28.443928783 +0000 UTC m=+712.936643506" lastFinishedPulling="2025-12-05 12:00:31.68165431 +0000 UTC m=+716.174369033" observedRunningTime="2025-12-05 12:00:32.293317882 +0000 UTC m=+716.786032605" watchObservedRunningTime="2025-12-05 12:00:32.310185167 +0000 UTC m=+716.802899890" Dec 05 12:00:32 crc kubenswrapper[4763]: I1205 12:00:32.314447 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-5x8rx" podStartSLOduration=2.158189277 podStartE2EDuration="5.314430056s" podCreationTimestamp="2025-12-05 12:00:27 +0000 UTC" firstStartedPulling="2025-12-05 12:00:28.531464923 +0000 UTC m=+713.024179656" lastFinishedPulling="2025-12-05 12:00:31.687705712 +0000 UTC m=+716.180420435" observedRunningTime="2025-12-05 12:00:32.308499582 +0000 UTC m=+716.801214315" watchObservedRunningTime="2025-12-05 12:00:32.314430056 +0000 UTC m=+716.807144779" Dec 05 12:00:37 crc kubenswrapper[4763]: I1205 12:00:37.544566 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 12:00:37 crc kubenswrapper[4763]: I1205 12:00:37.544901 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 12:00:37 crc kubenswrapper[4763]: I1205 12:00:37.966448 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xbr2p"] Dec 05 12:00:37 crc kubenswrapper[4763]: I1205 12:00:37.966878 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="ovn-controller" 
containerID="cri-o://28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351" gracePeriod=30 Dec 05 12:00:37 crc kubenswrapper[4763]: I1205 12:00:37.966918 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="northd" containerID="cri-o://cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13" gracePeriod=30 Dec 05 12:00:37 crc kubenswrapper[4763]: I1205 12:00:37.966945 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="kube-rbac-proxy-node" containerID="cri-o://eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff" gracePeriod=30 Dec 05 12:00:37 crc kubenswrapper[4763]: I1205 12:00:37.966963 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01" gracePeriod=30 Dec 05 12:00:37 crc kubenswrapper[4763]: I1205 12:00:37.966977 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="sbdb" containerID="cri-o://4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc" gracePeriod=30 Dec 05 12:00:37 crc kubenswrapper[4763]: I1205 12:00:37.966989 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="nbdb" containerID="cri-o://4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05" gracePeriod=30 Dec 05 12:00:37 crc kubenswrapper[4763]: I1205 12:00:37.967006 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="ovn-acl-logging" containerID="cri-o://b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6" gracePeriod=30 Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.002772 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="ovnkube-controller" containerID="cri-o://b3fc27cc1da481ddaa2ea4cfbd2bc5d4bcc918ba8e2ebab75436094c247f255b" gracePeriod=30 Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.033224 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-5x8rx" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.255728 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbr2p_b42a5472-7487-4146-87a1-b83999821399/ovnkube-controller/3.log" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.258356 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbr2p_b42a5472-7487-4146-87a1-b83999821399/ovn-acl-logging/0.log" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.259214 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbr2p_b42a5472-7487-4146-87a1-b83999821399/ovn-controller/0.log" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.259741 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.324889 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbr2p_b42a5472-7487-4146-87a1-b83999821399/ovnkube-controller/3.log" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.326677 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lrn9c"] Dec 05 12:00:38 crc kubenswrapper[4763]: E1205 12:00:38.326879 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="kube-rbac-proxy-ovn-metrics" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.326896 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="kube-rbac-proxy-ovn-metrics" Dec 05 12:00:38 crc kubenswrapper[4763]: E1205 12:00:38.326904 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="kube-rbac-proxy-node" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.326910 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="kube-rbac-proxy-node" Dec 05 12:00:38 crc kubenswrapper[4763]: E1205 12:00:38.326919 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="ovn-controller" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.326926 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="ovn-controller" Dec 05 12:00:38 crc kubenswrapper[4763]: E1205 12:00:38.326934 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="ovnkube-controller" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.326940 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="ovnkube-controller" Dec 05 12:00:38 crc kubenswrapper[4763]: E1205 12:00:38.326949 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="northd" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.326955 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="northd" Dec 05 12:00:38 crc kubenswrapper[4763]: E1205 12:00:38.326964 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="ovn-acl-logging" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.326970 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="ovn-acl-logging" Dec 05 12:00:38 crc kubenswrapper[4763]: E1205 12:00:38.326977 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="nbdb" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.326983 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="nbdb" Dec 05 12:00:38 crc kubenswrapper[4763]: E1205 
12:00:38.326991 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="ovnkube-controller" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.326997 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="ovnkube-controller" Dec 05 12:00:38 crc kubenswrapper[4763]: E1205 12:00:38.327005 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="ovnkube-controller" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.327011 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="ovnkube-controller" Dec 05 12:00:38 crc kubenswrapper[4763]: E1205 12:00:38.327019 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="kubecfg-setup" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.327024 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="kubecfg-setup" Dec 05 12:00:38 crc kubenswrapper[4763]: E1205 12:00:38.327031 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="sbdb" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.327037 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="sbdb" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.327122 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="sbdb" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.327134 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="northd" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.327140 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="kube-rbac-proxy-ovn-metrics" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.327146 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="ovnkube-controller" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.327152 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="ovnkube-controller" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.327160 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="ovn-acl-logging" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.327167 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="kube-rbac-proxy-node" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.327174 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="nbdb" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.327181 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="ovnkube-controller" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.327189 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="ovnkube-controller" 
Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.327199 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="ovn-controller" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.327208 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="ovnkube-controller" Dec 05 12:00:38 crc kubenswrapper[4763]: E1205 12:00:38.327330 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="ovnkube-controller" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.327340 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="ovnkube-controller" Dec 05 12:00:38 crc kubenswrapper[4763]: E1205 12:00:38.327508 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="ovnkube-controller" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.327518 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b42a5472-7487-4146-87a1-b83999821399" containerName="ovnkube-controller" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.329239 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.332592 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbr2p_b42a5472-7487-4146-87a1-b83999821399/ovn-acl-logging/0.log" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.334728 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbr2p_b42a5472-7487-4146-87a1-b83999821399/ovn-controller/0.log" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.335246 4763 generic.go:334] "Generic (PLEG): container finished" podID="b42a5472-7487-4146-87a1-b83999821399" containerID="b3fc27cc1da481ddaa2ea4cfbd2bc5d4bcc918ba8e2ebab75436094c247f255b" exitCode=0 Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.335428 4763 generic.go:334] "Generic (PLEG): container finished" podID="b42a5472-7487-4146-87a1-b83999821399" containerID="4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc" exitCode=0 Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.335489 4763 generic.go:334] "Generic (PLEG): container finished" podID="b42a5472-7487-4146-87a1-b83999821399" containerID="4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05" exitCode=0 Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.335628 4763 generic.go:334] "Generic (PLEG): container finished" podID="b42a5472-7487-4146-87a1-b83999821399" containerID="cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13" exitCode=0 Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.335688 4763 generic.go:334] "Generic (PLEG): container finished" podID="b42a5472-7487-4146-87a1-b83999821399" containerID="aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01" exitCode=0 Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.335743 4763 generic.go:334] "Generic (PLEG): container finished" podID="b42a5472-7487-4146-87a1-b83999821399" containerID="eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff" exitCode=0 Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.335811 4763 generic.go:334] "Generic (PLEG): container finished" 
podID="b42a5472-7487-4146-87a1-b83999821399" containerID="b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6" exitCode=143 Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.335866 4763 generic.go:334] "Generic (PLEG): container finished" podID="b42a5472-7487-4146-87a1-b83999821399" containerID="28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351" exitCode=143 Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.335604 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" event={"ID":"b42a5472-7487-4146-87a1-b83999821399","Type":"ContainerDied","Data":"b3fc27cc1da481ddaa2ea4cfbd2bc5d4bcc918ba8e2ebab75436094c247f255b"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.336025 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" event={"ID":"b42a5472-7487-4146-87a1-b83999821399","Type":"ContainerDied","Data":"4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.336091 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" event={"ID":"b42a5472-7487-4146-87a1-b83999821399","Type":"ContainerDied","Data":"4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.336154 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" event={"ID":"b42a5472-7487-4146-87a1-b83999821399","Type":"ContainerDied","Data":"cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.336219 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" event={"ID":"b42a5472-7487-4146-87a1-b83999821399","Type":"ContainerDied","Data":"aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.336276 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" event={"ID":"b42a5472-7487-4146-87a1-b83999821399","Type":"ContainerDied","Data":"eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.336337 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.336396 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.336486 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.336541 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.336595 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01"} Dec 05 
12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.337479 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.337863 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.338146 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.338450 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.338817 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" event={"ID":"b42a5472-7487-4146-87a1-b83999821399","Type":"ContainerDied","Data":"b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.339009 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b3fc27cc1da481ddaa2ea4cfbd2bc5d4bcc918ba8e2ebab75436094c247f255b"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.339121 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.339579 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.339649 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.339899 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.340025 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.340134 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.340216 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.340338 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351"} Dec 05 
12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.335612 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.337141 4763 scope.go:117] "RemoveContainer" containerID="b3fc27cc1da481ddaa2ea4cfbd2bc5d4bcc918ba8e2ebab75436094c247f255b" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.340396 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.341113 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" event={"ID":"b42a5472-7487-4146-87a1-b83999821399","Type":"ContainerDied","Data":"28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.341154 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b3fc27cc1da481ddaa2ea4cfbd2bc5d4bcc918ba8e2ebab75436094c247f255b"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.341167 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.341174 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.341190 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.341195 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.341201 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.341208 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.341214 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.341221 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.341227 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.341238 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-xbr2p" event={"ID":"b42a5472-7487-4146-87a1-b83999821399","Type":"ContainerDied","Data":"6bb60da1b1fd6adf6e9fec7d889a865b4e5ac81db0f9ddf11613c9813d4807a3"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.341250 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b3fc27cc1da481ddaa2ea4cfbd2bc5d4bcc918ba8e2ebab75436094c247f255b"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.341257 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.341263 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.341270 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.341277 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.341284 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.341290 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.341297 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.341305 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.341312 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.339471 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kwkp4_737ae453-c22e-41ea-a10e-7e8f1f165467/kube-multus/2.log" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.342678 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kwkp4_737ae453-c22e-41ea-a10e-7e8f1f165467/kube-multus/1.log" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.342780 4763 generic.go:334] "Generic (PLEG): container finished" podID="737ae453-c22e-41ea-a10e-7e8f1f165467" containerID="6c5a2cf91a9ab67900794000630b79f7786d85e8c1fe93bc6af0f40b8aca502e" exitCode=2 Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.342855 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kwkp4" 
event={"ID":"737ae453-c22e-41ea-a10e-7e8f1f165467","Type":"ContainerDied","Data":"6c5a2cf91a9ab67900794000630b79f7786d85e8c1fe93bc6af0f40b8aca502e"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.342890 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a6e4880a3acae535beea88e11057496d98b2ddd8d2161f75c52dd4527425a83d"} Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.343242 4763 scope.go:117] "RemoveContainer" containerID="6c5a2cf91a9ab67900794000630b79f7786d85e8c1fe93bc6af0f40b8aca502e" Dec 05 12:00:38 crc kubenswrapper[4763]: E1205 12:00:38.343406 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-kwkp4_openshift-multus(737ae453-c22e-41ea-a10e-7e8f1f165467)\"" pod="openshift-multus/multus-kwkp4" podUID="737ae453-c22e-41ea-a10e-7e8f1f165467" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.359016 4763 scope.go:117] "RemoveContainer" containerID="93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.382851 4763 scope.go:117] "RemoveContainer" containerID="4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.398159 4763 scope.go:117] "RemoveContainer" containerID="4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.409749 4763 scope.go:117] "RemoveContainer" containerID="cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.422206 4763 scope.go:117] "RemoveContainer" containerID="aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.430825 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b42a5472-7487-4146-87a1-b83999821399-ovnkube-config\") pod \"b42a5472-7487-4146-87a1-b83999821399\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.430873 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-kubelet\") pod \"b42a5472-7487-4146-87a1-b83999821399\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.430896 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b42a5472-7487-4146-87a1-b83999821399-ovn-node-metrics-cert\") pod \"b42a5472-7487-4146-87a1-b83999821399\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.430916 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b42a5472-7487-4146-87a1-b83999821399-env-overrides\") pod \"b42a5472-7487-4146-87a1-b83999821399\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.431008 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-etc-openvswitch\") pod \"b42a5472-7487-4146-87a1-b83999821399\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.431068 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-run-systemd\") pod \"b42a5472-7487-4146-87a1-b83999821399\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.431090 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-var-lib-openvswitch\") pod \"b42a5472-7487-4146-87a1-b83999821399\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.431106 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-run-openvswitch\") pod \"b42a5472-7487-4146-87a1-b83999821399\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.431125 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-run-netns\") pod \"b42a5472-7487-4146-87a1-b83999821399\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.431139 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-var-lib-cni-networks-ovn-kubernetes\") pod \"b42a5472-7487-4146-87a1-b83999821399\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.431154 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-cni-bin\") pod \"b42a5472-7487-4146-87a1-b83999821399\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.431174 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-node-log\") pod \"b42a5472-7487-4146-87a1-b83999821399\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.431191 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-run-ovn-kubernetes\") pod \"b42a5472-7487-4146-87a1-b83999821399\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.431204 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-systemd-units\") pod \"b42a5472-7487-4146-87a1-b83999821399\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.431570 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-log-socket\") pod \"b42a5472-7487-4146-87a1-b83999821399\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.431600 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-cni-netd\") pod \"b42a5472-7487-4146-87a1-b83999821399\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.431621 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-run-ovn\") pod \"b42a5472-7487-4146-87a1-b83999821399\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.431640 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-slash\") pod \"b42a5472-7487-4146-87a1-b83999821399\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.431657 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nq2w\" (UniqueName: \"kubernetes.io/projected/b42a5472-7487-4146-87a1-b83999821399-kube-api-access-6nq2w\") pod \"b42a5472-7487-4146-87a1-b83999821399\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.431676 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b42a5472-7487-4146-87a1-b83999821399-ovnkube-script-lib\") pod \"b42a5472-7487-4146-87a1-b83999821399\" (UID: \"b42a5472-7487-4146-87a1-b83999821399\") " Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.431878 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jxv6\" (UniqueName: \"kubernetes.io/projected/2e2386f3-cc31-4040-ab31-80eb368ea6b8-kube-api-access-5jxv6\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.431906 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-host-run-ovn-kubernetes\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.430996 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "b42a5472-7487-4146-87a1-b83999821399" (UID: "b42a5472-7487-4146-87a1-b83999821399"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.431469 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b42a5472-7487-4146-87a1-b83999821399-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "b42a5472-7487-4146-87a1-b83999821399" (UID: "b42a5472-7487-4146-87a1-b83999821399"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.431497 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "b42a5472-7487-4146-87a1-b83999821399" (UID: "b42a5472-7487-4146-87a1-b83999821399"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.431998 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-log-socket" (OuterVolumeSpecName: "log-socket") pod "b42a5472-7487-4146-87a1-b83999821399" (UID: "b42a5472-7487-4146-87a1-b83999821399"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.432108 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "b42a5472-7487-4146-87a1-b83999821399" (UID: "b42a5472-7487-4146-87a1-b83999821399"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.432124 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "b42a5472-7487-4146-87a1-b83999821399" (UID: "b42a5472-7487-4146-87a1-b83999821399"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.432138 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "b42a5472-7487-4146-87a1-b83999821399" (UID: "b42a5472-7487-4146-87a1-b83999821399"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.432154 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "b42a5472-7487-4146-87a1-b83999821399" (UID: "b42a5472-7487-4146-87a1-b83999821399"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.432166 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-node-log" (OuterVolumeSpecName: "node-log") pod "b42a5472-7487-4146-87a1-b83999821399" (UID: "b42a5472-7487-4146-87a1-b83999821399"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.432178 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "b42a5472-7487-4146-87a1-b83999821399" (UID: "b42a5472-7487-4146-87a1-b83999821399"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.432191 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "b42a5472-7487-4146-87a1-b83999821399" (UID: "b42a5472-7487-4146-87a1-b83999821399"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.432277 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "b42a5472-7487-4146-87a1-b83999821399" (UID: "b42a5472-7487-4146-87a1-b83999821399"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.432450 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "b42a5472-7487-4146-87a1-b83999821399" (UID: "b42a5472-7487-4146-87a1-b83999821399"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.432350 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "b42a5472-7487-4146-87a1-b83999821399" (UID: "b42a5472-7487-4146-87a1-b83999821399"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.432499 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-slash" (OuterVolumeSpecName: "host-slash") pod "b42a5472-7487-4146-87a1-b83999821399" (UID: "b42a5472-7487-4146-87a1-b83999821399"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.432615 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-var-lib-openvswitch\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.432637 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2e2386f3-cc31-4040-ab31-80eb368ea6b8-ovn-node-metrics-cert\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.432660 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-host-run-netns\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.432678 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-run-openvswitch\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.432693 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.432752 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-run-ovn\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.432804 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-host-slash\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.432827 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-etc-openvswitch\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.432852 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/2e2386f3-cc31-4040-ab31-80eb368ea6b8-ovnkube-script-lib\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.432869 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-run-systemd\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.432890 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-host-kubelet\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.432917 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2e2386f3-cc31-4040-ab31-80eb368ea6b8-env-overrides\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.432951 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2e2386f3-cc31-4040-ab31-80eb368ea6b8-ovnkube-config\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.432981 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-host-cni-bin\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.433006 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-host-cni-netd\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.433114 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-node-log\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.433168 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-log-socket\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.433342 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-systemd-units\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.433510 4763 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.433530 4763 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.433539 4763 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.433575 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b42a5472-7487-4146-87a1-b83999821399-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "b42a5472-7487-4146-87a1-b83999821399" (UID: "b42a5472-7487-4146-87a1-b83999821399"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.433658 4763 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.433693 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b42a5472-7487-4146-87a1-b83999821399-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "b42a5472-7487-4146-87a1-b83999821399" (UID: "b42a5472-7487-4146-87a1-b83999821399"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.433708 4763 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.433726 4763 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-node-log\") on node \"crc\" DevicePath \"\"" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.433738 4763 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.433753 4763 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.433839 4763 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-log-socket\") on node \"crc\" DevicePath \"\"" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.433852 4763 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.433865 4763 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.433879 4763 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-slash\") on node \"crc\" DevicePath \"\"" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.433896 4763 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b42a5472-7487-4146-87a1-b83999821399-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.433907 4763 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.433921 4763 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.436438 4763 scope.go:117] "RemoveContainer" containerID="eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.436752 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b42a5472-7487-4146-87a1-b83999821399-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "b42a5472-7487-4146-87a1-b83999821399" (UID: "b42a5472-7487-4146-87a1-b83999821399"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.436822 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b42a5472-7487-4146-87a1-b83999821399-kube-api-access-6nq2w" (OuterVolumeSpecName: "kube-api-access-6nq2w") pod "b42a5472-7487-4146-87a1-b83999821399" (UID: "b42a5472-7487-4146-87a1-b83999821399"). InnerVolumeSpecName "kube-api-access-6nq2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.444637 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "b42a5472-7487-4146-87a1-b83999821399" (UID: "b42a5472-7487-4146-87a1-b83999821399"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.448380 4763 scope.go:117] "RemoveContainer" containerID="b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.461163 4763 scope.go:117] "RemoveContainer" containerID="28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.474147 4763 scope.go:117] "RemoveContainer" containerID="660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.485440 4763 scope.go:117] "RemoveContainer" containerID="b3fc27cc1da481ddaa2ea4cfbd2bc5d4bcc918ba8e2ebab75436094c247f255b" Dec 05 12:00:38 crc kubenswrapper[4763]: E1205 12:00:38.485818 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3fc27cc1da481ddaa2ea4cfbd2bc5d4bcc918ba8e2ebab75436094c247f255b\": container with ID starting with b3fc27cc1da481ddaa2ea4cfbd2bc5d4bcc918ba8e2ebab75436094c247f255b not found: ID does not exist" containerID="b3fc27cc1da481ddaa2ea4cfbd2bc5d4bcc918ba8e2ebab75436094c247f255b" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.485929 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3fc27cc1da481ddaa2ea4cfbd2bc5d4bcc918ba8e2ebab75436094c247f255b"} err="failed to get container status \"b3fc27cc1da481ddaa2ea4cfbd2bc5d4bcc918ba8e2ebab75436094c247f255b\": rpc error: code = NotFound desc = could not find container \"b3fc27cc1da481ddaa2ea4cfbd2bc5d4bcc918ba8e2ebab75436094c247f255b\": container with ID starting with b3fc27cc1da481ddaa2ea4cfbd2bc5d4bcc918ba8e2ebab75436094c247f255b not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.486027 4763 scope.go:117] "RemoveContainer" containerID="93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba" Dec 05 12:00:38 crc kubenswrapper[4763]: E1205 12:00:38.486351 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba\": container with ID starting with 93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba not found: ID does not exist" containerID="93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.486406 4763 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba"} err="failed to get container status \"93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba\": rpc error: code = NotFound desc = could not find container \"93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba\": container with ID starting with 93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.486439 4763 scope.go:117] "RemoveContainer" containerID="4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc" Dec 05 12:00:38 crc kubenswrapper[4763]: E1205 12:00:38.486727 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc\": container with ID starting with 4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc not found: ID does not exist" containerID="4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.486751 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc"} err="failed to get container status \"4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc\": rpc error: code = NotFound desc = could not find container \"4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc\": container with ID starting with 4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.486777 4763 scope.go:117] "RemoveContainer" containerID="4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05" Dec 05 12:00:38 crc kubenswrapper[4763]: E1205 12:00:38.487061 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05\": container with ID starting with 4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05 not found: ID does not exist" containerID="4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.487100 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05"} err="failed to get container status \"4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05\": rpc error: code = NotFound desc = could not find container \"4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05\": container with ID starting with 4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05 not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.487123 4763 scope.go:117] "RemoveContainer" containerID="cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13" Dec 05 12:00:38 crc kubenswrapper[4763]: E1205 12:00:38.487310 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13\": container with ID starting with cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13 not found: ID does not exist" 
containerID="cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.487405 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13"} err="failed to get container status \"cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13\": rpc error: code = NotFound desc = could not find container \"cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13\": container with ID starting with cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13 not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.487529 4763 scope.go:117] "RemoveContainer" containerID="aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01" Dec 05 12:00:38 crc kubenswrapper[4763]: E1205 12:00:38.487857 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01\": container with ID starting with aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01 not found: ID does not exist" containerID="aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.487883 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01"} err="failed to get container status \"aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01\": rpc error: code = NotFound desc = could not find container \"aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01\": container with ID starting with aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01 not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.487902 4763 scope.go:117] "RemoveContainer" containerID="eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff" Dec 05 12:00:38 crc kubenswrapper[4763]: E1205 12:00:38.488185 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff\": container with ID starting with eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff not found: ID does not exist" containerID="eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.488231 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff"} err="failed to get container status \"eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff\": rpc error: code = NotFound desc = could not find container \"eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff\": container with ID starting with eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.488250 4763 scope.go:117] "RemoveContainer" containerID="b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6" Dec 05 12:00:38 crc kubenswrapper[4763]: E1205 12:00:38.488514 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6\": container with ID starting with b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6 not found: ID does not exist" containerID="b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.488685 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6"} err="failed to get container status \"b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6\": rpc error: code = NotFound desc = could not find container \"b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6\": container with ID starting with b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6 not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.488745 4763 scope.go:117] "RemoveContainer" containerID="28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351" Dec 05 12:00:38 crc kubenswrapper[4763]: E1205 12:00:38.489129 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351\": container with ID starting with 28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351 not found: ID does not exist" containerID="28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.489209 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351"} err="failed to get container status \"28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351\": rpc error: code = NotFound desc = could not find container \"28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351\": container with ID starting with 28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351 not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.489274 4763 scope.go:117] "RemoveContainer" containerID="660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134" Dec 05 12:00:38 crc kubenswrapper[4763]: E1205 12:00:38.489621 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\": container with ID starting with 660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134 not found: ID does not exist" containerID="660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.489707 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134"} err="failed to get container status \"660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\": rpc error: code = NotFound desc = could not find container \"660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\": container with ID starting with 660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134 not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.489809 4763 scope.go:117] "RemoveContainer" containerID="b3fc27cc1da481ddaa2ea4cfbd2bc5d4bcc918ba8e2ebab75436094c247f255b" Dec 05 12:00:38 crc 
kubenswrapper[4763]: I1205 12:00:38.490059 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3fc27cc1da481ddaa2ea4cfbd2bc5d4bcc918ba8e2ebab75436094c247f255b"} err="failed to get container status \"b3fc27cc1da481ddaa2ea4cfbd2bc5d4bcc918ba8e2ebab75436094c247f255b\": rpc error: code = NotFound desc = could not find container \"b3fc27cc1da481ddaa2ea4cfbd2bc5d4bcc918ba8e2ebab75436094c247f255b\": container with ID starting with b3fc27cc1da481ddaa2ea4cfbd2bc5d4bcc918ba8e2ebab75436094c247f255b not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.490083 4763 scope.go:117] "RemoveContainer" containerID="93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.490368 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba"} err="failed to get container status \"93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba\": rpc error: code = NotFound desc = could not find container \"93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba\": container with ID starting with 93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.490456 4763 scope.go:117] "RemoveContainer" containerID="4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.490717 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc"} err="failed to get container status \"4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc\": rpc error: code = NotFound desc = could not find container \"4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc\": container with ID starting with 4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.490849 4763 scope.go:117] "RemoveContainer" containerID="4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.491258 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05"} err="failed to get container status \"4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05\": rpc error: code = NotFound desc = could not find container \"4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05\": container with ID starting with 4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05 not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.491345 4763 scope.go:117] "RemoveContainer" containerID="cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.491625 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13"} err="failed to get container status \"cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13\": rpc error: code = NotFound desc = could not find container \"cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13\": container with ID 
starting with cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13 not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.491660 4763 scope.go:117] "RemoveContainer" containerID="aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.492060 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01"} err="failed to get container status \"aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01\": rpc error: code = NotFound desc = could not find container \"aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01\": container with ID starting with aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01 not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.492078 4763 scope.go:117] "RemoveContainer" containerID="eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.492361 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff"} err="failed to get container status \"eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff\": rpc error: code = NotFound desc = could not find container \"eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff\": container with ID starting with eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.492383 4763 scope.go:117] "RemoveContainer" containerID="b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.492803 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6"} err="failed to get container status \"b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6\": rpc error: code = NotFound desc = could not find container \"b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6\": container with ID starting with b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6 not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.492904 4763 scope.go:117] "RemoveContainer" containerID="28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.493159 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351"} err="failed to get container status \"28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351\": rpc error: code = NotFound desc = could not find container \"28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351\": container with ID starting with 28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351 not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.493181 4763 scope.go:117] "RemoveContainer" containerID="660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.493520 4763 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134"} err="failed to get container status \"660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\": rpc error: code = NotFound desc = could not find container \"660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\": container with ID starting with 660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134 not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.493645 4763 scope.go:117] "RemoveContainer" containerID="b3fc27cc1da481ddaa2ea4cfbd2bc5d4bcc918ba8e2ebab75436094c247f255b" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.494131 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3fc27cc1da481ddaa2ea4cfbd2bc5d4bcc918ba8e2ebab75436094c247f255b"} err="failed to get container status \"b3fc27cc1da481ddaa2ea4cfbd2bc5d4bcc918ba8e2ebab75436094c247f255b\": rpc error: code = NotFound desc = could not find container \"b3fc27cc1da481ddaa2ea4cfbd2bc5d4bcc918ba8e2ebab75436094c247f255b\": container with ID starting with b3fc27cc1da481ddaa2ea4cfbd2bc5d4bcc918ba8e2ebab75436094c247f255b not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.494226 4763 scope.go:117] "RemoveContainer" containerID="93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.494595 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba"} err="failed to get container status \"93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba\": rpc error: code = NotFound desc = could not find container \"93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba\": container with ID starting with 93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.494612 4763 scope.go:117] "RemoveContainer" containerID="4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.494833 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc"} err="failed to get container status \"4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc\": rpc error: code = NotFound desc = could not find container \"4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc\": container with ID starting with 4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.494868 4763 scope.go:117] "RemoveContainer" containerID="4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.495175 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05"} err="failed to get container status \"4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05\": rpc error: code = NotFound desc = could not find container \"4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05\": container with ID starting with 4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05 not found: ID does not exist" Dec 
05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.495202 4763 scope.go:117] "RemoveContainer" containerID="cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.495900 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13"} err="failed to get container status \"cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13\": rpc error: code = NotFound desc = could not find container \"cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13\": container with ID starting with cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13 not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.495925 4763 scope.go:117] "RemoveContainer" containerID="aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.496260 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01"} err="failed to get container status \"aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01\": rpc error: code = NotFound desc = could not find container \"aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01\": container with ID starting with aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01 not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.496279 4763 scope.go:117] "RemoveContainer" containerID="eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.496507 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff"} err="failed to get container status \"eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff\": rpc error: code = NotFound desc = could not find container \"eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff\": container with ID starting with eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.496607 4763 scope.go:117] "RemoveContainer" containerID="b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.496910 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6"} err="failed to get container status \"b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6\": rpc error: code = NotFound desc = could not find container \"b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6\": container with ID starting with b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6 not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.496932 4763 scope.go:117] "RemoveContainer" containerID="28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.497253 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351"} err="failed to get container status 
\"28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351\": rpc error: code = NotFound desc = could not find container \"28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351\": container with ID starting with 28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351 not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.497277 4763 scope.go:117] "RemoveContainer" containerID="660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.497527 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134"} err="failed to get container status \"660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\": rpc error: code = NotFound desc = could not find container \"660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\": container with ID starting with 660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134 not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.497544 4763 scope.go:117] "RemoveContainer" containerID="b3fc27cc1da481ddaa2ea4cfbd2bc5d4bcc918ba8e2ebab75436094c247f255b" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.497786 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3fc27cc1da481ddaa2ea4cfbd2bc5d4bcc918ba8e2ebab75436094c247f255b"} err="failed to get container status \"b3fc27cc1da481ddaa2ea4cfbd2bc5d4bcc918ba8e2ebab75436094c247f255b\": rpc error: code = NotFound desc = could not find container \"b3fc27cc1da481ddaa2ea4cfbd2bc5d4bcc918ba8e2ebab75436094c247f255b\": container with ID starting with b3fc27cc1da481ddaa2ea4cfbd2bc5d4bcc918ba8e2ebab75436094c247f255b not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.497920 4763 scope.go:117] "RemoveContainer" containerID="93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.498249 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba"} err="failed to get container status \"93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba\": rpc error: code = NotFound desc = could not find container \"93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba\": container with ID starting with 93ce5fa785d1ff2598ce1df34c5b549cb2cc1963773e55f3e1e1dea2e187b7ba not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.498342 4763 scope.go:117] "RemoveContainer" containerID="4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.498625 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc"} err="failed to get container status \"4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc\": rpc error: code = NotFound desc = could not find container \"4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc\": container with ID starting with 4363e396212fe64b3c5365e81e048f248af00a5b225574ae0d81f46097a629cc not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.498720 4763 scope.go:117] "RemoveContainer" 
containerID="4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.499150 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05"} err="failed to get container status \"4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05\": rpc error: code = NotFound desc = could not find container \"4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05\": container with ID starting with 4a1e6ea4ec0fae4991d24d83bbd521051c94ac6c5afbe81fb476e87dc6774d05 not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.499173 4763 scope.go:117] "RemoveContainer" containerID="cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.499450 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13"} err="failed to get container status \"cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13\": rpc error: code = NotFound desc = could not find container \"cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13\": container with ID starting with cf2a2b7227147c864133f887c9c40796eb3f8d9c57944bd045d24a4a025f9d13 not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.499536 4763 scope.go:117] "RemoveContainer" containerID="aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.499863 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01"} err="failed to get container status \"aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01\": rpc error: code = NotFound desc = could not find container \"aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01\": container with ID starting with aea4628843677d03eb0a78693a1b01951ec31c3d2624a8763eb7896fb5878f01 not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.499892 4763 scope.go:117] "RemoveContainer" containerID="eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.500505 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff"} err="failed to get container status \"eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff\": rpc error: code = NotFound desc = could not find container \"eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff\": container with ID starting with eff68018c489abf63575a8db806d89b45f4d02d096eef5a9b78f3284fab344ff not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.500593 4763 scope.go:117] "RemoveContainer" containerID="b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.500899 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6"} err="failed to get container status \"b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6\": rpc error: code = NotFound desc = could not find 
container \"b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6\": container with ID starting with b8c8c71d0838594ffd747275c27576bacf46c8eea898f95ede7f72714f3725a6 not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.500981 4763 scope.go:117] "RemoveContainer" containerID="28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.501233 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351"} err="failed to get container status \"28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351\": rpc error: code = NotFound desc = could not find container \"28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351\": container with ID starting with 28239ee7c9e42bf7a97ecd63ef90ed26d5982786fe3f26d2fcbe71013b702351 not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.501268 4763 scope.go:117] "RemoveContainer" containerID="660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.501541 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134"} err="failed to get container status \"660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\": rpc error: code = NotFound desc = could not find container \"660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134\": container with ID starting with 660aeb69ecbc0755eaf5b7220b8d53f1ccf6facf7a509b8f79b2093463bf0134 not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.501561 4763 scope.go:117] "RemoveContainer" containerID="b3fc27cc1da481ddaa2ea4cfbd2bc5d4bcc918ba8e2ebab75436094c247f255b" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.501829 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3fc27cc1da481ddaa2ea4cfbd2bc5d4bcc918ba8e2ebab75436094c247f255b"} err="failed to get container status \"b3fc27cc1da481ddaa2ea4cfbd2bc5d4bcc918ba8e2ebab75436094c247f255b\": rpc error: code = NotFound desc = could not find container \"b3fc27cc1da481ddaa2ea4cfbd2bc5d4bcc918ba8e2ebab75436094c247f255b\": container with ID starting with b3fc27cc1da481ddaa2ea4cfbd2bc5d4bcc918ba8e2ebab75436094c247f255b not found: ID does not exist" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.534756 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jxv6\" (UniqueName: \"kubernetes.io/projected/2e2386f3-cc31-4040-ab31-80eb368ea6b8-kube-api-access-5jxv6\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.534887 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-host-run-ovn-kubernetes\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.534938 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-var-lib-openvswitch\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.534973 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2e2386f3-cc31-4040-ab31-80eb368ea6b8-ovn-node-metrics-cert\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.535005 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-host-run-netns\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.535015 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-host-run-ovn-kubernetes\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.535039 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-run-openvswitch\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.535072 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.535106 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-run-ovn\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.535141 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-host-slash\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.535161 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-var-lib-openvswitch\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.535176 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-etc-openvswitch\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.535223 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-etc-openvswitch\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.535229 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-run-systemd\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.535261 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-run-systemd\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.535272 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2e2386f3-cc31-4040-ab31-80eb368ea6b8-ovnkube-script-lib\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.535298 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.535310 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-host-kubelet\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.535332 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-host-run-netns\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.535344 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2e2386f3-cc31-4040-ab31-80eb368ea6b8-env-overrides\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.535364 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-run-openvswitch\") pod 
\"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.535395 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-host-slash\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.535412 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2e2386f3-cc31-4040-ab31-80eb368ea6b8-ovnkube-config\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.535428 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-run-ovn\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.535449 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-host-cni-bin\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.535458 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-host-kubelet\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.535480 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-host-cni-netd\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.535509 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-node-log\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.535538 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-log-socket\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.535598 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-systemd-units\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 
12:00:38.535659 4763 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b42a5472-7487-4146-87a1-b83999821399-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.535681 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nq2w\" (UniqueName: \"kubernetes.io/projected/b42a5472-7487-4146-87a1-b83999821399-kube-api-access-6nq2w\") on node \"crc\" DevicePath \"\"" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.535700 4763 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b42a5472-7487-4146-87a1-b83999821399-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.535719 4763 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b42a5472-7487-4146-87a1-b83999821399-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.535737 4763 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b42a5472-7487-4146-87a1-b83999821399-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.535691 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-host-cni-netd\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.535836 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-node-log\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.535966 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-systemd-units\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.536000 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-host-cni-bin\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.536024 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2e2386f3-cc31-4040-ab31-80eb368ea6b8-log-socket\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.536230 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2e2386f3-cc31-4040-ab31-80eb368ea6b8-env-overrides\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 
12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.536230 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2e2386f3-cc31-4040-ab31-80eb368ea6b8-ovnkube-script-lib\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.536813 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2e2386f3-cc31-4040-ab31-80eb368ea6b8-ovnkube-config\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.539132 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2e2386f3-cc31-4040-ab31-80eb368ea6b8-ovn-node-metrics-cert\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.549920 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jxv6\" (UniqueName: \"kubernetes.io/projected/2e2386f3-cc31-4040-ab31-80eb368ea6b8-kube-api-access-5jxv6\") pod \"ovnkube-node-lrn9c\" (UID: \"2e2386f3-cc31-4040-ab31-80eb368ea6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.646528 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.690955 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xbr2p"] Dec 05 12:00:38 crc kubenswrapper[4763]: I1205 12:00:38.702164 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xbr2p"] Dec 05 12:00:39 crc kubenswrapper[4763]: I1205 12:00:39.349512 4763 generic.go:334] "Generic (PLEG): container finished" podID="2e2386f3-cc31-4040-ab31-80eb368ea6b8" containerID="b4c8380e9e798be7abf5dfd7f69df4434b31f60dca048be550f0138780fdf2e3" exitCode=0 Dec 05 12:00:39 crc kubenswrapper[4763]: I1205 12:00:39.349583 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" event={"ID":"2e2386f3-cc31-4040-ab31-80eb368ea6b8","Type":"ContainerDied","Data":"b4c8380e9e798be7abf5dfd7f69df4434b31f60dca048be550f0138780fdf2e3"} Dec 05 12:00:39 crc kubenswrapper[4763]: I1205 12:00:39.349626 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" event={"ID":"2e2386f3-cc31-4040-ab31-80eb368ea6b8","Type":"ContainerStarted","Data":"174e94976139169dacb444992730c73be9727d0c671be0e6ddade3920a6dd3e6"} Dec 05 12:00:39 crc kubenswrapper[4763]: I1205 12:00:39.795219 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b42a5472-7487-4146-87a1-b83999821399" path="/var/lib/kubelet/pods/b42a5472-7487-4146-87a1-b83999821399/volumes" Dec 05 12:00:40 crc kubenswrapper[4763]: I1205 12:00:40.359236 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" event={"ID":"2e2386f3-cc31-4040-ab31-80eb368ea6b8","Type":"ContainerStarted","Data":"7902557cddeb25c7659a4e85d8fc6793ac6061b20a1b843e8f98f99a3b457ec7"} Dec 05 12:00:40 crc 
kubenswrapper[4763]: I1205 12:00:40.359541 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" event={"ID":"2e2386f3-cc31-4040-ab31-80eb368ea6b8","Type":"ContainerStarted","Data":"60daf385f4f9185f4fcb7f68c58a6cd7fd5d0440619e27118e69f36eba83b53f"} Dec 05 12:00:40 crc kubenswrapper[4763]: I1205 12:00:40.359558 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" event={"ID":"2e2386f3-cc31-4040-ab31-80eb368ea6b8","Type":"ContainerStarted","Data":"316263db8b57aef19873f7b5ee86c4ab376337b8307e4d4efef31fc9a40cfc9c"} Dec 05 12:00:40 crc kubenswrapper[4763]: I1205 12:00:40.359570 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" event={"ID":"2e2386f3-cc31-4040-ab31-80eb368ea6b8","Type":"ContainerStarted","Data":"b80e7a8e2d7da32b5ba3d6589e8b17e733b8314ec4d50ad97008bc33090961c7"} Dec 05 12:00:40 crc kubenswrapper[4763]: I1205 12:00:40.359581 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" event={"ID":"2e2386f3-cc31-4040-ab31-80eb368ea6b8","Type":"ContainerStarted","Data":"9aec656074622d1ffd292571de4ecc07e8a0ad046597533774dc4e7fbe6a9207"} Dec 05 12:00:40 crc kubenswrapper[4763]: I1205 12:00:40.359591 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" event={"ID":"2e2386f3-cc31-4040-ab31-80eb368ea6b8","Type":"ContainerStarted","Data":"903f38361f05284848ec6ca19ca2b8db2e0364c08a73a32ec1f46a529e1bc7bc"} Dec 05 12:00:41 crc kubenswrapper[4763]: I1205 12:00:41.370228 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" event={"ID":"2e2386f3-cc31-4040-ab31-80eb368ea6b8","Type":"ContainerStarted","Data":"116e3024a86cac84f70f7fca88f2ce9a1831a28b1ce5cb20e206056c73bfafbe"} Dec 05 12:00:43 crc kubenswrapper[4763]: I1205 12:00:43.387361 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" event={"ID":"2e2386f3-cc31-4040-ab31-80eb368ea6b8","Type":"ContainerStarted","Data":"eb5f362316e8629fa01cde5fd9a1b4e80e94f38fbfd7c890de22cbb608574ea5"} Dec 05 12:00:43 crc kubenswrapper[4763]: I1205 12:00:43.387782 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:43 crc kubenswrapper[4763]: I1205 12:00:43.387841 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:43 crc kubenswrapper[4763]: I1205 12:00:43.388880 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:43 crc kubenswrapper[4763]: I1205 12:00:43.413426 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" podStartSLOduration=5.413405303 podStartE2EDuration="5.413405303s" podCreationTimestamp="2025-12-05 12:00:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:00:43.413207916 +0000 UTC m=+727.905922719" watchObservedRunningTime="2025-12-05 12:00:43.413405303 +0000 UTC m=+727.906120026" Dec 05 12:00:43 crc kubenswrapper[4763]: I1205 12:00:43.414817 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:43 crc 
kubenswrapper[4763]: I1205 12:00:43.416697 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:00:50 crc kubenswrapper[4763]: I1205 12:00:50.784127 4763 scope.go:117] "RemoveContainer" containerID="6c5a2cf91a9ab67900794000630b79f7786d85e8c1fe93bc6af0f40b8aca502e" Dec 05 12:00:52 crc kubenswrapper[4763]: I1205 12:00:52.440660 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kwkp4_737ae453-c22e-41ea-a10e-7e8f1f165467/kube-multus/2.log" Dec 05 12:00:52 crc kubenswrapper[4763]: I1205 12:00:52.441524 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kwkp4_737ae453-c22e-41ea-a10e-7e8f1f165467/kube-multus/1.log" Dec 05 12:00:52 crc kubenswrapper[4763]: I1205 12:00:52.441590 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kwkp4" event={"ID":"737ae453-c22e-41ea-a10e-7e8f1f165467","Type":"ContainerStarted","Data":"d5f1f8bec34d345c0f1171b35d9eb092a0d417794bb1224e6ee2933114639656"} Dec 05 12:01:07 crc kubenswrapper[4763]: I1205 12:01:07.544541 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 12:01:07 crc kubenswrapper[4763]: I1205 12:01:07.545582 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 12:01:08 crc kubenswrapper[4763]: I1205 12:01:08.289299 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9"] Dec 05 12:01:08 crc kubenswrapper[4763]: I1205 12:01:08.291019 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9" Dec 05 12:01:08 crc kubenswrapper[4763]: I1205 12:01:08.296328 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 05 12:01:08 crc kubenswrapper[4763]: I1205 12:01:08.296665 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9"] Dec 05 12:01:08 crc kubenswrapper[4763]: I1205 12:01:08.331079 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/552de8ea-aa26-40d4-a360-1eda3664ae62-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9\" (UID: \"552de8ea-aa26-40d4-a360-1eda3664ae62\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9" Dec 05 12:01:08 crc kubenswrapper[4763]: I1205 12:01:08.333210 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9wsc\" (UniqueName: \"kubernetes.io/projected/552de8ea-aa26-40d4-a360-1eda3664ae62-kube-api-access-h9wsc\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9\" (UID: \"552de8ea-aa26-40d4-a360-1eda3664ae62\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9" Dec 05 12:01:08 crc kubenswrapper[4763]: I1205 12:01:08.333304 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/552de8ea-aa26-40d4-a360-1eda3664ae62-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9\" (UID: \"552de8ea-aa26-40d4-a360-1eda3664ae62\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9" Dec 05 12:01:08 crc kubenswrapper[4763]: I1205 12:01:08.434601 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9wsc\" (UniqueName: \"kubernetes.io/projected/552de8ea-aa26-40d4-a360-1eda3664ae62-kube-api-access-h9wsc\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9\" (UID: \"552de8ea-aa26-40d4-a360-1eda3664ae62\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9" Dec 05 12:01:08 crc kubenswrapper[4763]: I1205 12:01:08.434914 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/552de8ea-aa26-40d4-a360-1eda3664ae62-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9\" (UID: \"552de8ea-aa26-40d4-a360-1eda3664ae62\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9" Dec 05 12:01:08 crc kubenswrapper[4763]: I1205 12:01:08.435049 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/552de8ea-aa26-40d4-a360-1eda3664ae62-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9\" (UID: \"552de8ea-aa26-40d4-a360-1eda3664ae62\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9" Dec 05 12:01:08 crc kubenswrapper[4763]: I1205 12:01:08.435532 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/552de8ea-aa26-40d4-a360-1eda3664ae62-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9\" (UID: \"552de8ea-aa26-40d4-a360-1eda3664ae62\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9" Dec 05 12:01:08 crc kubenswrapper[4763]: I1205 12:01:08.435608 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/552de8ea-aa26-40d4-a360-1eda3664ae62-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9\" (UID: \"552de8ea-aa26-40d4-a360-1eda3664ae62\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9" Dec 05 12:01:08 crc kubenswrapper[4763]: I1205 12:01:08.464121 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9wsc\" (UniqueName: \"kubernetes.io/projected/552de8ea-aa26-40d4-a360-1eda3664ae62-kube-api-access-h9wsc\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9\" (UID: \"552de8ea-aa26-40d4-a360-1eda3664ae62\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9" Dec 05 12:01:08 crc kubenswrapper[4763]: I1205 12:01:08.615362 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9" Dec 05 12:01:08 crc kubenswrapper[4763]: I1205 12:01:08.685156 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lrn9c" Dec 05 12:01:08 crc kubenswrapper[4763]: I1205 12:01:08.834579 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9"] Dec 05 12:01:09 crc kubenswrapper[4763]: I1205 12:01:09.526336 4763 generic.go:334] "Generic (PLEG): container finished" podID="552de8ea-aa26-40d4-a360-1eda3664ae62" containerID="cdd301d112619a5eaaf56b9cc82a5786d934db7ff12a771151bb46ea9d483bba" exitCode=0 Dec 05 12:01:09 crc kubenswrapper[4763]: I1205 12:01:09.526474 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9" event={"ID":"552de8ea-aa26-40d4-a360-1eda3664ae62","Type":"ContainerDied","Data":"cdd301d112619a5eaaf56b9cc82a5786d934db7ff12a771151bb46ea9d483bba"} Dec 05 12:01:09 crc kubenswrapper[4763]: I1205 12:01:09.526659 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9" event={"ID":"552de8ea-aa26-40d4-a360-1eda3664ae62","Type":"ContainerStarted","Data":"26b5ff7115e68a6dcc4cb7f58f95a0d2adbe8ad15b3e7a290526a4ed37efc5e5"} Dec 05 12:01:12 crc kubenswrapper[4763]: I1205 12:01:12.548836 4763 generic.go:334] "Generic (PLEG): container finished" podID="552de8ea-aa26-40d4-a360-1eda3664ae62" containerID="5fd0a5bec29cb85ddd86120c5dfd27d6126b6eec1c7a2e156adb96b682b1a37c" exitCode=0 Dec 05 12:01:12 crc kubenswrapper[4763]: I1205 12:01:12.548911 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9" event={"ID":"552de8ea-aa26-40d4-a360-1eda3664ae62","Type":"ContainerDied","Data":"5fd0a5bec29cb85ddd86120c5dfd27d6126b6eec1c7a2e156adb96b682b1a37c"} Dec 05 12:01:12 crc kubenswrapper[4763]: I1205 12:01:12.605007 4763 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-w8s98"] Dec 05 12:01:12 crc kubenswrapper[4763]: I1205 12:01:12.607522 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w8s98" Dec 05 12:01:12 crc kubenswrapper[4763]: I1205 12:01:12.622496 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w8s98"] Dec 05 12:01:12 crc kubenswrapper[4763]: I1205 12:01:12.697527 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbgsz\" (UniqueName: \"kubernetes.io/projected/9730875a-8f84-4f57-9f50-f83c08291fa7-kube-api-access-xbgsz\") pod \"redhat-operators-w8s98\" (UID: \"9730875a-8f84-4f57-9f50-f83c08291fa7\") " pod="openshift-marketplace/redhat-operators-w8s98" Dec 05 12:01:12 crc kubenswrapper[4763]: I1205 12:01:12.697608 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9730875a-8f84-4f57-9f50-f83c08291fa7-utilities\") pod \"redhat-operators-w8s98\" (UID: \"9730875a-8f84-4f57-9f50-f83c08291fa7\") " pod="openshift-marketplace/redhat-operators-w8s98" Dec 05 12:01:12 crc kubenswrapper[4763]: I1205 12:01:12.697675 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9730875a-8f84-4f57-9f50-f83c08291fa7-catalog-content\") pod \"redhat-operators-w8s98\" (UID: \"9730875a-8f84-4f57-9f50-f83c08291fa7\") " pod="openshift-marketplace/redhat-operators-w8s98" Dec 05 12:01:12 crc kubenswrapper[4763]: I1205 12:01:12.798895 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbgsz\" (UniqueName: \"kubernetes.io/projected/9730875a-8f84-4f57-9f50-f83c08291fa7-kube-api-access-xbgsz\") pod \"redhat-operators-w8s98\" (UID: \"9730875a-8f84-4f57-9f50-f83c08291fa7\") " pod="openshift-marketplace/redhat-operators-w8s98" Dec 05 12:01:12 crc kubenswrapper[4763]: I1205 12:01:12.799029 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9730875a-8f84-4f57-9f50-f83c08291fa7-utilities\") pod \"redhat-operators-w8s98\" (UID: \"9730875a-8f84-4f57-9f50-f83c08291fa7\") " pod="openshift-marketplace/redhat-operators-w8s98" Dec 05 12:01:12 crc kubenswrapper[4763]: I1205 12:01:12.799114 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9730875a-8f84-4f57-9f50-f83c08291fa7-catalog-content\") pod \"redhat-operators-w8s98\" (UID: \"9730875a-8f84-4f57-9f50-f83c08291fa7\") " pod="openshift-marketplace/redhat-operators-w8s98" Dec 05 12:01:12 crc kubenswrapper[4763]: I1205 12:01:12.800139 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9730875a-8f84-4f57-9f50-f83c08291fa7-catalog-content\") pod \"redhat-operators-w8s98\" (UID: \"9730875a-8f84-4f57-9f50-f83c08291fa7\") " pod="openshift-marketplace/redhat-operators-w8s98" Dec 05 12:01:12 crc kubenswrapper[4763]: I1205 12:01:12.800144 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9730875a-8f84-4f57-9f50-f83c08291fa7-utilities\") pod \"redhat-operators-w8s98\" (UID: \"9730875a-8f84-4f57-9f50-f83c08291fa7\") " 
pod="openshift-marketplace/redhat-operators-w8s98" Dec 05 12:01:12 crc kubenswrapper[4763]: I1205 12:01:12.819156 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbgsz\" (UniqueName: \"kubernetes.io/projected/9730875a-8f84-4f57-9f50-f83c08291fa7-kube-api-access-xbgsz\") pod \"redhat-operators-w8s98\" (UID: \"9730875a-8f84-4f57-9f50-f83c08291fa7\") " pod="openshift-marketplace/redhat-operators-w8s98" Dec 05 12:01:12 crc kubenswrapper[4763]: I1205 12:01:12.958623 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w8s98" Dec 05 12:01:13 crc kubenswrapper[4763]: I1205 12:01:13.190787 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w8s98"] Dec 05 12:01:13 crc kubenswrapper[4763]: I1205 12:01:13.556170 4763 generic.go:334] "Generic (PLEG): container finished" podID="552de8ea-aa26-40d4-a360-1eda3664ae62" containerID="2e2b1144c38658996c405328e0b13ed477fea125f64939c49de12da7bb3714c3" exitCode=0 Dec 05 12:01:13 crc kubenswrapper[4763]: I1205 12:01:13.556233 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9" event={"ID":"552de8ea-aa26-40d4-a360-1eda3664ae62","Type":"ContainerDied","Data":"2e2b1144c38658996c405328e0b13ed477fea125f64939c49de12da7bb3714c3"} Dec 05 12:01:13 crc kubenswrapper[4763]: I1205 12:01:13.559443 4763 generic.go:334] "Generic (PLEG): container finished" podID="9730875a-8f84-4f57-9f50-f83c08291fa7" containerID="585ef1387a10879cecbea6e55038e85d9c84559ce1095d5e2ef3616a005bb5bf" exitCode=0 Dec 05 12:01:13 crc kubenswrapper[4763]: I1205 12:01:13.559521 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8s98" event={"ID":"9730875a-8f84-4f57-9f50-f83c08291fa7","Type":"ContainerDied","Data":"585ef1387a10879cecbea6e55038e85d9c84559ce1095d5e2ef3616a005bb5bf"} Dec 05 12:01:13 crc kubenswrapper[4763]: I1205 12:01:13.559551 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8s98" event={"ID":"9730875a-8f84-4f57-9f50-f83c08291fa7","Type":"ContainerStarted","Data":"93f289a16f9df54d6e7f48ba779e032b4ad1b112d3d44807e2ab206132646d34"} Dec 05 12:01:14 crc kubenswrapper[4763]: I1205 12:01:14.826683 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9" Dec 05 12:01:14 crc kubenswrapper[4763]: I1205 12:01:14.928917 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/552de8ea-aa26-40d4-a360-1eda3664ae62-util\") pod \"552de8ea-aa26-40d4-a360-1eda3664ae62\" (UID: \"552de8ea-aa26-40d4-a360-1eda3664ae62\") " Dec 05 12:01:14 crc kubenswrapper[4763]: I1205 12:01:14.929049 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/552de8ea-aa26-40d4-a360-1eda3664ae62-bundle\") pod \"552de8ea-aa26-40d4-a360-1eda3664ae62\" (UID: \"552de8ea-aa26-40d4-a360-1eda3664ae62\") " Dec 05 12:01:14 crc kubenswrapper[4763]: I1205 12:01:14.929236 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9wsc\" (UniqueName: \"kubernetes.io/projected/552de8ea-aa26-40d4-a360-1eda3664ae62-kube-api-access-h9wsc\") pod \"552de8ea-aa26-40d4-a360-1eda3664ae62\" (UID: \"552de8ea-aa26-40d4-a360-1eda3664ae62\") " Dec 05 12:01:14 crc kubenswrapper[4763]: I1205 12:01:14.930928 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/552de8ea-aa26-40d4-a360-1eda3664ae62-bundle" (OuterVolumeSpecName: "bundle") pod "552de8ea-aa26-40d4-a360-1eda3664ae62" (UID: "552de8ea-aa26-40d4-a360-1eda3664ae62"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:01:14 crc kubenswrapper[4763]: I1205 12:01:14.937822 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/552de8ea-aa26-40d4-a360-1eda3664ae62-kube-api-access-h9wsc" (OuterVolumeSpecName: "kube-api-access-h9wsc") pod "552de8ea-aa26-40d4-a360-1eda3664ae62" (UID: "552de8ea-aa26-40d4-a360-1eda3664ae62"). InnerVolumeSpecName "kube-api-access-h9wsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:01:14 crc kubenswrapper[4763]: I1205 12:01:14.943319 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/552de8ea-aa26-40d4-a360-1eda3664ae62-util" (OuterVolumeSpecName: "util") pod "552de8ea-aa26-40d4-a360-1eda3664ae62" (UID: "552de8ea-aa26-40d4-a360-1eda3664ae62"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:01:15 crc kubenswrapper[4763]: I1205 12:01:15.032650 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9wsc\" (UniqueName: \"kubernetes.io/projected/552de8ea-aa26-40d4-a360-1eda3664ae62-kube-api-access-h9wsc\") on node \"crc\" DevicePath \"\"" Dec 05 12:01:15 crc kubenswrapper[4763]: I1205 12:01:15.032710 4763 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/552de8ea-aa26-40d4-a360-1eda3664ae62-util\") on node \"crc\" DevicePath \"\"" Dec 05 12:01:15 crc kubenswrapper[4763]: I1205 12:01:15.032726 4763 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/552de8ea-aa26-40d4-a360-1eda3664ae62-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:01:15 crc kubenswrapper[4763]: I1205 12:01:15.129987 4763 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 05 12:01:15 crc kubenswrapper[4763]: I1205 12:01:15.573684 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8s98" event={"ID":"9730875a-8f84-4f57-9f50-f83c08291fa7","Type":"ContainerStarted","Data":"4d16eab0f1e7406b9e76bdfad0f3930a774592f4239532987a616f570af6ef86"} Dec 05 12:01:15 crc kubenswrapper[4763]: I1205 12:01:15.577714 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9" event={"ID":"552de8ea-aa26-40d4-a360-1eda3664ae62","Type":"ContainerDied","Data":"26b5ff7115e68a6dcc4cb7f58f95a0d2adbe8ad15b3e7a290526a4ed37efc5e5"} Dec 05 12:01:15 crc kubenswrapper[4763]: I1205 12:01:15.577790 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26b5ff7115e68a6dcc4cb7f58f95a0d2adbe8ad15b3e7a290526a4ed37efc5e5" Dec 05 12:01:15 crc kubenswrapper[4763]: I1205 12:01:15.577750 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9" Dec 05 12:01:16 crc kubenswrapper[4763]: I1205 12:01:16.585208 4763 generic.go:334] "Generic (PLEG): container finished" podID="9730875a-8f84-4f57-9f50-f83c08291fa7" containerID="4d16eab0f1e7406b9e76bdfad0f3930a774592f4239532987a616f570af6ef86" exitCode=0 Dec 05 12:01:16 crc kubenswrapper[4763]: I1205 12:01:16.585258 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8s98" event={"ID":"9730875a-8f84-4f57-9f50-f83c08291fa7","Type":"ContainerDied","Data":"4d16eab0f1e7406b9e76bdfad0f3930a774592f4239532987a616f570af6ef86"} Dec 05 12:01:17 crc kubenswrapper[4763]: I1205 12:01:17.593692 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8s98" event={"ID":"9730875a-8f84-4f57-9f50-f83c08291fa7","Type":"ContainerStarted","Data":"0fd7dbe0a49310a4b9f3212d5157e3e0bd6811b71d816c96291a399d736ec42f"} Dec 05 12:01:17 crc kubenswrapper[4763]: I1205 12:01:17.617070 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w8s98" podStartSLOduration=2.21373358 podStartE2EDuration="5.61705253s" podCreationTimestamp="2025-12-05 12:01:12 +0000 UTC" firstStartedPulling="2025-12-05 12:01:13.561167471 +0000 UTC m=+758.053882194" lastFinishedPulling="2025-12-05 12:01:16.964486411 +0000 UTC m=+761.457201144" observedRunningTime="2025-12-05 12:01:17.614798483 +0000 UTC m=+762.107513226" watchObservedRunningTime="2025-12-05 12:01:17.61705253 +0000 UTC m=+762.109767253" Dec 05 12:01:22 crc kubenswrapper[4763]: I1205 12:01:22.959512 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w8s98" Dec 05 12:01:22 crc kubenswrapper[4763]: I1205 12:01:22.959944 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w8s98" Dec 05 12:01:24 crc kubenswrapper[4763]: I1205 12:01:24.015931 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w8s98" podUID="9730875a-8f84-4f57-9f50-f83c08291fa7" containerName="registry-server" probeResult="failure" output=< Dec 05 12:01:24 crc kubenswrapper[4763]: timeout: failed to connect service ":50051" within 1s Dec 05 12:01:24 crc kubenswrapper[4763]: > Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.556479 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-zsfpr"] Dec 05 12:01:27 crc kubenswrapper[4763]: E1205 12:01:27.557055 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="552de8ea-aa26-40d4-a360-1eda3664ae62" containerName="pull" Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.557070 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="552de8ea-aa26-40d4-a360-1eda3664ae62" containerName="pull" Dec 05 12:01:27 crc kubenswrapper[4763]: E1205 12:01:27.557097 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="552de8ea-aa26-40d4-a360-1eda3664ae62" containerName="util" Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.557105 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="552de8ea-aa26-40d4-a360-1eda3664ae62" containerName="util" Dec 05 12:01:27 crc kubenswrapper[4763]: E1205 12:01:27.557119 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="552de8ea-aa26-40d4-a360-1eda3664ae62" 
containerName="extract" Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.557127 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="552de8ea-aa26-40d4-a360-1eda3664ae62" containerName="extract" Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.557241 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="552de8ea-aa26-40d4-a360-1eda3664ae62" containerName="extract" Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.557716 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-zsfpr" Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.559850 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.561697 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-lzv9m" Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.561719 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.569085 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-zsfpr"] Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.595817 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zftjf\" (UniqueName: \"kubernetes.io/projected/e1ccdc7d-9781-4086-b0a7-7a777c943bcb-kube-api-access-zftjf\") pod \"obo-prometheus-operator-668cf9dfbb-zsfpr\" (UID: \"e1ccdc7d-9781-4086-b0a7-7a777c943bcb\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-zsfpr" Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.682159 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-747bcf56cd-c6mzm"] Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.682831 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-747bcf56cd-c6mzm" Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.685062 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.685109 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-r8f7c" Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.693772 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-747bcf56cd-56v8c"] Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.694616 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-747bcf56cd-56v8c" Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.697159 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zftjf\" (UniqueName: \"kubernetes.io/projected/e1ccdc7d-9781-4086-b0a7-7a777c943bcb-kube-api-access-zftjf\") pod \"obo-prometheus-operator-668cf9dfbb-zsfpr\" (UID: \"e1ccdc7d-9781-4086-b0a7-7a777c943bcb\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-zsfpr" Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.697247 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e25a7208-54a3-4a23-a355-8bbd34b81ace-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-747bcf56cd-c6mzm\" (UID: \"e25a7208-54a3-4a23-a355-8bbd34b81ace\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-747bcf56cd-c6mzm" Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.697283 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e25a7208-54a3-4a23-a355-8bbd34b81ace-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-747bcf56cd-c6mzm\" (UID: \"e25a7208-54a3-4a23-a355-8bbd34b81ace\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-747bcf56cd-c6mzm" Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.720575 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-747bcf56cd-56v8c"] Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.746677 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zftjf\" (UniqueName: \"kubernetes.io/projected/e1ccdc7d-9781-4086-b0a7-7a777c943bcb-kube-api-access-zftjf\") pod \"obo-prometheus-operator-668cf9dfbb-zsfpr\" (UID: \"e1ccdc7d-9781-4086-b0a7-7a777c943bcb\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-zsfpr" Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.751302 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-747bcf56cd-c6mzm"] Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.798554 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e25a7208-54a3-4a23-a355-8bbd34b81ace-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-747bcf56cd-c6mzm\" (UID: \"e25a7208-54a3-4a23-a355-8bbd34b81ace\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-747bcf56cd-c6mzm" Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.798602 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e25a7208-54a3-4a23-a355-8bbd34b81ace-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-747bcf56cd-c6mzm\" (UID: \"e25a7208-54a3-4a23-a355-8bbd34b81ace\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-747bcf56cd-c6mzm" Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.798625 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e723fc3f-3161-40a0-becd-a17210dbd266-apiservice-cert\") pod 
\"obo-prometheus-operator-admission-webhook-747bcf56cd-56v8c\" (UID: \"e723fc3f-3161-40a0-becd-a17210dbd266\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-747bcf56cd-56v8c" Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.799402 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e723fc3f-3161-40a0-becd-a17210dbd266-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-747bcf56cd-56v8c\" (UID: \"e723fc3f-3161-40a0-becd-a17210dbd266\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-747bcf56cd-56v8c" Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.806524 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e25a7208-54a3-4a23-a355-8bbd34b81ace-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-747bcf56cd-c6mzm\" (UID: \"e25a7208-54a3-4a23-a355-8bbd34b81ace\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-747bcf56cd-c6mzm" Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.811218 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e25a7208-54a3-4a23-a355-8bbd34b81ace-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-747bcf56cd-c6mzm\" (UID: \"e25a7208-54a3-4a23-a355-8bbd34b81ace\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-747bcf56cd-c6mzm" Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.874993 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-zsfpr" Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.889004 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-djkw9"] Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.889753 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-djkw9" Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.894935 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-cqs9x" Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.900949 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e723fc3f-3161-40a0-becd-a17210dbd266-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-747bcf56cd-56v8c\" (UID: \"e723fc3f-3161-40a0-becd-a17210dbd266\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-747bcf56cd-56v8c" Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.901010 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/f8052a23-847b-4419-af86-e56c327c367b-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-djkw9\" (UID: \"f8052a23-847b-4419-af86-e56c327c367b\") " pod="openshift-operators/observability-operator-d8bb48f5d-djkw9" Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.901058 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e723fc3f-3161-40a0-becd-a17210dbd266-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-747bcf56cd-56v8c\" (UID: \"e723fc3f-3161-40a0-becd-a17210dbd266\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-747bcf56cd-56v8c" Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.901091 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69gmf\" (UniqueName: \"kubernetes.io/projected/f8052a23-847b-4419-af86-e56c327c367b-kube-api-access-69gmf\") pod \"observability-operator-d8bb48f5d-djkw9\" (UID: \"f8052a23-847b-4419-af86-e56c327c367b\") " pod="openshift-operators/observability-operator-d8bb48f5d-djkw9" Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.901618 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.905309 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e723fc3f-3161-40a0-becd-a17210dbd266-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-747bcf56cd-56v8c\" (UID: \"e723fc3f-3161-40a0-becd-a17210dbd266\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-747bcf56cd-56v8c" Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.908884 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e723fc3f-3161-40a0-becd-a17210dbd266-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-747bcf56cd-56v8c\" (UID: \"e723fc3f-3161-40a0-becd-a17210dbd266\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-747bcf56cd-56v8c" Dec 05 12:01:27 crc kubenswrapper[4763]: I1205 12:01:27.956816 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-djkw9"] Dec 05 12:01:28 crc kubenswrapper[4763]: I1205 12:01:28.012653 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-747bcf56cd-56v8c" Dec 05 12:01:28 crc kubenswrapper[4763]: I1205 12:01:28.013425 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69gmf\" (UniqueName: \"kubernetes.io/projected/f8052a23-847b-4419-af86-e56c327c367b-kube-api-access-69gmf\") pod \"observability-operator-d8bb48f5d-djkw9\" (UID: \"f8052a23-847b-4419-af86-e56c327c367b\") " pod="openshift-operators/observability-operator-d8bb48f5d-djkw9" Dec 05 12:01:28 crc kubenswrapper[4763]: I1205 12:01:28.013487 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/f8052a23-847b-4419-af86-e56c327c367b-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-djkw9\" (UID: \"f8052a23-847b-4419-af86-e56c327c367b\") " pod="openshift-operators/observability-operator-d8bb48f5d-djkw9" Dec 05 12:01:28 crc kubenswrapper[4763]: I1205 12:01:28.013986 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-747bcf56cd-c6mzm" Dec 05 12:01:28 crc kubenswrapper[4763]: I1205 12:01:28.017555 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/f8052a23-847b-4419-af86-e56c327c367b-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-djkw9\" (UID: \"f8052a23-847b-4419-af86-e56c327c367b\") " pod="openshift-operators/observability-operator-d8bb48f5d-djkw9" Dec 05 12:01:28 crc kubenswrapper[4763]: I1205 12:01:28.041572 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69gmf\" (UniqueName: \"kubernetes.io/projected/f8052a23-847b-4419-af86-e56c327c367b-kube-api-access-69gmf\") pod \"observability-operator-d8bb48f5d-djkw9\" (UID: \"f8052a23-847b-4419-af86-e56c327c367b\") " pod="openshift-operators/observability-operator-d8bb48f5d-djkw9" Dec 05 12:01:28 crc kubenswrapper[4763]: I1205 12:01:28.128276 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-zx5r7"] Dec 05 12:01:28 crc kubenswrapper[4763]: I1205 12:01:28.129119 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-zx5r7" Dec 05 12:01:28 crc kubenswrapper[4763]: I1205 12:01:28.134215 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-ztwqg" Dec 05 12:01:28 crc kubenswrapper[4763]: I1205 12:01:28.135876 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-zx5r7"] Dec 05 12:01:28 crc kubenswrapper[4763]: I1205 12:01:28.225442 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x866h\" (UniqueName: \"kubernetes.io/projected/1603ef68-55d9-49dc-bbe4-93b129fe1b29-kube-api-access-x866h\") pod \"perses-operator-5446b9c989-zx5r7\" (UID: \"1603ef68-55d9-49dc-bbe4-93b129fe1b29\") " pod="openshift-operators/perses-operator-5446b9c989-zx5r7" Dec 05 12:01:28 crc kubenswrapper[4763]: I1205 12:01:28.225523 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/1603ef68-55d9-49dc-bbe4-93b129fe1b29-openshift-service-ca\") pod \"perses-operator-5446b9c989-zx5r7\" (UID: \"1603ef68-55d9-49dc-bbe4-93b129fe1b29\") " pod="openshift-operators/perses-operator-5446b9c989-zx5r7" Dec 05 12:01:28 crc kubenswrapper[4763]: I1205 12:01:28.324145 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-djkw9" Dec 05 12:01:28 crc kubenswrapper[4763]: I1205 12:01:28.326841 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x866h\" (UniqueName: \"kubernetes.io/projected/1603ef68-55d9-49dc-bbe4-93b129fe1b29-kube-api-access-x866h\") pod \"perses-operator-5446b9c989-zx5r7\" (UID: \"1603ef68-55d9-49dc-bbe4-93b129fe1b29\") " pod="openshift-operators/perses-operator-5446b9c989-zx5r7" Dec 05 12:01:28 crc kubenswrapper[4763]: I1205 12:01:28.326944 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/1603ef68-55d9-49dc-bbe4-93b129fe1b29-openshift-service-ca\") pod \"perses-operator-5446b9c989-zx5r7\" (UID: \"1603ef68-55d9-49dc-bbe4-93b129fe1b29\") " pod="openshift-operators/perses-operator-5446b9c989-zx5r7" Dec 05 12:01:28 crc kubenswrapper[4763]: I1205 12:01:28.327961 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/1603ef68-55d9-49dc-bbe4-93b129fe1b29-openshift-service-ca\") pod \"perses-operator-5446b9c989-zx5r7\" (UID: \"1603ef68-55d9-49dc-bbe4-93b129fe1b29\") " pod="openshift-operators/perses-operator-5446b9c989-zx5r7" Dec 05 12:01:28 crc kubenswrapper[4763]: I1205 12:01:28.357556 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x866h\" (UniqueName: \"kubernetes.io/projected/1603ef68-55d9-49dc-bbe4-93b129fe1b29-kube-api-access-x866h\") pod \"perses-operator-5446b9c989-zx5r7\" (UID: \"1603ef68-55d9-49dc-bbe4-93b129fe1b29\") " pod="openshift-operators/perses-operator-5446b9c989-zx5r7" Dec 05 12:01:28 crc kubenswrapper[4763]: I1205 12:01:28.514772 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-zx5r7" Dec 05 12:01:28 crc kubenswrapper[4763]: I1205 12:01:28.592731 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-zsfpr"] Dec 05 12:01:28 crc kubenswrapper[4763]: I1205 12:01:28.645513 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-747bcf56cd-56v8c"] Dec 05 12:01:28 crc kubenswrapper[4763]: I1205 12:01:28.676309 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-747bcf56cd-c6mzm"] Dec 05 12:01:28 crc kubenswrapper[4763]: W1205 12:01:28.677522 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode25a7208_54a3_4a23_a355_8bbd34b81ace.slice/crio-460a84ee1b7dc978801dea078d6a91da6c962770a603d9470456575bed37a82b WatchSource:0}: Error finding container 460a84ee1b7dc978801dea078d6a91da6c962770a603d9470456575bed37a82b: Status 404 returned error can't find the container with id 460a84ee1b7dc978801dea078d6a91da6c962770a603d9470456575bed37a82b Dec 05 12:01:28 crc kubenswrapper[4763]: I1205 12:01:28.692158 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-zsfpr" event={"ID":"e1ccdc7d-9781-4086-b0a7-7a777c943bcb","Type":"ContainerStarted","Data":"6fd37c72adee23cb4b12d4311f5e2424ada9228678d7e17a6ba592a3d5b41e69"} Dec 05 12:01:28 crc kubenswrapper[4763]: I1205 12:01:28.696186 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-747bcf56cd-56v8c" event={"ID":"e723fc3f-3161-40a0-becd-a17210dbd266","Type":"ContainerStarted","Data":"3dcf95cb0c55c440481d6b4894bd47d1e5260e55e1b56a6dc0ae595eee664092"} Dec 05 12:01:28 crc kubenswrapper[4763]: I1205 12:01:28.753700 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-djkw9"] Dec 05 12:01:28 crc kubenswrapper[4763]: I1205 12:01:28.859748 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-zx5r7"] Dec 05 12:01:28 crc kubenswrapper[4763]: W1205 12:01:28.876923 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1603ef68_55d9_49dc_bbe4_93b129fe1b29.slice/crio-e24d28baf1d558962e5cc7f924e037e4ae1c42238976b315957927bc09fca9e0 WatchSource:0}: Error finding container e24d28baf1d558962e5cc7f924e037e4ae1c42238976b315957927bc09fca9e0: Status 404 returned error can't find the container with id e24d28baf1d558962e5cc7f924e037e4ae1c42238976b315957927bc09fca9e0 Dec 05 12:01:29 crc kubenswrapper[4763]: I1205 12:01:29.702654 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-zx5r7" event={"ID":"1603ef68-55d9-49dc-bbe4-93b129fe1b29","Type":"ContainerStarted","Data":"e24d28baf1d558962e5cc7f924e037e4ae1c42238976b315957927bc09fca9e0"} Dec 05 12:01:29 crc kubenswrapper[4763]: I1205 12:01:29.703968 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-djkw9" event={"ID":"f8052a23-847b-4419-af86-e56c327c367b","Type":"ContainerStarted","Data":"711cd3ee8586afb63fc6a2bcd25577b56b6537d167335b7bdfe257dc629d8a84"} Dec 05 12:01:29 crc kubenswrapper[4763]: I1205 12:01:29.705128 4763 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-747bcf56cd-c6mzm" event={"ID":"e25a7208-54a3-4a23-a355-8bbd34b81ace","Type":"ContainerStarted","Data":"460a84ee1b7dc978801dea078d6a91da6c962770a603d9470456575bed37a82b"} Dec 05 12:01:33 crc kubenswrapper[4763]: I1205 12:01:33.067873 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w8s98" Dec 05 12:01:33 crc kubenswrapper[4763]: I1205 12:01:33.184411 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w8s98" Dec 05 12:01:33 crc kubenswrapper[4763]: I1205 12:01:33.994690 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w8s98"] Dec 05 12:01:34 crc kubenswrapper[4763]: I1205 12:01:34.747423 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w8s98" podUID="9730875a-8f84-4f57-9f50-f83c08291fa7" containerName="registry-server" containerID="cri-o://0fd7dbe0a49310a4b9f3212d5157e3e0bd6811b71d816c96291a399d736ec42f" gracePeriod=2 Dec 05 12:01:35 crc kubenswrapper[4763]: I1205 12:01:35.198195 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w8s98" Dec 05 12:01:35 crc kubenswrapper[4763]: I1205 12:01:35.371889 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9730875a-8f84-4f57-9f50-f83c08291fa7-catalog-content\") pod \"9730875a-8f84-4f57-9f50-f83c08291fa7\" (UID: \"9730875a-8f84-4f57-9f50-f83c08291fa7\") " Dec 05 12:01:35 crc kubenswrapper[4763]: I1205 12:01:35.371990 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9730875a-8f84-4f57-9f50-f83c08291fa7-utilities\") pod \"9730875a-8f84-4f57-9f50-f83c08291fa7\" (UID: \"9730875a-8f84-4f57-9f50-f83c08291fa7\") " Dec 05 12:01:35 crc kubenswrapper[4763]: I1205 12:01:35.372042 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbgsz\" (UniqueName: \"kubernetes.io/projected/9730875a-8f84-4f57-9f50-f83c08291fa7-kube-api-access-xbgsz\") pod \"9730875a-8f84-4f57-9f50-f83c08291fa7\" (UID: \"9730875a-8f84-4f57-9f50-f83c08291fa7\") " Dec 05 12:01:35 crc kubenswrapper[4763]: I1205 12:01:35.373338 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9730875a-8f84-4f57-9f50-f83c08291fa7-utilities" (OuterVolumeSpecName: "utilities") pod "9730875a-8f84-4f57-9f50-f83c08291fa7" (UID: "9730875a-8f84-4f57-9f50-f83c08291fa7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:01:35 crc kubenswrapper[4763]: I1205 12:01:35.377263 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9730875a-8f84-4f57-9f50-f83c08291fa7-kube-api-access-xbgsz" (OuterVolumeSpecName: "kube-api-access-xbgsz") pod "9730875a-8f84-4f57-9f50-f83c08291fa7" (UID: "9730875a-8f84-4f57-9f50-f83c08291fa7"). InnerVolumeSpecName "kube-api-access-xbgsz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:01:35 crc kubenswrapper[4763]: I1205 12:01:35.473419 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9730875a-8f84-4f57-9f50-f83c08291fa7-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 12:01:35 crc kubenswrapper[4763]: I1205 12:01:35.473456 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbgsz\" (UniqueName: \"kubernetes.io/projected/9730875a-8f84-4f57-9f50-f83c08291fa7-kube-api-access-xbgsz\") on node \"crc\" DevicePath \"\"" Dec 05 12:01:35 crc kubenswrapper[4763]: I1205 12:01:35.494639 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9730875a-8f84-4f57-9f50-f83c08291fa7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9730875a-8f84-4f57-9f50-f83c08291fa7" (UID: "9730875a-8f84-4f57-9f50-f83c08291fa7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:01:35 crc kubenswrapper[4763]: I1205 12:01:35.574222 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9730875a-8f84-4f57-9f50-f83c08291fa7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 12:01:35 crc kubenswrapper[4763]: I1205 12:01:35.762012 4763 generic.go:334] "Generic (PLEG): container finished" podID="9730875a-8f84-4f57-9f50-f83c08291fa7" containerID="0fd7dbe0a49310a4b9f3212d5157e3e0bd6811b71d816c96291a399d736ec42f" exitCode=0 Dec 05 12:01:35 crc kubenswrapper[4763]: I1205 12:01:35.762059 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w8s98" Dec 05 12:01:35 crc kubenswrapper[4763]: I1205 12:01:35.762061 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8s98" event={"ID":"9730875a-8f84-4f57-9f50-f83c08291fa7","Type":"ContainerDied","Data":"0fd7dbe0a49310a4b9f3212d5157e3e0bd6811b71d816c96291a399d736ec42f"} Dec 05 12:01:35 crc kubenswrapper[4763]: I1205 12:01:35.762102 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8s98" event={"ID":"9730875a-8f84-4f57-9f50-f83c08291fa7","Type":"ContainerDied","Data":"93f289a16f9df54d6e7f48ba779e032b4ad1b112d3d44807e2ab206132646d34"} Dec 05 12:01:35 crc kubenswrapper[4763]: I1205 12:01:35.762122 4763 scope.go:117] "RemoveContainer" containerID="0fd7dbe0a49310a4b9f3212d5157e3e0bd6811b71d816c96291a399d736ec42f" Dec 05 12:01:35 crc kubenswrapper[4763]: I1205 12:01:35.800672 4763 scope.go:117] "RemoveContainer" containerID="4d16eab0f1e7406b9e76bdfad0f3930a774592f4239532987a616f570af6ef86" Dec 05 12:01:35 crc kubenswrapper[4763]: I1205 12:01:35.876914 4763 scope.go:117] "RemoveContainer" containerID="585ef1387a10879cecbea6e55038e85d9c84559ce1095d5e2ef3616a005bb5bf" Dec 05 12:01:35 crc kubenswrapper[4763]: I1205 12:01:35.902968 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w8s98"] Dec 05 12:01:35 crc kubenswrapper[4763]: I1205 12:01:35.905221 4763 scope.go:117] "RemoveContainer" containerID="0fd7dbe0a49310a4b9f3212d5157e3e0bd6811b71d816c96291a399d736ec42f" Dec 05 12:01:35 crc kubenswrapper[4763]: E1205 12:01:35.905769 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fd7dbe0a49310a4b9f3212d5157e3e0bd6811b71d816c96291a399d736ec42f\": container 
with ID starting with 0fd7dbe0a49310a4b9f3212d5157e3e0bd6811b71d816c96291a399d736ec42f not found: ID does not exist" containerID="0fd7dbe0a49310a4b9f3212d5157e3e0bd6811b71d816c96291a399d736ec42f" Dec 05 12:01:35 crc kubenswrapper[4763]: I1205 12:01:35.905815 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fd7dbe0a49310a4b9f3212d5157e3e0bd6811b71d816c96291a399d736ec42f"} err="failed to get container status \"0fd7dbe0a49310a4b9f3212d5157e3e0bd6811b71d816c96291a399d736ec42f\": rpc error: code = NotFound desc = could not find container \"0fd7dbe0a49310a4b9f3212d5157e3e0bd6811b71d816c96291a399d736ec42f\": container with ID starting with 0fd7dbe0a49310a4b9f3212d5157e3e0bd6811b71d816c96291a399d736ec42f not found: ID does not exist" Dec 05 12:01:35 crc kubenswrapper[4763]: I1205 12:01:35.905844 4763 scope.go:117] "RemoveContainer" containerID="4d16eab0f1e7406b9e76bdfad0f3930a774592f4239532987a616f570af6ef86" Dec 05 12:01:35 crc kubenswrapper[4763]: E1205 12:01:35.906472 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d16eab0f1e7406b9e76bdfad0f3930a774592f4239532987a616f570af6ef86\": container with ID starting with 4d16eab0f1e7406b9e76bdfad0f3930a774592f4239532987a616f570af6ef86 not found: ID does not exist" containerID="4d16eab0f1e7406b9e76bdfad0f3930a774592f4239532987a616f570af6ef86" Dec 05 12:01:35 crc kubenswrapper[4763]: I1205 12:01:35.906530 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d16eab0f1e7406b9e76bdfad0f3930a774592f4239532987a616f570af6ef86"} err="failed to get container status \"4d16eab0f1e7406b9e76bdfad0f3930a774592f4239532987a616f570af6ef86\": rpc error: code = NotFound desc = could not find container \"4d16eab0f1e7406b9e76bdfad0f3930a774592f4239532987a616f570af6ef86\": container with ID starting with 4d16eab0f1e7406b9e76bdfad0f3930a774592f4239532987a616f570af6ef86 not found: ID does not exist" Dec 05 12:01:35 crc kubenswrapper[4763]: I1205 12:01:35.906557 4763 scope.go:117] "RemoveContainer" containerID="585ef1387a10879cecbea6e55038e85d9c84559ce1095d5e2ef3616a005bb5bf" Dec 05 12:01:35 crc kubenswrapper[4763]: I1205 12:01:35.909274 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w8s98"] Dec 05 12:01:35 crc kubenswrapper[4763]: E1205 12:01:35.911870 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"585ef1387a10879cecbea6e55038e85d9c84559ce1095d5e2ef3616a005bb5bf\": container with ID starting with 585ef1387a10879cecbea6e55038e85d9c84559ce1095d5e2ef3616a005bb5bf not found: ID does not exist" containerID="585ef1387a10879cecbea6e55038e85d9c84559ce1095d5e2ef3616a005bb5bf" Dec 05 12:01:35 crc kubenswrapper[4763]: I1205 12:01:35.911920 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"585ef1387a10879cecbea6e55038e85d9c84559ce1095d5e2ef3616a005bb5bf"} err="failed to get container status \"585ef1387a10879cecbea6e55038e85d9c84559ce1095d5e2ef3616a005bb5bf\": rpc error: code = NotFound desc = could not find container \"585ef1387a10879cecbea6e55038e85d9c84559ce1095d5e2ef3616a005bb5bf\": container with ID starting with 585ef1387a10879cecbea6e55038e85d9c84559ce1095d5e2ef3616a005bb5bf not found: ID does not exist" Dec 05 12:01:36 crc kubenswrapper[4763]: I1205 12:01:36.057413 4763 scope.go:117] "RemoveContainer" 
containerID="a6e4880a3acae535beea88e11057496d98b2ddd8d2161f75c52dd4527425a83d" Dec 05 12:01:36 crc kubenswrapper[4763]: I1205 12:01:36.770460 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kwkp4_737ae453-c22e-41ea-a10e-7e8f1f165467/kube-multus/2.log" Dec 05 12:01:37 crc kubenswrapper[4763]: I1205 12:01:37.544115 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 12:01:37 crc kubenswrapper[4763]: I1205 12:01:37.544178 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 12:01:37 crc kubenswrapper[4763]: I1205 12:01:37.544225 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" Dec 05 12:01:37 crc kubenswrapper[4763]: I1205 12:01:37.544799 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0c4feb1d4ef447a6746967b0703364de0643a9adc93a83a07473a63b712b66d6"} pod="openshift-machine-config-operator/machine-config-daemon-xpgln" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 12:01:37 crc kubenswrapper[4763]: I1205 12:01:37.544852 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" containerID="cri-o://0c4feb1d4ef447a6746967b0703364de0643a9adc93a83a07473a63b712b66d6" gracePeriod=600 Dec 05 12:01:37 crc kubenswrapper[4763]: I1205 12:01:37.777212 4763 generic.go:334] "Generic (PLEG): container finished" podID="96338136-6831-49d0-9eb9-77d1205c6afb" containerID="0c4feb1d4ef447a6746967b0703364de0643a9adc93a83a07473a63b712b66d6" exitCode=0 Dec 05 12:01:37 crc kubenswrapper[4763]: I1205 12:01:37.777551 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerDied","Data":"0c4feb1d4ef447a6746967b0703364de0643a9adc93a83a07473a63b712b66d6"} Dec 05 12:01:37 crc kubenswrapper[4763]: I1205 12:01:37.777586 4763 scope.go:117] "RemoveContainer" containerID="c000a81bc5dd100f4b8e26b1306b8f7ef21c0b2a127a88c01cbb81f5e387628c" Dec 05 12:01:37 crc kubenswrapper[4763]: I1205 12:01:37.794403 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9730875a-8f84-4f57-9f50-f83c08291fa7" path="/var/lib/kubelet/pods/9730875a-8f84-4f57-9f50-f83c08291fa7/volumes" Dec 05 12:01:45 crc kubenswrapper[4763]: E1205 12:01:45.154910 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9730875a_8f84_4f57_9f50_f83c08291fa7.slice/crio-93f289a16f9df54d6e7f48ba779e032b4ad1b112d3d44807e2ab206132646d34\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9730875a_8f84_4f57_9f50_f83c08291fa7.slice\": RecentStats: unable to find data in memory cache]" Dec 05 12:01:48 crc kubenswrapper[4763]: E1205 12:01:48.455156 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:ce7d2904f7b238aa37dfe74a0b76bf73629e7a14fa52bf54b0ecf030ca36f1bb" Dec 05 12:01:48 crc kubenswrapper[4763]: E1205 12:01:48.455665 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:ce7d2904f7b238aa37dfe74a0b76bf73629e7a14fa52bf54b0ecf030ca36f1bb,Command:[],Args:[--namespace=$(NAMESPACE) --images=perses=$(RELATED_IMAGE_PERSES) --images=alertmanager=$(RELATED_IMAGE_ALERTMANAGER) --images=prometheus=$(RELATED_IMAGE_PROMETHEUS) --images=thanos=$(RELATED_IMAGE_THANOS) --images=ui-dashboards=$(RELATED_IMAGE_CONSOLE_DASHBOARDS_PLUGIN) --images=ui-distributed-tracing=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN) --images=ui-distributed-tracing-pf5=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF5) --images=ui-distributed-tracing-pf4=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF4) --images=ui-logging=$(RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN) --images=ui-logging-pf4=$(RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN_PF4) --images=ui-troubleshooting-panel=$(RELATED_IMAGE_CONSOLE_TROUBLESHOOTING_PANEL_PLUGIN) --images=ui-monitoring=$(RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN) --images=ui-monitoring-pf5=$(RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN_PF5) --images=korrel8r=$(RELATED_IMAGE_KORREL8R) --images=health-analyzer=$(RELATED_IMAGE_CLUSTER_HEALTH_ANALYZER) 
--openshift.enabled=true],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:RELATED_IMAGE_ALERTMANAGER,Value:registry.redhat.io/cluster-observability-operator/alertmanager-rhel9@sha256:e718854a7d6ca8accf0fa72db0eb902e46c44d747ad51dc3f06bba0cefaa3c01,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS,Value:registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:17ea20be390a94ab39f5cdd7f0cbc2498046eebcf77fe3dec9aa288d5c2cf46b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_THANOS,Value:registry.redhat.io/cluster-observability-operator/thanos-rhel9@sha256:d972f4faa5e9c121402d23ed85002f26af48ec36b1b71a7489d677b3913d08b4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PERSES,Value:registry.redhat.io/cluster-observability-operator/perses-rhel9@sha256:91531137fc1dcd740e277e0f65e120a0176a16f788c14c27925b61aa0b792ade,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DASHBOARDS_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/dashboards-console-plugin-rhel9@sha256:a69da8bbca8a28dd2925f864d51cc31cf761b10532c553095ba40b242ef701cb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-rhel9@sha256:897e1bfad1187062725b54d87107bd0155972257a50d8335dd29e1999b828a4f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF5,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-pf5-rhel9@sha256:95fe5b5746ca8c07ac9217ce2d8ac8e6afad17af210f9d8e0074df1310b209a8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF4,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-pf4-rhel9@sha256:e9d9a89e4d8126a62b1852055482258ee528cac6398dd5d43ebad75ace0f33c9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/logging-console-plugin-rhel9@sha256:ec684a0645ceb917b019af7ddba68c3533416e356ab0d0320a30e75ca7ebb31b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN_PF4,Value:registry.redhat.io/cluster-observability-operator/logging-console-plugin-pf4-rhel9@sha256:3b9693fcde9b3a9494fb04735b1f7cfd0426f10be820fdc3f024175c0d3df1c9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_TROUBLESHOOTING_PANEL_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/troubleshooting-panel-console-plugin-rhel9@sha256:580606f194180accc8abba099e17a26dca7522ec6d233fa2fdd40312771703e3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/monitoring-console-plugin-rhel9@sha256:e03777be39e71701935059cd877603874a13ac94daa73219d4e5e545599d78a9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN_PF5,Value:registry.redhat.io/cluster-observability-operator/monitoring-console-plugin-pf5-rhel9@sha256:aa47256193cfd2877853878e1ae97d2ab8b8e5deae62b387cbfad02b284d379c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KORREL8R,Value:registry.redhat.io/cluster-observability-operator/korrel8r-rhel9@sha256:c595ff56b2cb85514bf4784db6ddb82e4e657e3e708a7fb695fc4997379a94d4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLUSTER_HEALTH_ANALYZER,Value:registry.redhat.io/cluster-observability-operator/cluster-health-analyzer-rhel9@sha256:45a4ec2a519bcec99e886aa91
596d5356a2414a2bd103baaef9fa7838c672eb2,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{400 -3} {} 400m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:observability-operator-tls,ReadOnly:true,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-69gmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod observability-operator-d8bb48f5d-djkw9_openshift-operators(f8052a23-847b-4419-af86-e56c327c367b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 12:01:48 crc kubenswrapper[4763]: E1205 12:01:48.457102 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/observability-operator-d8bb48f5d-djkw9" podUID="f8052a23-847b-4419-af86-e56c327c367b" Dec 05 12:01:48 crc kubenswrapper[4763]: I1205 12:01:48.870471 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-zx5r7" event={"ID":"1603ef68-55d9-49dc-bbe4-93b129fe1b29","Type":"ContainerStarted","Data":"145a5b5b88deae029f25df08df4a812185b0712a6086fce65feacfdd0e5683b5"} Dec 05 12:01:48 crc kubenswrapper[4763]: I1205 12:01:48.870837 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-zx5r7" Dec 05 12:01:48 crc kubenswrapper[4763]: I1205 12:01:48.875836 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-747bcf56cd-56v8c" event={"ID":"e723fc3f-3161-40a0-becd-a17210dbd266","Type":"ContainerStarted","Data":"cccd132759c27fa643a3ef42ba487a281c55a38fe2d7a4412f193a90396fffac"} Dec 05 12:01:48 crc kubenswrapper[4763]: 
I1205 12:01:48.879220 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-zsfpr" event={"ID":"e1ccdc7d-9781-4086-b0a7-7a777c943bcb","Type":"ContainerStarted","Data":"1e8bd365b3b21c746639ac35a09bb64f69ba8b96a2c6c680409e213765870c1c"} Dec 05 12:01:48 crc kubenswrapper[4763]: I1205 12:01:48.880973 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-747bcf56cd-c6mzm" event={"ID":"e25a7208-54a3-4a23-a355-8bbd34b81ace","Type":"ContainerStarted","Data":"af942fdd8297c9dc4db0d640b62c00a80f9647a05a5e52232b594a75baa98ff0"} Dec 05 12:01:48 crc kubenswrapper[4763]: I1205 12:01:48.884583 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerStarted","Data":"d9f5daa13f390f2b68ce52c3ddbc0360f2ce72002e23d581fe40bd421b3cff77"} Dec 05 12:01:48 crc kubenswrapper[4763]: E1205 12:01:48.885395 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:ce7d2904f7b238aa37dfe74a0b76bf73629e7a14fa52bf54b0ecf030ca36f1bb\\\"\"" pod="openshift-operators/observability-operator-d8bb48f5d-djkw9" podUID="f8052a23-847b-4419-af86-e56c327c367b" Dec 05 12:01:48 crc kubenswrapper[4763]: I1205 12:01:48.904283 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-zx5r7" podStartSLOduration=1.281612128 podStartE2EDuration="20.904263168s" podCreationTimestamp="2025-12-05 12:01:28 +0000 UTC" firstStartedPulling="2025-12-05 12:01:28.878669723 +0000 UTC m=+773.371384446" lastFinishedPulling="2025-12-05 12:01:48.501320773 +0000 UTC m=+792.994035486" observedRunningTime="2025-12-05 12:01:48.892141324 +0000 UTC m=+793.384856087" watchObservedRunningTime="2025-12-05 12:01:48.904263168 +0000 UTC m=+793.396977901" Dec 05 12:01:48 crc kubenswrapper[4763]: I1205 12:01:48.915244 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-zsfpr" podStartSLOduration=2.032076913 podStartE2EDuration="21.915227298s" podCreationTimestamp="2025-12-05 12:01:27 +0000 UTC" firstStartedPulling="2025-12-05 12:01:28.605222686 +0000 UTC m=+773.097937409" lastFinishedPulling="2025-12-05 12:01:48.488373071 +0000 UTC m=+792.981087794" observedRunningTime="2025-12-05 12:01:48.912898782 +0000 UTC m=+793.405613525" watchObservedRunningTime="2025-12-05 12:01:48.915227298 +0000 UTC m=+793.407942041" Dec 05 12:01:48 crc kubenswrapper[4763]: I1205 12:01:48.970146 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-747bcf56cd-c6mzm" podStartSLOduration=2.137725529 podStartE2EDuration="21.970125791s" podCreationTimestamp="2025-12-05 12:01:27 +0000 UTC" firstStartedPulling="2025-12-05 12:01:28.682818038 +0000 UTC m=+773.175532761" lastFinishedPulling="2025-12-05 12:01:48.5152183 +0000 UTC m=+793.007933023" observedRunningTime="2025-12-05 12:01:48.965599516 +0000 UTC m=+793.458314259" watchObservedRunningTime="2025-12-05 12:01:48.970125791 +0000 UTC m=+793.462840534" Dec 05 12:01:48 crc kubenswrapper[4763]: I1205 12:01:48.988316 4763 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-747bcf56cd-56v8c" podStartSLOduration=2.149997979 podStartE2EDuration="21.988293807s" podCreationTimestamp="2025-12-05 12:01:27 +0000 UTC" firstStartedPulling="2025-12-05 12:01:28.64824997 +0000 UTC m=+773.140964683" lastFinishedPulling="2025-12-05 12:01:48.486545778 +0000 UTC m=+792.979260511" observedRunningTime="2025-12-05 12:01:48.985649065 +0000 UTC m=+793.478363798" watchObservedRunningTime="2025-12-05 12:01:48.988293807 +0000 UTC m=+793.481008530" Dec 05 12:01:55 crc kubenswrapper[4763]: E1205 12:01:55.299362 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9730875a_8f84_4f57_9f50_f83c08291fa7.slice/crio-93f289a16f9df54d6e7f48ba779e032b4ad1b112d3d44807e2ab206132646d34\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9730875a_8f84_4f57_9f50_f83c08291fa7.slice\": RecentStats: unable to find data in memory cache]" Dec 05 12:01:58 crc kubenswrapper[4763]: I1205 12:01:58.518068 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-zx5r7" Dec 05 12:02:04 crc kubenswrapper[4763]: I1205 12:02:04.969110 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-djkw9" event={"ID":"f8052a23-847b-4419-af86-e56c327c367b","Type":"ContainerStarted","Data":"c3520ceb0e85c524a2f5f64c8bd0d922348bba4d721a0ada3d2d68ab46a68a74"} Dec 05 12:02:04 crc kubenswrapper[4763]: I1205 12:02:04.970638 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-djkw9" Dec 05 12:02:04 crc kubenswrapper[4763]: I1205 12:02:04.971940 4763 patch_prober.go:28] interesting pod/observability-operator-d8bb48f5d-djkw9 container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.39:8081/healthz\": dial tcp 10.217.0.39:8081: connect: connection refused" start-of-body= Dec 05 12:02:04 crc kubenswrapper[4763]: I1205 12:02:04.972005 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-d8bb48f5d-djkw9" podUID="f8052a23-847b-4419-af86-e56c327c367b" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.39:8081/healthz\": dial tcp 10.217.0.39:8081: connect: connection refused" Dec 05 12:02:04 crc kubenswrapper[4763]: I1205 12:02:04.990034 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-djkw9" podStartSLOduration=2.035876789 podStartE2EDuration="37.990011621s" podCreationTimestamp="2025-12-05 12:01:27 +0000 UTC" firstStartedPulling="2025-12-05 12:01:28.797565941 +0000 UTC m=+773.290280664" lastFinishedPulling="2025-12-05 12:02:04.751700763 +0000 UTC m=+809.244415496" observedRunningTime="2025-12-05 12:02:04.986350374 +0000 UTC m=+809.479065107" watchObservedRunningTime="2025-12-05 12:02:04.990011621 +0000 UTC m=+809.482726354" Dec 05 12:02:05 crc kubenswrapper[4763]: E1205 12:02:05.454507 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9730875a_8f84_4f57_9f50_f83c08291fa7.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9730875a_8f84_4f57_9f50_f83c08291fa7.slice/crio-93f289a16f9df54d6e7f48ba779e032b4ad1b112d3d44807e2ab206132646d34\": RecentStats: unable to find data in memory cache]" Dec 05 12:02:05 crc kubenswrapper[4763]: I1205 12:02:05.977072 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-djkw9" Dec 05 12:02:15 crc kubenswrapper[4763]: E1205 12:02:15.607465 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9730875a_8f84_4f57_9f50_f83c08291fa7.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9730875a_8f84_4f57_9f50_f83c08291fa7.slice/crio-93f289a16f9df54d6e7f48ba779e032b4ad1b112d3d44807e2ab206132646d34\": RecentStats: unable to find data in memory cache]" Dec 05 12:02:23 crc kubenswrapper[4763]: I1205 12:02:23.659986 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr"] Dec 05 12:02:23 crc kubenswrapper[4763]: E1205 12:02:23.660799 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9730875a-8f84-4f57-9f50-f83c08291fa7" containerName="registry-server" Dec 05 12:02:23 crc kubenswrapper[4763]: I1205 12:02:23.660813 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9730875a-8f84-4f57-9f50-f83c08291fa7" containerName="registry-server" Dec 05 12:02:23 crc kubenswrapper[4763]: E1205 12:02:23.660823 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9730875a-8f84-4f57-9f50-f83c08291fa7" containerName="extract-utilities" Dec 05 12:02:23 crc kubenswrapper[4763]: I1205 12:02:23.660829 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9730875a-8f84-4f57-9f50-f83c08291fa7" containerName="extract-utilities" Dec 05 12:02:23 crc kubenswrapper[4763]: E1205 12:02:23.660845 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9730875a-8f84-4f57-9f50-f83c08291fa7" containerName="extract-content" Dec 05 12:02:23 crc kubenswrapper[4763]: I1205 12:02:23.660853 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9730875a-8f84-4f57-9f50-f83c08291fa7" containerName="extract-content" Dec 05 12:02:23 crc kubenswrapper[4763]: I1205 12:02:23.660943 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="9730875a-8f84-4f57-9f50-f83c08291fa7" containerName="registry-server" Dec 05 12:02:23 crc kubenswrapper[4763]: I1205 12:02:23.661734 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr" Dec 05 12:02:23 crc kubenswrapper[4763]: I1205 12:02:23.664664 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 05 12:02:23 crc kubenswrapper[4763]: I1205 12:02:23.670843 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr"] Dec 05 12:02:23 crc kubenswrapper[4763]: I1205 12:02:23.727469 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b7d19c02-ed03-4e76-951f-2032e0f23c7a-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr\" (UID: \"b7d19c02-ed03-4e76-951f-2032e0f23c7a\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr" Dec 05 12:02:23 crc kubenswrapper[4763]: I1205 12:02:23.727851 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7d19c02-ed03-4e76-951f-2032e0f23c7a-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr\" (UID: \"b7d19c02-ed03-4e76-951f-2032e0f23c7a\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr" Dec 05 12:02:23 crc kubenswrapper[4763]: I1205 12:02:23.727904 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqg8g\" (UniqueName: \"kubernetes.io/projected/b7d19c02-ed03-4e76-951f-2032e0f23c7a-kube-api-access-lqg8g\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr\" (UID: \"b7d19c02-ed03-4e76-951f-2032e0f23c7a\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr" Dec 05 12:02:23 crc kubenswrapper[4763]: I1205 12:02:23.829466 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b7d19c02-ed03-4e76-951f-2032e0f23c7a-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr\" (UID: \"b7d19c02-ed03-4e76-951f-2032e0f23c7a\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr" Dec 05 12:02:23 crc kubenswrapper[4763]: I1205 12:02:23.829522 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7d19c02-ed03-4e76-951f-2032e0f23c7a-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr\" (UID: \"b7d19c02-ed03-4e76-951f-2032e0f23c7a\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr" Dec 05 12:02:23 crc kubenswrapper[4763]: I1205 12:02:23.829567 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqg8g\" (UniqueName: \"kubernetes.io/projected/b7d19c02-ed03-4e76-951f-2032e0f23c7a-kube-api-access-lqg8g\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr\" (UID: \"b7d19c02-ed03-4e76-951f-2032e0f23c7a\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr" Dec 05 12:02:23 crc kubenswrapper[4763]: I1205 12:02:23.830487 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/b7d19c02-ed03-4e76-951f-2032e0f23c7a-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr\" (UID: \"b7d19c02-ed03-4e76-951f-2032e0f23c7a\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr" Dec 05 12:02:23 crc kubenswrapper[4763]: I1205 12:02:23.830491 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7d19c02-ed03-4e76-951f-2032e0f23c7a-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr\" (UID: \"b7d19c02-ed03-4e76-951f-2032e0f23c7a\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr" Dec 05 12:02:23 crc kubenswrapper[4763]: I1205 12:02:23.849693 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqg8g\" (UniqueName: \"kubernetes.io/projected/b7d19c02-ed03-4e76-951f-2032e0f23c7a-kube-api-access-lqg8g\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr\" (UID: \"b7d19c02-ed03-4e76-951f-2032e0f23c7a\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr" Dec 05 12:02:23 crc kubenswrapper[4763]: I1205 12:02:23.982722 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr" Dec 05 12:02:24 crc kubenswrapper[4763]: I1205 12:02:24.196262 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr"] Dec 05 12:02:25 crc kubenswrapper[4763]: I1205 12:02:25.081926 4763 generic.go:334] "Generic (PLEG): container finished" podID="b7d19c02-ed03-4e76-951f-2032e0f23c7a" containerID="e7cef76c227df473800b9db0569fda05a694f610296f51efcaac8bb2bf25ba24" exitCode=0 Dec 05 12:02:25 crc kubenswrapper[4763]: I1205 12:02:25.082007 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr" event={"ID":"b7d19c02-ed03-4e76-951f-2032e0f23c7a","Type":"ContainerDied","Data":"e7cef76c227df473800b9db0569fda05a694f610296f51efcaac8bb2bf25ba24"} Dec 05 12:02:25 crc kubenswrapper[4763]: I1205 12:02:25.083205 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr" event={"ID":"b7d19c02-ed03-4e76-951f-2032e0f23c7a","Type":"ContainerStarted","Data":"9377428a0169f559429546ccc17af78395a5812d63f3aa6be1a48be5054d7f2e"} Dec 05 12:02:25 crc kubenswrapper[4763]: E1205 12:02:25.783255 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9730875a_8f84_4f57_9f50_f83c08291fa7.slice/crio-93f289a16f9df54d6e7f48ba779e032b4ad1b112d3d44807e2ab206132646d34\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9730875a_8f84_4f57_9f50_f83c08291fa7.slice\": RecentStats: unable to find data in memory cache]" Dec 05 12:02:28 crc kubenswrapper[4763]: I1205 12:02:28.105297 4763 generic.go:334] "Generic (PLEG): container finished" podID="b7d19c02-ed03-4e76-951f-2032e0f23c7a" containerID="c28d685d6d6e7b632513bfc1f7feb39ed7a4e5263d6d343a90e213b38c4ef7c1" exitCode=0 Dec 05 12:02:28 crc kubenswrapper[4763]: I1205 12:02:28.105489 4763 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr" event={"ID":"b7d19c02-ed03-4e76-951f-2032e0f23c7a","Type":"ContainerDied","Data":"c28d685d6d6e7b632513bfc1f7feb39ed7a4e5263d6d343a90e213b38c4ef7c1"} Dec 05 12:02:29 crc kubenswrapper[4763]: I1205 12:02:29.113291 4763 generic.go:334] "Generic (PLEG): container finished" podID="b7d19c02-ed03-4e76-951f-2032e0f23c7a" containerID="6e085af78e0b8880c532183dcf8239d5c7368ecbbb8822b030f3a988a018b64c" exitCode=0 Dec 05 12:02:29 crc kubenswrapper[4763]: I1205 12:02:29.113351 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr" event={"ID":"b7d19c02-ed03-4e76-951f-2032e0f23c7a","Type":"ContainerDied","Data":"6e085af78e0b8880c532183dcf8239d5c7368ecbbb8822b030f3a988a018b64c"} Dec 05 12:02:30 crc kubenswrapper[4763]: I1205 12:02:30.351878 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr" Dec 05 12:02:30 crc kubenswrapper[4763]: I1205 12:02:30.422602 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7d19c02-ed03-4e76-951f-2032e0f23c7a-bundle\") pod \"b7d19c02-ed03-4e76-951f-2032e0f23c7a\" (UID: \"b7d19c02-ed03-4e76-951f-2032e0f23c7a\") " Dec 05 12:02:30 crc kubenswrapper[4763]: I1205 12:02:30.422681 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqg8g\" (UniqueName: \"kubernetes.io/projected/b7d19c02-ed03-4e76-951f-2032e0f23c7a-kube-api-access-lqg8g\") pod \"b7d19c02-ed03-4e76-951f-2032e0f23c7a\" (UID: \"b7d19c02-ed03-4e76-951f-2032e0f23c7a\") " Dec 05 12:02:30 crc kubenswrapper[4763]: I1205 12:02:30.422741 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b7d19c02-ed03-4e76-951f-2032e0f23c7a-util\") pod \"b7d19c02-ed03-4e76-951f-2032e0f23c7a\" (UID: \"b7d19c02-ed03-4e76-951f-2032e0f23c7a\") " Dec 05 12:02:30 crc kubenswrapper[4763]: I1205 12:02:30.423560 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7d19c02-ed03-4e76-951f-2032e0f23c7a-bundle" (OuterVolumeSpecName: "bundle") pod "b7d19c02-ed03-4e76-951f-2032e0f23c7a" (UID: "b7d19c02-ed03-4e76-951f-2032e0f23c7a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:02:30 crc kubenswrapper[4763]: I1205 12:02:30.429952 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7d19c02-ed03-4e76-951f-2032e0f23c7a-kube-api-access-lqg8g" (OuterVolumeSpecName: "kube-api-access-lqg8g") pod "b7d19c02-ed03-4e76-951f-2032e0f23c7a" (UID: "b7d19c02-ed03-4e76-951f-2032e0f23c7a"). InnerVolumeSpecName "kube-api-access-lqg8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:02:30 crc kubenswrapper[4763]: I1205 12:02:30.436749 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7d19c02-ed03-4e76-951f-2032e0f23c7a-util" (OuterVolumeSpecName: "util") pod "b7d19c02-ed03-4e76-951f-2032e0f23c7a" (UID: "b7d19c02-ed03-4e76-951f-2032e0f23c7a"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:02:30 crc kubenswrapper[4763]: I1205 12:02:30.524108 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqg8g\" (UniqueName: \"kubernetes.io/projected/b7d19c02-ed03-4e76-951f-2032e0f23c7a-kube-api-access-lqg8g\") on node \"crc\" DevicePath \"\"" Dec 05 12:02:30 crc kubenswrapper[4763]: I1205 12:02:30.524148 4763 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b7d19c02-ed03-4e76-951f-2032e0f23c7a-util\") on node \"crc\" DevicePath \"\"" Dec 05 12:02:30 crc kubenswrapper[4763]: I1205 12:02:30.524162 4763 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7d19c02-ed03-4e76-951f-2032e0f23c7a-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:02:31 crc kubenswrapper[4763]: I1205 12:02:31.129248 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr" event={"ID":"b7d19c02-ed03-4e76-951f-2032e0f23c7a","Type":"ContainerDied","Data":"9377428a0169f559429546ccc17af78395a5812d63f3aa6be1a48be5054d7f2e"} Dec 05 12:02:31 crc kubenswrapper[4763]: I1205 12:02:31.129310 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9377428a0169f559429546ccc17af78395a5812d63f3aa6be1a48be5054d7f2e" Dec 05 12:02:31 crc kubenswrapper[4763]: I1205 12:02:31.129651 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr" Dec 05 12:02:32 crc kubenswrapper[4763]: I1205 12:02:32.323562 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-72lb2"] Dec 05 12:02:32 crc kubenswrapper[4763]: E1205 12:02:32.324667 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7d19c02-ed03-4e76-951f-2032e0f23c7a" containerName="pull" Dec 05 12:02:32 crc kubenswrapper[4763]: I1205 12:02:32.324831 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7d19c02-ed03-4e76-951f-2032e0f23c7a" containerName="pull" Dec 05 12:02:32 crc kubenswrapper[4763]: E1205 12:02:32.324955 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7d19c02-ed03-4e76-951f-2032e0f23c7a" containerName="util" Dec 05 12:02:32 crc kubenswrapper[4763]: I1205 12:02:32.325057 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7d19c02-ed03-4e76-951f-2032e0f23c7a" containerName="util" Dec 05 12:02:32 crc kubenswrapper[4763]: E1205 12:02:32.325168 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7d19c02-ed03-4e76-951f-2032e0f23c7a" containerName="extract" Dec 05 12:02:32 crc kubenswrapper[4763]: I1205 12:02:32.325270 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7d19c02-ed03-4e76-951f-2032e0f23c7a" containerName="extract" Dec 05 12:02:32 crc kubenswrapper[4763]: I1205 12:02:32.325534 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7d19c02-ed03-4e76-951f-2032e0f23c7a" containerName="extract" Dec 05 12:02:32 crc kubenswrapper[4763]: I1205 12:02:32.326142 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-72lb2" Dec 05 12:02:32 crc kubenswrapper[4763]: I1205 12:02:32.327983 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-rd4hx" Dec 05 12:02:32 crc kubenswrapper[4763]: I1205 12:02:32.328105 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 05 12:02:32 crc kubenswrapper[4763]: I1205 12:02:32.328494 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 05 12:02:32 crc kubenswrapper[4763]: I1205 12:02:32.337348 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-72lb2"] Dec 05 12:02:32 crc kubenswrapper[4763]: I1205 12:02:32.445795 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxh9l\" (UniqueName: \"kubernetes.io/projected/a5db489a-42dd-46c0-825d-5dc7065c9f29-kube-api-access-sxh9l\") pod \"nmstate-operator-5b5b58f5c8-72lb2\" (UID: \"a5db489a-42dd-46c0-825d-5dc7065c9f29\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-72lb2" Dec 05 12:02:32 crc kubenswrapper[4763]: I1205 12:02:32.547913 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxh9l\" (UniqueName: \"kubernetes.io/projected/a5db489a-42dd-46c0-825d-5dc7065c9f29-kube-api-access-sxh9l\") pod \"nmstate-operator-5b5b58f5c8-72lb2\" (UID: \"a5db489a-42dd-46c0-825d-5dc7065c9f29\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-72lb2" Dec 05 12:02:32 crc kubenswrapper[4763]: I1205 12:02:32.568594 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxh9l\" (UniqueName: \"kubernetes.io/projected/a5db489a-42dd-46c0-825d-5dc7065c9f29-kube-api-access-sxh9l\") pod \"nmstate-operator-5b5b58f5c8-72lb2\" (UID: \"a5db489a-42dd-46c0-825d-5dc7065c9f29\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-72lb2" Dec 05 12:02:32 crc kubenswrapper[4763]: I1205 12:02:32.647447 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-72lb2" Dec 05 12:02:32 crc kubenswrapper[4763]: I1205 12:02:32.913679 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-72lb2"] Dec 05 12:02:32 crc kubenswrapper[4763]: W1205 12:02:32.922986 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5db489a_42dd_46c0_825d_5dc7065c9f29.slice/crio-d7ce52797b90f3ba1a5409e2823a1a1d4ba042bc77a215040070aea26f12e8cd WatchSource:0}: Error finding container d7ce52797b90f3ba1a5409e2823a1a1d4ba042bc77a215040070aea26f12e8cd: Status 404 returned error can't find the container with id d7ce52797b90f3ba1a5409e2823a1a1d4ba042bc77a215040070aea26f12e8cd Dec 05 12:02:33 crc kubenswrapper[4763]: I1205 12:02:33.138790 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-72lb2" event={"ID":"a5db489a-42dd-46c0-825d-5dc7065c9f29","Type":"ContainerStarted","Data":"d7ce52797b90f3ba1a5409e2823a1a1d4ba042bc77a215040070aea26f12e8cd"} Dec 05 12:02:35 crc kubenswrapper[4763]: I1205 12:02:35.153668 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-72lb2" event={"ID":"a5db489a-42dd-46c0-825d-5dc7065c9f29","Type":"ContainerStarted","Data":"511a104cdc453fe8b84895ad0c6ced7551002199294a8cd413aaafa92490a1bd"} Dec 05 12:02:35 crc kubenswrapper[4763]: I1205 12:02:35.177111 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-72lb2" podStartSLOduration=1.176212034 podStartE2EDuration="3.177091943s" podCreationTimestamp="2025-12-05 12:02:32 +0000 UTC" firstStartedPulling="2025-12-05 12:02:32.924692099 +0000 UTC m=+837.417406812" lastFinishedPulling="2025-12-05 12:02:34.925572008 +0000 UTC m=+839.418286721" observedRunningTime="2025-12-05 12:02:35.172128565 +0000 UTC m=+839.664843298" watchObservedRunningTime="2025-12-05 12:02:35.177091943 +0000 UTC m=+839.669806666" Dec 05 12:02:35 crc kubenswrapper[4763]: E1205 12:02:35.803928 4763 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/fa0cfef3294414e1e084f23742258038330a7b5418eed70d89a23b828008b026/diff" to get inode usage: stat /var/lib/containers/storage/overlay/fa0cfef3294414e1e084f23742258038330a7b5418eed70d89a23b828008b026/diff: no such file or directory, extraDiskErr: Dec 05 12:02:35 crc kubenswrapper[4763]: E1205 12:02:35.943153 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9730875a_8f84_4f57_9f50_f83c08291fa7.slice/crio-93f289a16f9df54d6e7f48ba779e032b4ad1b112d3d44807e2ab206132646d34\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9730875a_8f84_4f57_9f50_f83c08291fa7.slice\": RecentStats: unable to find data in memory cache]" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.140460 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-t4n5p"] Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.142082 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-t4n5p" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.150586 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-m6vfp" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.161506 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-t4n5p"] Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.184411 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-p7k9h"] Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.185088 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-p7k9h" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.190569 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4rx2l"] Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.191689 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4rx2l" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.193390 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzccz\" (UniqueName: \"kubernetes.io/projected/99db9e57-5946-4f9b-8664-d9a7fbff7042-kube-api-access-bzccz\") pod \"nmstate-metrics-7f946cbc9-t4n5p\" (UID: \"99db9e57-5946-4f9b-8664-d9a7fbff7042\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-t4n5p" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.195710 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.217664 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4rx2l"] Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.294988 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/3d3db32e-6ad0-4e60-828f-74bdcc4cf6df-dbus-socket\") pod \"nmstate-handler-p7k9h\" (UID: \"3d3db32e-6ad0-4e60-828f-74bdcc4cf6df\") " pod="openshift-nmstate/nmstate-handler-p7k9h" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.295300 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/3d3db32e-6ad0-4e60-828f-74bdcc4cf6df-nmstate-lock\") pod \"nmstate-handler-p7k9h\" (UID: \"3d3db32e-6ad0-4e60-828f-74bdcc4cf6df\") " pod="openshift-nmstate/nmstate-handler-p7k9h" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.295416 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp88b\" (UniqueName: \"kubernetes.io/projected/3d3db32e-6ad0-4e60-828f-74bdcc4cf6df-kube-api-access-pp88b\") pod \"nmstate-handler-p7k9h\" (UID: \"3d3db32e-6ad0-4e60-828f-74bdcc4cf6df\") " pod="openshift-nmstate/nmstate-handler-p7k9h" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.295527 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzccz\" (UniqueName: \"kubernetes.io/projected/99db9e57-5946-4f9b-8664-d9a7fbff7042-kube-api-access-bzccz\") pod \"nmstate-metrics-7f946cbc9-t4n5p\" (UID: \"99db9e57-5946-4f9b-8664-d9a7fbff7042\") " 
pod="openshift-nmstate/nmstate-metrics-7f946cbc9-t4n5p" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.295634 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/3d3db32e-6ad0-4e60-828f-74bdcc4cf6df-ovs-socket\") pod \"nmstate-handler-p7k9h\" (UID: \"3d3db32e-6ad0-4e60-828f-74bdcc4cf6df\") " pod="openshift-nmstate/nmstate-handler-p7k9h" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.295730 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcknx\" (UniqueName: \"kubernetes.io/projected/73c52155-582b-4ea6-8661-c03a3804fe2e-kube-api-access-lcknx\") pod \"nmstate-webhook-5f6d4c5ccb-4rx2l\" (UID: \"73c52155-582b-4ea6-8661-c03a3804fe2e\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4rx2l" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.295968 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/73c52155-582b-4ea6-8661-c03a3804fe2e-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-4rx2l\" (UID: \"73c52155-582b-4ea6-8661-c03a3804fe2e\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4rx2l" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.322389 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzccz\" (UniqueName: \"kubernetes.io/projected/99db9e57-5946-4f9b-8664-d9a7fbff7042-kube-api-access-bzccz\") pod \"nmstate-metrics-7f946cbc9-t4n5p\" (UID: \"99db9e57-5946-4f9b-8664-d9a7fbff7042\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-t4n5p" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.376083 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-76lxj"] Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.376967 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-76lxj" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.380548 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-mtjq7" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.380974 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.381050 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.391982 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-76lxj"] Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.397503 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/3d3db32e-6ad0-4e60-828f-74bdcc4cf6df-ovs-socket\") pod \"nmstate-handler-p7k9h\" (UID: \"3d3db32e-6ad0-4e60-828f-74bdcc4cf6df\") " pod="openshift-nmstate/nmstate-handler-p7k9h" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.397541 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcknx\" (UniqueName: \"kubernetes.io/projected/73c52155-582b-4ea6-8661-c03a3804fe2e-kube-api-access-lcknx\") pod \"nmstate-webhook-5f6d4c5ccb-4rx2l\" (UID: \"73c52155-582b-4ea6-8661-c03a3804fe2e\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4rx2l" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.397572 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/73c52155-582b-4ea6-8661-c03a3804fe2e-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-4rx2l\" (UID: \"73c52155-582b-4ea6-8661-c03a3804fe2e\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4rx2l" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.397600 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/3d3db32e-6ad0-4e60-828f-74bdcc4cf6df-dbus-socket\") pod \"nmstate-handler-p7k9h\" (UID: \"3d3db32e-6ad0-4e60-828f-74bdcc4cf6df\") " pod="openshift-nmstate/nmstate-handler-p7k9h" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.397615 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/3d3db32e-6ad0-4e60-828f-74bdcc4cf6df-nmstate-lock\") pod \"nmstate-handler-p7k9h\" (UID: \"3d3db32e-6ad0-4e60-828f-74bdcc4cf6df\") " pod="openshift-nmstate/nmstate-handler-p7k9h" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.397643 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp88b\" (UniqueName: \"kubernetes.io/projected/3d3db32e-6ad0-4e60-828f-74bdcc4cf6df-kube-api-access-pp88b\") pod \"nmstate-handler-p7k9h\" (UID: \"3d3db32e-6ad0-4e60-828f-74bdcc4cf6df\") " pod="openshift-nmstate/nmstate-handler-p7k9h" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.397928 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/3d3db32e-6ad0-4e60-828f-74bdcc4cf6df-ovs-socket\") pod \"nmstate-handler-p7k9h\" (UID: \"3d3db32e-6ad0-4e60-828f-74bdcc4cf6df\") " pod="openshift-nmstate/nmstate-handler-p7k9h" Dec 05 12:02:36 crc kubenswrapper[4763]: E1205 
12:02:36.398081 4763 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Dec 05 12:02:36 crc kubenswrapper[4763]: E1205 12:02:36.398119 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/73c52155-582b-4ea6-8661-c03a3804fe2e-tls-key-pair podName:73c52155-582b-4ea6-8661-c03a3804fe2e nodeName:}" failed. No retries permitted until 2025-12-05 12:02:36.898105682 +0000 UTC m=+841.390820405 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/73c52155-582b-4ea6-8661-c03a3804fe2e-tls-key-pair") pod "nmstate-webhook-5f6d4c5ccb-4rx2l" (UID: "73c52155-582b-4ea6-8661-c03a3804fe2e") : secret "openshift-nmstate-webhook" not found Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.398367 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/3d3db32e-6ad0-4e60-828f-74bdcc4cf6df-nmstate-lock\") pod \"nmstate-handler-p7k9h\" (UID: \"3d3db32e-6ad0-4e60-828f-74bdcc4cf6df\") " pod="openshift-nmstate/nmstate-handler-p7k9h" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.398456 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/3d3db32e-6ad0-4e60-828f-74bdcc4cf6df-dbus-socket\") pod \"nmstate-handler-p7k9h\" (UID: \"3d3db32e-6ad0-4e60-828f-74bdcc4cf6df\") " pod="openshift-nmstate/nmstate-handler-p7k9h" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.432602 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp88b\" (UniqueName: \"kubernetes.io/projected/3d3db32e-6ad0-4e60-828f-74bdcc4cf6df-kube-api-access-pp88b\") pod \"nmstate-handler-p7k9h\" (UID: \"3d3db32e-6ad0-4e60-828f-74bdcc4cf6df\") " pod="openshift-nmstate/nmstate-handler-p7k9h" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.437268 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcknx\" (UniqueName: \"kubernetes.io/projected/73c52155-582b-4ea6-8661-c03a3804fe2e-kube-api-access-lcknx\") pod \"nmstate-webhook-5f6d4c5ccb-4rx2l\" (UID: \"73c52155-582b-4ea6-8661-c03a3804fe2e\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4rx2l" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.459855 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-t4n5p" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.498405 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a3cf5928-0003-41e3-baf7-670a1f186bde-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-76lxj\" (UID: \"a3cf5928-0003-41e3-baf7-670a1f186bde\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-76lxj" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.498678 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mm6z\" (UniqueName: \"kubernetes.io/projected/a3cf5928-0003-41e3-baf7-670a1f186bde-kube-api-access-2mm6z\") pod \"nmstate-console-plugin-7fbb5f6569-76lxj\" (UID: \"a3cf5928-0003-41e3-baf7-670a1f186bde\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-76lxj" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.498809 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a3cf5928-0003-41e3-baf7-670a1f186bde-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-76lxj\" (UID: \"a3cf5928-0003-41e3-baf7-670a1f186bde\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-76lxj" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.523901 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-p7k9h" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.600406 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a3cf5928-0003-41e3-baf7-670a1f186bde-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-76lxj\" (UID: \"a3cf5928-0003-41e3-baf7-670a1f186bde\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-76lxj" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.600746 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mm6z\" (UniqueName: \"kubernetes.io/projected/a3cf5928-0003-41e3-baf7-670a1f186bde-kube-api-access-2mm6z\") pod \"nmstate-console-plugin-7fbb5f6569-76lxj\" (UID: \"a3cf5928-0003-41e3-baf7-670a1f186bde\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-76lxj" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.600898 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a3cf5928-0003-41e3-baf7-670a1f186bde-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-76lxj\" (UID: \"a3cf5928-0003-41e3-baf7-670a1f186bde\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-76lxj" Dec 05 12:02:36 crc kubenswrapper[4763]: E1205 12:02:36.601084 4763 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Dec 05 12:02:36 crc kubenswrapper[4763]: E1205 12:02:36.601199 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3cf5928-0003-41e3-baf7-670a1f186bde-plugin-serving-cert podName:a3cf5928-0003-41e3-baf7-670a1f186bde nodeName:}" failed. No retries permitted until 2025-12-05 12:02:37.101182481 +0000 UTC m=+841.593897204 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/a3cf5928-0003-41e3-baf7-670a1f186bde-plugin-serving-cert") pod "nmstate-console-plugin-7fbb5f6569-76lxj" (UID: "a3cf5928-0003-41e3-baf7-670a1f186bde") : secret "plugin-serving-cert" not found Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.602286 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a3cf5928-0003-41e3-baf7-670a1f186bde-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-76lxj\" (UID: \"a3cf5928-0003-41e3-baf7-670a1f186bde\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-76lxj" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.629851 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mm6z\" (UniqueName: \"kubernetes.io/projected/a3cf5928-0003-41e3-baf7-670a1f186bde-kube-api-access-2mm6z\") pod \"nmstate-console-plugin-7fbb5f6569-76lxj\" (UID: \"a3cf5928-0003-41e3-baf7-670a1f186bde\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-76lxj" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.639531 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7864fc8cdd-sxdxv"] Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.640608 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7864fc8cdd-sxdxv" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.648516 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7864fc8cdd-sxdxv"] Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.701731 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c70ba7ec-bba8-415e-b564-910721c4358f-oauth-serving-cert\") pod \"console-7864fc8cdd-sxdxv\" (UID: \"c70ba7ec-bba8-415e-b564-910721c4358f\") " pod="openshift-console/console-7864fc8cdd-sxdxv" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.702243 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c70ba7ec-bba8-415e-b564-910721c4358f-console-oauth-config\") pod \"console-7864fc8cdd-sxdxv\" (UID: \"c70ba7ec-bba8-415e-b564-910721c4358f\") " pod="openshift-console/console-7864fc8cdd-sxdxv" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.702294 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qws58\" (UniqueName: \"kubernetes.io/projected/c70ba7ec-bba8-415e-b564-910721c4358f-kube-api-access-qws58\") pod \"console-7864fc8cdd-sxdxv\" (UID: \"c70ba7ec-bba8-415e-b564-910721c4358f\") " pod="openshift-console/console-7864fc8cdd-sxdxv" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.702315 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c70ba7ec-bba8-415e-b564-910721c4358f-console-config\") pod \"console-7864fc8cdd-sxdxv\" (UID: \"c70ba7ec-bba8-415e-b564-910721c4358f\") " pod="openshift-console/console-7864fc8cdd-sxdxv" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.702375 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c70ba7ec-bba8-415e-b564-910721c4358f-trusted-ca-bundle\") pod \"console-7864fc8cdd-sxdxv\" (UID: \"c70ba7ec-bba8-415e-b564-910721c4358f\") " pod="openshift-console/console-7864fc8cdd-sxdxv" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.702404 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c70ba7ec-bba8-415e-b564-910721c4358f-service-ca\") pod \"console-7864fc8cdd-sxdxv\" (UID: \"c70ba7ec-bba8-415e-b564-910721c4358f\") " pod="openshift-console/console-7864fc8cdd-sxdxv" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.702424 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c70ba7ec-bba8-415e-b564-910721c4358f-console-serving-cert\") pod \"console-7864fc8cdd-sxdxv\" (UID: \"c70ba7ec-bba8-415e-b564-910721c4358f\") " pod="openshift-console/console-7864fc8cdd-sxdxv" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.804181 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c70ba7ec-bba8-415e-b564-910721c4358f-trusted-ca-bundle\") pod \"console-7864fc8cdd-sxdxv\" (UID: \"c70ba7ec-bba8-415e-b564-910721c4358f\") " pod="openshift-console/console-7864fc8cdd-sxdxv" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.804245 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c70ba7ec-bba8-415e-b564-910721c4358f-service-ca\") pod \"console-7864fc8cdd-sxdxv\" (UID: \"c70ba7ec-bba8-415e-b564-910721c4358f\") " pod="openshift-console/console-7864fc8cdd-sxdxv" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.804273 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c70ba7ec-bba8-415e-b564-910721c4358f-console-serving-cert\") pod \"console-7864fc8cdd-sxdxv\" (UID: \"c70ba7ec-bba8-415e-b564-910721c4358f\") " pod="openshift-console/console-7864fc8cdd-sxdxv" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.804291 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c70ba7ec-bba8-415e-b564-910721c4358f-oauth-serving-cert\") pod \"console-7864fc8cdd-sxdxv\" (UID: \"c70ba7ec-bba8-415e-b564-910721c4358f\") " pod="openshift-console/console-7864fc8cdd-sxdxv" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.804335 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c70ba7ec-bba8-415e-b564-910721c4358f-console-oauth-config\") pod \"console-7864fc8cdd-sxdxv\" (UID: \"c70ba7ec-bba8-415e-b564-910721c4358f\") " pod="openshift-console/console-7864fc8cdd-sxdxv" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.804382 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c70ba7ec-bba8-415e-b564-910721c4358f-console-config\") pod \"console-7864fc8cdd-sxdxv\" (UID: \"c70ba7ec-bba8-415e-b564-910721c4358f\") " pod="openshift-console/console-7864fc8cdd-sxdxv" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.804397 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-qws58\" (UniqueName: \"kubernetes.io/projected/c70ba7ec-bba8-415e-b564-910721c4358f-kube-api-access-qws58\") pod \"console-7864fc8cdd-sxdxv\" (UID: \"c70ba7ec-bba8-415e-b564-910721c4358f\") " pod="openshift-console/console-7864fc8cdd-sxdxv" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.805364 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c70ba7ec-bba8-415e-b564-910721c4358f-trusted-ca-bundle\") pod \"console-7864fc8cdd-sxdxv\" (UID: \"c70ba7ec-bba8-415e-b564-910721c4358f\") " pod="openshift-console/console-7864fc8cdd-sxdxv" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.806571 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c70ba7ec-bba8-415e-b564-910721c4358f-service-ca\") pod \"console-7864fc8cdd-sxdxv\" (UID: \"c70ba7ec-bba8-415e-b564-910721c4358f\") " pod="openshift-console/console-7864fc8cdd-sxdxv" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.810365 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c70ba7ec-bba8-415e-b564-910721c4358f-console-serving-cert\") pod \"console-7864fc8cdd-sxdxv\" (UID: \"c70ba7ec-bba8-415e-b564-910721c4358f\") " pod="openshift-console/console-7864fc8cdd-sxdxv" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.811656 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c70ba7ec-bba8-415e-b564-910721c4358f-console-oauth-config\") pod \"console-7864fc8cdd-sxdxv\" (UID: \"c70ba7ec-bba8-415e-b564-910721c4358f\") " pod="openshift-console/console-7864fc8cdd-sxdxv" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.811791 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c70ba7ec-bba8-415e-b564-910721c4358f-oauth-serving-cert\") pod \"console-7864fc8cdd-sxdxv\" (UID: \"c70ba7ec-bba8-415e-b564-910721c4358f\") " pod="openshift-console/console-7864fc8cdd-sxdxv" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.815448 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c70ba7ec-bba8-415e-b564-910721c4358f-console-config\") pod \"console-7864fc8cdd-sxdxv\" (UID: \"c70ba7ec-bba8-415e-b564-910721c4358f\") " pod="openshift-console/console-7864fc8cdd-sxdxv" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.834574 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qws58\" (UniqueName: \"kubernetes.io/projected/c70ba7ec-bba8-415e-b564-910721c4358f-kube-api-access-qws58\") pod \"console-7864fc8cdd-sxdxv\" (UID: \"c70ba7ec-bba8-415e-b564-910721c4358f\") " pod="openshift-console/console-7864fc8cdd-sxdxv" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.849156 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-t4n5p"] Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.905721 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/73c52155-582b-4ea6-8661-c03a3804fe2e-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-4rx2l\" (UID: \"73c52155-582b-4ea6-8661-c03a3804fe2e\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4rx2l" Dec 05 12:02:36 
crc kubenswrapper[4763]: I1205 12:02:36.910034 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/73c52155-582b-4ea6-8661-c03a3804fe2e-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-4rx2l\" (UID: \"73c52155-582b-4ea6-8661-c03a3804fe2e\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4rx2l" Dec 05 12:02:36 crc kubenswrapper[4763]: I1205 12:02:36.955301 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7864fc8cdd-sxdxv" Dec 05 12:02:37 crc kubenswrapper[4763]: I1205 12:02:37.109001 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a3cf5928-0003-41e3-baf7-670a1f186bde-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-76lxj\" (UID: \"a3cf5928-0003-41e3-baf7-670a1f186bde\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-76lxj" Dec 05 12:02:37 crc kubenswrapper[4763]: I1205 12:02:37.112567 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a3cf5928-0003-41e3-baf7-670a1f186bde-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-76lxj\" (UID: \"a3cf5928-0003-41e3-baf7-670a1f186bde\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-76lxj" Dec 05 12:02:37 crc kubenswrapper[4763]: I1205 12:02:37.128649 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4rx2l" Dec 05 12:02:37 crc kubenswrapper[4763]: I1205 12:02:37.169818 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7864fc8cdd-sxdxv"] Dec 05 12:02:37 crc kubenswrapper[4763]: I1205 12:02:37.170210 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-p7k9h" event={"ID":"3d3db32e-6ad0-4e60-828f-74bdcc4cf6df","Type":"ContainerStarted","Data":"49d418c3019fd8654d05213930df687d42bf05100150d0e0cbaa39ef0ee298f7"} Dec 05 12:02:37 crc kubenswrapper[4763]: I1205 12:02:37.171832 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-t4n5p" event={"ID":"99db9e57-5946-4f9b-8664-d9a7fbff7042","Type":"ContainerStarted","Data":"bcd24f3294316cbe2bf3e353c1aa12f28a1c6d2a9ce5a217d7604e6120087168"} Dec 05 12:02:37 crc kubenswrapper[4763]: W1205 12:02:37.177173 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc70ba7ec_bba8_415e_b564_910721c4358f.slice/crio-c6e57267d7dfc4ce9b4f5031a7bd77f363b24d845da460f9a13a5624c7770b6c WatchSource:0}: Error finding container c6e57267d7dfc4ce9b4f5031a7bd77f363b24d845da460f9a13a5624c7770b6c: Status 404 returned error can't find the container with id c6e57267d7dfc4ce9b4f5031a7bd77f363b24d845da460f9a13a5624c7770b6c Dec 05 12:02:37 crc kubenswrapper[4763]: I1205 12:02:37.300167 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-76lxj" Dec 05 12:02:37 crc kubenswrapper[4763]: I1205 12:02:37.323021 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4rx2l"] Dec 05 12:02:37 crc kubenswrapper[4763]: W1205 12:02:37.324146 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73c52155_582b_4ea6_8661_c03a3804fe2e.slice/crio-4dc2c371e7a9049f844bc1fe3b23c50eb302c5ffe33b8c0b5d6fd5cf58b27345 WatchSource:0}: Error finding container 4dc2c371e7a9049f844bc1fe3b23c50eb302c5ffe33b8c0b5d6fd5cf58b27345: Status 404 returned error can't find the container with id 4dc2c371e7a9049f844bc1fe3b23c50eb302c5ffe33b8c0b5d6fd5cf58b27345 Dec 05 12:02:37 crc kubenswrapper[4763]: I1205 12:02:37.730189 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-76lxj"] Dec 05 12:02:37 crc kubenswrapper[4763]: W1205 12:02:37.734993 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3cf5928_0003_41e3_baf7_670a1f186bde.slice/crio-1d4bc57825b490baf25587dfc7d9121b2591a7e26744ec59e5f7dca7d5ab396f WatchSource:0}: Error finding container 1d4bc57825b490baf25587dfc7d9121b2591a7e26744ec59e5f7dca7d5ab396f: Status 404 returned error can't find the container with id 1d4bc57825b490baf25587dfc7d9121b2591a7e26744ec59e5f7dca7d5ab396f Dec 05 12:02:38 crc kubenswrapper[4763]: I1205 12:02:38.179782 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4rx2l" event={"ID":"73c52155-582b-4ea6-8661-c03a3804fe2e","Type":"ContainerStarted","Data":"4dc2c371e7a9049f844bc1fe3b23c50eb302c5ffe33b8c0b5d6fd5cf58b27345"} Dec 05 12:02:38 crc kubenswrapper[4763]: I1205 12:02:38.180860 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-76lxj" event={"ID":"a3cf5928-0003-41e3-baf7-670a1f186bde","Type":"ContainerStarted","Data":"1d4bc57825b490baf25587dfc7d9121b2591a7e26744ec59e5f7dca7d5ab396f"} Dec 05 12:02:38 crc kubenswrapper[4763]: I1205 12:02:38.183885 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7864fc8cdd-sxdxv" event={"ID":"c70ba7ec-bba8-415e-b564-910721c4358f","Type":"ContainerStarted","Data":"78dd138605c3ad854ebc30f599bdef0552d54198b571bee8b7ce75d216eb4a55"} Dec 05 12:02:38 crc kubenswrapper[4763]: I1205 12:02:38.183916 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7864fc8cdd-sxdxv" event={"ID":"c70ba7ec-bba8-415e-b564-910721c4358f","Type":"ContainerStarted","Data":"c6e57267d7dfc4ce9b4f5031a7bd77f363b24d845da460f9a13a5624c7770b6c"} Dec 05 12:02:38 crc kubenswrapper[4763]: I1205 12:02:38.202911 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7864fc8cdd-sxdxv" podStartSLOduration=2.20289247 podStartE2EDuration="2.20289247s" podCreationTimestamp="2025-12-05 12:02:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:02:38.202438156 +0000 UTC m=+842.695152889" watchObservedRunningTime="2025-12-05 12:02:38.20289247 +0000 UTC m=+842.695607193" Dec 05 12:02:40 crc kubenswrapper[4763]: I1205 12:02:40.203319 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4rx2l" event={"ID":"73c52155-582b-4ea6-8661-c03a3804fe2e","Type":"ContainerStarted","Data":"0e160ec8cc54d397b05633367ef33bcf3e3d545f797092310aaba84511b3ffec"} Dec 05 12:02:40 crc kubenswrapper[4763]: I1205 12:02:40.204331 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4rx2l" Dec 05 12:02:40 crc kubenswrapper[4763]: I1205 12:02:40.205283 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-p7k9h" event={"ID":"3d3db32e-6ad0-4e60-828f-74bdcc4cf6df","Type":"ContainerStarted","Data":"b96447d5d798179ba4a55cac88df1c2a2fc47709cdc4312f5beb2ca51db63bfb"} Dec 05 12:02:40 crc kubenswrapper[4763]: I1205 12:02:40.206123 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-p7k9h" Dec 05 12:02:40 crc kubenswrapper[4763]: I1205 12:02:40.208910 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-t4n5p" event={"ID":"99db9e57-5946-4f9b-8664-d9a7fbff7042","Type":"ContainerStarted","Data":"17e78a07adc792a7dadb26b605494d992e14982122dd1dde4fd4c42850cb509f"} Dec 05 12:02:40 crc kubenswrapper[4763]: I1205 12:02:40.211388 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-76lxj" event={"ID":"a3cf5928-0003-41e3-baf7-670a1f186bde","Type":"ContainerStarted","Data":"feddebae13d24d71c3a0fec909f2f2812acd16a807baef17247751f0b54e67a2"} Dec 05 12:02:40 crc kubenswrapper[4763]: I1205 12:02:40.237490 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4rx2l" podStartSLOduration=1.79061223 podStartE2EDuration="4.237471607s" podCreationTimestamp="2025-12-05 12:02:36 +0000 UTC" firstStartedPulling="2025-12-05 12:02:37.326528433 +0000 UTC m=+841.819243146" lastFinishedPulling="2025-12-05 12:02:39.7733878 +0000 UTC m=+844.266102523" observedRunningTime="2025-12-05 12:02:40.223959434 +0000 UTC m=+844.716674167" watchObservedRunningTime="2025-12-05 12:02:40.237471607 +0000 UTC m=+844.730186330" Dec 05 12:02:40 crc kubenswrapper[4763]: I1205 12:02:40.246127 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-p7k9h" podStartSLOduration=1.070894127 podStartE2EDuration="4.246111262s" podCreationTimestamp="2025-12-05 12:02:36 +0000 UTC" firstStartedPulling="2025-12-05 12:02:36.562647091 +0000 UTC m=+841.055361814" lastFinishedPulling="2025-12-05 12:02:39.737864216 +0000 UTC m=+844.230578949" observedRunningTime="2025-12-05 12:02:40.243773736 +0000 UTC m=+844.736488459" watchObservedRunningTime="2025-12-05 12:02:40.246111262 +0000 UTC m=+844.738825985" Dec 05 12:02:40 crc kubenswrapper[4763]: I1205 12:02:40.267234 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-76lxj" podStartSLOduration=2.2670836469999998 podStartE2EDuration="4.267217125s" podCreationTimestamp="2025-12-05 12:02:36 +0000 UTC" firstStartedPulling="2025-12-05 12:02:37.73689793 +0000 UTC m=+842.229612643" lastFinishedPulling="2025-12-05 12:02:39.737031378 +0000 UTC m=+844.229746121" observedRunningTime="2025-12-05 12:02:40.265588689 +0000 UTC m=+844.758303432" watchObservedRunningTime="2025-12-05 12:02:40.267217125 +0000 UTC m=+844.759931858" Dec 05 12:02:42 crc kubenswrapper[4763]: I1205 12:02:42.226379 4763 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-t4n5p" event={"ID":"99db9e57-5946-4f9b-8664-d9a7fbff7042","Type":"ContainerStarted","Data":"c032f49fc619733def26e1c6cc89a8b529fe9d049e27d07e31f221e8d14d2caa"} Dec 05 12:02:42 crc kubenswrapper[4763]: I1205 12:02:42.246067 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-t4n5p" podStartSLOduration=1.499818035 podStartE2EDuration="6.24605156s" podCreationTimestamp="2025-12-05 12:02:36 +0000 UTC" firstStartedPulling="2025-12-05 12:02:36.854163676 +0000 UTC m=+841.346878399" lastFinishedPulling="2025-12-05 12:02:41.600397201 +0000 UTC m=+846.093111924" observedRunningTime="2025-12-05 12:02:42.241814925 +0000 UTC m=+846.734529658" watchObservedRunningTime="2025-12-05 12:02:42.24605156 +0000 UTC m=+846.738766283" Dec 05 12:02:46 crc kubenswrapper[4763]: I1205 12:02:46.564591 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-p7k9h" Dec 05 12:02:46 crc kubenswrapper[4763]: I1205 12:02:46.956182 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7864fc8cdd-sxdxv" Dec 05 12:02:46 crc kubenswrapper[4763]: I1205 12:02:46.956528 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7864fc8cdd-sxdxv" Dec 05 12:02:46 crc kubenswrapper[4763]: I1205 12:02:46.962665 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7864fc8cdd-sxdxv" Dec 05 12:02:47 crc kubenswrapper[4763]: I1205 12:02:47.265227 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7864fc8cdd-sxdxv" Dec 05 12:02:47 crc kubenswrapper[4763]: I1205 12:02:47.315672 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-rsm7h"] Dec 05 12:02:57 crc kubenswrapper[4763]: I1205 12:02:57.134791 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4rx2l" Dec 05 12:02:58 crc kubenswrapper[4763]: I1205 12:02:58.976135 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zx6rm"] Dec 05 12:02:58 crc kubenswrapper[4763]: I1205 12:02:58.977497 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zx6rm" Dec 05 12:02:58 crc kubenswrapper[4763]: I1205 12:02:58.989481 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zx6rm"] Dec 05 12:02:58 crc kubenswrapper[4763]: I1205 12:02:58.997321 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2b0afcd-2dde-40e8-9d19-d769eb57f372-catalog-content\") pod \"redhat-marketplace-zx6rm\" (UID: \"a2b0afcd-2dde-40e8-9d19-d769eb57f372\") " pod="openshift-marketplace/redhat-marketplace-zx6rm" Dec 05 12:02:58 crc kubenswrapper[4763]: I1205 12:02:58.997371 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2b0afcd-2dde-40e8-9d19-d769eb57f372-utilities\") pod \"redhat-marketplace-zx6rm\" (UID: \"a2b0afcd-2dde-40e8-9d19-d769eb57f372\") " pod="openshift-marketplace/redhat-marketplace-zx6rm" Dec 05 12:02:58 crc kubenswrapper[4763]: I1205 12:02:58.997452 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nrh2\" (UniqueName: \"kubernetes.io/projected/a2b0afcd-2dde-40e8-9d19-d769eb57f372-kube-api-access-4nrh2\") pod \"redhat-marketplace-zx6rm\" (UID: \"a2b0afcd-2dde-40e8-9d19-d769eb57f372\") " pod="openshift-marketplace/redhat-marketplace-zx6rm" Dec 05 12:02:59 crc kubenswrapper[4763]: I1205 12:02:59.098330 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nrh2\" (UniqueName: \"kubernetes.io/projected/a2b0afcd-2dde-40e8-9d19-d769eb57f372-kube-api-access-4nrh2\") pod \"redhat-marketplace-zx6rm\" (UID: \"a2b0afcd-2dde-40e8-9d19-d769eb57f372\") " pod="openshift-marketplace/redhat-marketplace-zx6rm" Dec 05 12:02:59 crc kubenswrapper[4763]: I1205 12:02:59.098389 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2b0afcd-2dde-40e8-9d19-d769eb57f372-catalog-content\") pod \"redhat-marketplace-zx6rm\" (UID: \"a2b0afcd-2dde-40e8-9d19-d769eb57f372\") " pod="openshift-marketplace/redhat-marketplace-zx6rm" Dec 05 12:02:59 crc kubenswrapper[4763]: I1205 12:02:59.098414 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2b0afcd-2dde-40e8-9d19-d769eb57f372-utilities\") pod \"redhat-marketplace-zx6rm\" (UID: \"a2b0afcd-2dde-40e8-9d19-d769eb57f372\") " pod="openshift-marketplace/redhat-marketplace-zx6rm" Dec 05 12:02:59 crc kubenswrapper[4763]: I1205 12:02:59.098924 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2b0afcd-2dde-40e8-9d19-d769eb57f372-utilities\") pod \"redhat-marketplace-zx6rm\" (UID: \"a2b0afcd-2dde-40e8-9d19-d769eb57f372\") " pod="openshift-marketplace/redhat-marketplace-zx6rm" Dec 05 12:02:59 crc kubenswrapper[4763]: I1205 12:02:59.099062 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2b0afcd-2dde-40e8-9d19-d769eb57f372-catalog-content\") pod \"redhat-marketplace-zx6rm\" (UID: \"a2b0afcd-2dde-40e8-9d19-d769eb57f372\") " pod="openshift-marketplace/redhat-marketplace-zx6rm" Dec 05 12:02:59 crc kubenswrapper[4763]: I1205 12:02:59.117617 4763 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-4nrh2\" (UniqueName: \"kubernetes.io/projected/a2b0afcd-2dde-40e8-9d19-d769eb57f372-kube-api-access-4nrh2\") pod \"redhat-marketplace-zx6rm\" (UID: \"a2b0afcd-2dde-40e8-9d19-d769eb57f372\") " pod="openshift-marketplace/redhat-marketplace-zx6rm" Dec 05 12:02:59 crc kubenswrapper[4763]: I1205 12:02:59.293294 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zx6rm" Dec 05 12:02:59 crc kubenswrapper[4763]: I1205 12:02:59.722997 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zx6rm"] Dec 05 12:03:00 crc kubenswrapper[4763]: I1205 12:03:00.367463 4763 generic.go:334] "Generic (PLEG): container finished" podID="a2b0afcd-2dde-40e8-9d19-d769eb57f372" containerID="f72f5d783b38eca97a9a23e5dcc0ed398970802d98c9fe2ac7a2bc12066d1f84" exitCode=0 Dec 05 12:03:00 crc kubenswrapper[4763]: I1205 12:03:00.367715 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zx6rm" event={"ID":"a2b0afcd-2dde-40e8-9d19-d769eb57f372","Type":"ContainerDied","Data":"f72f5d783b38eca97a9a23e5dcc0ed398970802d98c9fe2ac7a2bc12066d1f84"} Dec 05 12:03:00 crc kubenswrapper[4763]: I1205 12:03:00.367749 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zx6rm" event={"ID":"a2b0afcd-2dde-40e8-9d19-d769eb57f372","Type":"ContainerStarted","Data":"9b023e16947c68d55f77c3fa1398e36974c4f32a2f07b54d81471a44f8913693"} Dec 05 12:03:02 crc kubenswrapper[4763]: I1205 12:03:02.383170 4763 generic.go:334] "Generic (PLEG): container finished" podID="a2b0afcd-2dde-40e8-9d19-d769eb57f372" containerID="38f35c7662bfad14e50bd236ead4cb12ab1920c88a2493342bf8d64ff1d1681f" exitCode=0 Dec 05 12:03:02 crc kubenswrapper[4763]: I1205 12:03:02.383227 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zx6rm" event={"ID":"a2b0afcd-2dde-40e8-9d19-d769eb57f372","Type":"ContainerDied","Data":"38f35c7662bfad14e50bd236ead4cb12ab1920c88a2493342bf8d64ff1d1681f"} Dec 05 12:03:03 crc kubenswrapper[4763]: I1205 12:03:03.391409 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zx6rm" event={"ID":"a2b0afcd-2dde-40e8-9d19-d769eb57f372","Type":"ContainerStarted","Data":"6b5f68fd5998bb6287889cf01779abec3fce7215f7b21f908fe078349b7dbbc8"} Dec 05 12:03:03 crc kubenswrapper[4763]: I1205 12:03:03.414088 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zx6rm" podStartSLOduration=3.001179798 podStartE2EDuration="5.414071953s" podCreationTimestamp="2025-12-05 12:02:58 +0000 UTC" firstStartedPulling="2025-12-05 12:03:00.368997044 +0000 UTC m=+864.861711767" lastFinishedPulling="2025-12-05 12:03:02.781889199 +0000 UTC m=+867.274603922" observedRunningTime="2025-12-05 12:03:03.410491605 +0000 UTC m=+867.903206318" watchObservedRunningTime="2025-12-05 12:03:03.414071953 +0000 UTC m=+867.906786676" Dec 05 12:03:09 crc kubenswrapper[4763]: I1205 12:03:09.294120 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zx6rm" Dec 05 12:03:09 crc kubenswrapper[4763]: I1205 12:03:09.294567 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zx6rm" Dec 05 12:03:09 crc kubenswrapper[4763]: I1205 12:03:09.333201 4763 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zx6rm" Dec 05 12:03:09 crc kubenswrapper[4763]: I1205 12:03:09.487119 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zx6rm" Dec 05 12:03:09 crc kubenswrapper[4763]: I1205 12:03:09.566798 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zx6rm"] Dec 05 12:03:10 crc kubenswrapper[4763]: I1205 12:03:10.612409 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw"] Dec 05 12:03:10 crc kubenswrapper[4763]: I1205 12:03:10.613488 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw" Dec 05 12:03:10 crc kubenswrapper[4763]: I1205 12:03:10.615692 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 05 12:03:10 crc kubenswrapper[4763]: I1205 12:03:10.630513 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw"] Dec 05 12:03:10 crc kubenswrapper[4763]: I1205 12:03:10.697579 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a6028549-2dc7-44a5-b84c-fb74585f3b85-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw\" (UID: \"a6028549-2dc7-44a5-b84c-fb74585f3b85\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw" Dec 05 12:03:10 crc kubenswrapper[4763]: I1205 12:03:10.697695 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6028549-2dc7-44a5-b84c-fb74585f3b85-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw\" (UID: \"a6028549-2dc7-44a5-b84c-fb74585f3b85\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw" Dec 05 12:03:10 crc kubenswrapper[4763]: I1205 12:03:10.697754 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h2d9\" (UniqueName: \"kubernetes.io/projected/a6028549-2dc7-44a5-b84c-fb74585f3b85-kube-api-access-4h2d9\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw\" (UID: \"a6028549-2dc7-44a5-b84c-fb74585f3b85\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw" Dec 05 12:03:10 crc kubenswrapper[4763]: I1205 12:03:10.798781 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a6028549-2dc7-44a5-b84c-fb74585f3b85-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw\" (UID: \"a6028549-2dc7-44a5-b84c-fb74585f3b85\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw" Dec 05 12:03:10 crc kubenswrapper[4763]: I1205 12:03:10.798853 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6028549-2dc7-44a5-b84c-fb74585f3b85-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw\" (UID: \"a6028549-2dc7-44a5-b84c-fb74585f3b85\") " 
pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw" Dec 05 12:03:10 crc kubenswrapper[4763]: I1205 12:03:10.798881 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h2d9\" (UniqueName: \"kubernetes.io/projected/a6028549-2dc7-44a5-b84c-fb74585f3b85-kube-api-access-4h2d9\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw\" (UID: \"a6028549-2dc7-44a5-b84c-fb74585f3b85\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw" Dec 05 12:03:10 crc kubenswrapper[4763]: I1205 12:03:10.799396 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6028549-2dc7-44a5-b84c-fb74585f3b85-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw\" (UID: \"a6028549-2dc7-44a5-b84c-fb74585f3b85\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw" Dec 05 12:03:10 crc kubenswrapper[4763]: I1205 12:03:10.799594 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a6028549-2dc7-44a5-b84c-fb74585f3b85-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw\" (UID: \"a6028549-2dc7-44a5-b84c-fb74585f3b85\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw" Dec 05 12:03:10 crc kubenswrapper[4763]: I1205 12:03:10.822189 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h2d9\" (UniqueName: \"kubernetes.io/projected/a6028549-2dc7-44a5-b84c-fb74585f3b85-kube-api-access-4h2d9\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw\" (UID: \"a6028549-2dc7-44a5-b84c-fb74585f3b85\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw" Dec 05 12:03:10 crc kubenswrapper[4763]: I1205 12:03:10.938353 4763 util.go:30] "No sandbox for pod can be found. 
Dec 05 12:03:10 crc kubenswrapper[4763]: I1205 12:03:10.938353 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw"
Dec 05 12:03:11 crc kubenswrapper[4763]: I1205 12:03:11.355787 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw"]
Dec 05 12:03:11 crc kubenswrapper[4763]: W1205 12:03:11.365709 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6028549_2dc7_44a5_b84c_fb74585f3b85.slice/crio-392975daa42f991271049a80bec9c41e06b0cec9bc54ac1e4ab2e30fdb5765c8 WatchSource:0}: Error finding container 392975daa42f991271049a80bec9c41e06b0cec9bc54ac1e4ab2e30fdb5765c8: Status 404 returned error can't find the container with id 392975daa42f991271049a80bec9c41e06b0cec9bc54ac1e4ab2e30fdb5765c8
Dec 05 12:03:11 crc kubenswrapper[4763]: I1205 12:03:11.459737 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw" event={"ID":"a6028549-2dc7-44a5-b84c-fb74585f3b85","Type":"ContainerStarted","Data":"392975daa42f991271049a80bec9c41e06b0cec9bc54ac1e4ab2e30fdb5765c8"}
Dec 05 12:03:11 crc kubenswrapper[4763]: I1205 12:03:11.459943 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zx6rm" podUID="a2b0afcd-2dde-40e8-9d19-d769eb57f372" containerName="registry-server" containerID="cri-o://6b5f68fd5998bb6287889cf01779abec3fce7215f7b21f908fe078349b7dbbc8" gracePeriod=2
Dec 05 12:03:11 crc kubenswrapper[4763]: I1205 12:03:11.773122 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zx6rm"
Dec 05 12:03:11 crc kubenswrapper[4763]: I1205 12:03:11.813166 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nrh2\" (UniqueName: \"kubernetes.io/projected/a2b0afcd-2dde-40e8-9d19-d769eb57f372-kube-api-access-4nrh2\") pod \"a2b0afcd-2dde-40e8-9d19-d769eb57f372\" (UID: \"a2b0afcd-2dde-40e8-9d19-d769eb57f372\") "
Dec 05 12:03:11 crc kubenswrapper[4763]: I1205 12:03:11.815108 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2b0afcd-2dde-40e8-9d19-d769eb57f372-catalog-content\") pod \"a2b0afcd-2dde-40e8-9d19-d769eb57f372\" (UID: \"a2b0afcd-2dde-40e8-9d19-d769eb57f372\") "
Dec 05 12:03:11 crc kubenswrapper[4763]: I1205 12:03:11.815337 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2b0afcd-2dde-40e8-9d19-d769eb57f372-utilities\") pod \"a2b0afcd-2dde-40e8-9d19-d769eb57f372\" (UID: \"a2b0afcd-2dde-40e8-9d19-d769eb57f372\") "
Dec 05 12:03:11 crc kubenswrapper[4763]: I1205 12:03:11.818302 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2b0afcd-2dde-40e8-9d19-d769eb57f372-utilities" (OuterVolumeSpecName: "utilities") pod "a2b0afcd-2dde-40e8-9d19-d769eb57f372" (UID: "a2b0afcd-2dde-40e8-9d19-d769eb57f372"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 12:03:11 crc kubenswrapper[4763]: I1205 12:03:11.827305 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2b0afcd-2dde-40e8-9d19-d769eb57f372-kube-api-access-4nrh2" (OuterVolumeSpecName: "kube-api-access-4nrh2") pod "a2b0afcd-2dde-40e8-9d19-d769eb57f372" (UID: "a2b0afcd-2dde-40e8-9d19-d769eb57f372"). InnerVolumeSpecName "kube-api-access-4nrh2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 12:03:11 crc kubenswrapper[4763]: I1205 12:03:11.835724 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2b0afcd-2dde-40e8-9d19-d769eb57f372-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a2b0afcd-2dde-40e8-9d19-d769eb57f372" (UID: "a2b0afcd-2dde-40e8-9d19-d769eb57f372"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 12:03:11 crc kubenswrapper[4763]: I1205 12:03:11.916403 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nrh2\" (UniqueName: \"kubernetes.io/projected/a2b0afcd-2dde-40e8-9d19-d769eb57f372-kube-api-access-4nrh2\") on node \"crc\" DevicePath \"\""
Dec 05 12:03:11 crc kubenswrapper[4763]: I1205 12:03:11.916441 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2b0afcd-2dde-40e8-9d19-d769eb57f372-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 12:03:11 crc kubenswrapper[4763]: I1205 12:03:11.916451 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2b0afcd-2dde-40e8-9d19-d769eb57f372-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 12:03:12 crc kubenswrapper[4763]: I1205 12:03:12.358373 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-rsm7h" podUID="e57f38fd-b06b-447e-ad03-2a6fb918470b" containerName="console" containerID="cri-o://0147495d916efe7d40133c81dcfaecc7e3c7b0539352d48cdd57e56e5297295d" gracePeriod=15
Dec 05 12:03:12 crc kubenswrapper[4763]: I1205 12:03:12.474077 4763 generic.go:334] "Generic (PLEG): container finished" podID="a6028549-2dc7-44a5-b84c-fb74585f3b85" containerID="c8bbae365b78834c5a6f462e8448b0a49255d0f0e9569ee7c87f03015b81f1d0" exitCode=0
Dec 05 12:03:12 crc kubenswrapper[4763]: I1205 12:03:12.474187 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw" event={"ID":"a6028549-2dc7-44a5-b84c-fb74585f3b85","Type":"ContainerDied","Data":"c8bbae365b78834c5a6f462e8448b0a49255d0f0e9569ee7c87f03015b81f1d0"}
Dec 05 12:03:12 crc kubenswrapper[4763]: I1205 12:03:12.478513 4763 generic.go:334] "Generic (PLEG): container finished" podID="a2b0afcd-2dde-40e8-9d19-d769eb57f372" containerID="6b5f68fd5998bb6287889cf01779abec3fce7215f7b21f908fe078349b7dbbc8" exitCode=0
Dec 05 12:03:12 crc kubenswrapper[4763]: I1205 12:03:12.478551 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zx6rm" event={"ID":"a2b0afcd-2dde-40e8-9d19-d769eb57f372","Type":"ContainerDied","Data":"6b5f68fd5998bb6287889cf01779abec3fce7215f7b21f908fe078349b7dbbc8"}
Dec 05 12:03:12 crc kubenswrapper[4763]: I1205 12:03:12.478573 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zx6rm" event={"ID":"a2b0afcd-2dde-40e8-9d19-d769eb57f372","Type":"ContainerDied","Data":"9b023e16947c68d55f77c3fa1398e36974c4f32a2f07b54d81471a44f8913693"}
Dec 05 12:03:12 crc kubenswrapper[4763]: I1205 12:03:12.478618 4763 scope.go:117] "RemoveContainer" containerID="6b5f68fd5998bb6287889cf01779abec3fce7215f7b21f908fe078349b7dbbc8"
Dec 05 12:03:12 crc kubenswrapper[4763]: I1205 12:03:12.478722 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zx6rm"
Dec 05 12:03:12 crc kubenswrapper[4763]: I1205 12:03:12.525187 4763 scope.go:117] "RemoveContainer" containerID="38f35c7662bfad14e50bd236ead4cb12ab1920c88a2493342bf8d64ff1d1681f"
Dec 05 12:03:12 crc kubenswrapper[4763]: I1205 12:03:12.539242 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zx6rm"]
Dec 05 12:03:12 crc kubenswrapper[4763]: I1205 12:03:12.543499 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zx6rm"]
Dec 05 12:03:12 crc kubenswrapper[4763]: I1205 12:03:12.571502 4763 scope.go:117] "RemoveContainer" containerID="f72f5d783b38eca97a9a23e5dcc0ed398970802d98c9fe2ac7a2bc12066d1f84"
Dec 05 12:03:12 crc kubenswrapper[4763]: I1205 12:03:12.584025 4763 scope.go:117] "RemoveContainer" containerID="6b5f68fd5998bb6287889cf01779abec3fce7215f7b21f908fe078349b7dbbc8"
Dec 05 12:03:12 crc kubenswrapper[4763]: E1205 12:03:12.584539 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b5f68fd5998bb6287889cf01779abec3fce7215f7b21f908fe078349b7dbbc8\": container with ID starting with 6b5f68fd5998bb6287889cf01779abec3fce7215f7b21f908fe078349b7dbbc8 not found: ID does not exist" containerID="6b5f68fd5998bb6287889cf01779abec3fce7215f7b21f908fe078349b7dbbc8"
Dec 05 12:03:12 crc kubenswrapper[4763]: I1205 12:03:12.584597 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b5f68fd5998bb6287889cf01779abec3fce7215f7b21f908fe078349b7dbbc8"} err="failed to get container status \"6b5f68fd5998bb6287889cf01779abec3fce7215f7b21f908fe078349b7dbbc8\": rpc error: code = NotFound desc = could not find container \"6b5f68fd5998bb6287889cf01779abec3fce7215f7b21f908fe078349b7dbbc8\": container with ID starting with 6b5f68fd5998bb6287889cf01779abec3fce7215f7b21f908fe078349b7dbbc8 not found: ID does not exist"
\"38f35c7662bfad14e50bd236ead4cb12ab1920c88a2493342bf8d64ff1d1681f\": container with ID starting with 38f35c7662bfad14e50bd236ead4cb12ab1920c88a2493342bf8d64ff1d1681f not found: ID does not exist" Dec 05 12:03:12 crc kubenswrapper[4763]: I1205 12:03:12.584971 4763 scope.go:117] "RemoveContainer" containerID="f72f5d783b38eca97a9a23e5dcc0ed398970802d98c9fe2ac7a2bc12066d1f84" Dec 05 12:03:12 crc kubenswrapper[4763]: E1205 12:03:12.585153 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f72f5d783b38eca97a9a23e5dcc0ed398970802d98c9fe2ac7a2bc12066d1f84\": container with ID starting with f72f5d783b38eca97a9a23e5dcc0ed398970802d98c9fe2ac7a2bc12066d1f84 not found: ID does not exist" containerID="f72f5d783b38eca97a9a23e5dcc0ed398970802d98c9fe2ac7a2bc12066d1f84" Dec 05 12:03:12 crc kubenswrapper[4763]: I1205 12:03:12.585177 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f72f5d783b38eca97a9a23e5dcc0ed398970802d98c9fe2ac7a2bc12066d1f84"} err="failed to get container status \"f72f5d783b38eca97a9a23e5dcc0ed398970802d98c9fe2ac7a2bc12066d1f84\": rpc error: code = NotFound desc = could not find container \"f72f5d783b38eca97a9a23e5dcc0ed398970802d98c9fe2ac7a2bc12066d1f84\": container with ID starting with f72f5d783b38eca97a9a23e5dcc0ed398970802d98c9fe2ac7a2bc12066d1f84 not found: ID does not exist" Dec 05 12:03:12 crc kubenswrapper[4763]: I1205 12:03:12.729022 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-rsm7h_e57f38fd-b06b-447e-ad03-2a6fb918470b/console/0.log" Dec 05 12:03:12 crc kubenswrapper[4763]: I1205 12:03:12.729082 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rsm7h" Dec 05 12:03:12 crc kubenswrapper[4763]: I1205 12:03:12.928807 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qqrj\" (UniqueName: \"kubernetes.io/projected/e57f38fd-b06b-447e-ad03-2a6fb918470b-kube-api-access-7qqrj\") pod \"e57f38fd-b06b-447e-ad03-2a6fb918470b\" (UID: \"e57f38fd-b06b-447e-ad03-2a6fb918470b\") " Dec 05 12:03:12 crc kubenswrapper[4763]: I1205 12:03:12.928891 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e57f38fd-b06b-447e-ad03-2a6fb918470b-trusted-ca-bundle\") pod \"e57f38fd-b06b-447e-ad03-2a6fb918470b\" (UID: \"e57f38fd-b06b-447e-ad03-2a6fb918470b\") " Dec 05 12:03:12 crc kubenswrapper[4763]: I1205 12:03:12.928916 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e57f38fd-b06b-447e-ad03-2a6fb918470b-console-oauth-config\") pod \"e57f38fd-b06b-447e-ad03-2a6fb918470b\" (UID: \"e57f38fd-b06b-447e-ad03-2a6fb918470b\") " Dec 05 12:03:12 crc kubenswrapper[4763]: I1205 12:03:12.928946 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e57f38fd-b06b-447e-ad03-2a6fb918470b-service-ca\") pod \"e57f38fd-b06b-447e-ad03-2a6fb918470b\" (UID: \"e57f38fd-b06b-447e-ad03-2a6fb918470b\") " Dec 05 12:03:12 crc kubenswrapper[4763]: I1205 12:03:12.928975 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e57f38fd-b06b-447e-ad03-2a6fb918470b-console-config\") pod 
\"e57f38fd-b06b-447e-ad03-2a6fb918470b\" (UID: \"e57f38fd-b06b-447e-ad03-2a6fb918470b\") " Dec 05 12:03:12 crc kubenswrapper[4763]: I1205 12:03:12.928994 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e57f38fd-b06b-447e-ad03-2a6fb918470b-oauth-serving-cert\") pod \"e57f38fd-b06b-447e-ad03-2a6fb918470b\" (UID: \"e57f38fd-b06b-447e-ad03-2a6fb918470b\") " Dec 05 12:03:12 crc kubenswrapper[4763]: I1205 12:03:12.929052 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e57f38fd-b06b-447e-ad03-2a6fb918470b-console-serving-cert\") pod \"e57f38fd-b06b-447e-ad03-2a6fb918470b\" (UID: \"e57f38fd-b06b-447e-ad03-2a6fb918470b\") " Dec 05 12:03:12 crc kubenswrapper[4763]: I1205 12:03:12.929659 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e57f38fd-b06b-447e-ad03-2a6fb918470b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e57f38fd-b06b-447e-ad03-2a6fb918470b" (UID: "e57f38fd-b06b-447e-ad03-2a6fb918470b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:03:12 crc kubenswrapper[4763]: I1205 12:03:12.929915 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e57f38fd-b06b-447e-ad03-2a6fb918470b-service-ca" (OuterVolumeSpecName: "service-ca") pod "e57f38fd-b06b-447e-ad03-2a6fb918470b" (UID: "e57f38fd-b06b-447e-ad03-2a6fb918470b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:03:12 crc kubenswrapper[4763]: I1205 12:03:12.929926 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e57f38fd-b06b-447e-ad03-2a6fb918470b-console-config" (OuterVolumeSpecName: "console-config") pod "e57f38fd-b06b-447e-ad03-2a6fb918470b" (UID: "e57f38fd-b06b-447e-ad03-2a6fb918470b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:03:12 crc kubenswrapper[4763]: I1205 12:03:12.930388 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e57f38fd-b06b-447e-ad03-2a6fb918470b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e57f38fd-b06b-447e-ad03-2a6fb918470b" (UID: "e57f38fd-b06b-447e-ad03-2a6fb918470b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:03:12 crc kubenswrapper[4763]: I1205 12:03:12.933270 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e57f38fd-b06b-447e-ad03-2a6fb918470b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e57f38fd-b06b-447e-ad03-2a6fb918470b" (UID: "e57f38fd-b06b-447e-ad03-2a6fb918470b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:03:12 crc kubenswrapper[4763]: I1205 12:03:12.933311 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e57f38fd-b06b-447e-ad03-2a6fb918470b-kube-api-access-7qqrj" (OuterVolumeSpecName: "kube-api-access-7qqrj") pod "e57f38fd-b06b-447e-ad03-2a6fb918470b" (UID: "e57f38fd-b06b-447e-ad03-2a6fb918470b"). InnerVolumeSpecName "kube-api-access-7qqrj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:03:12 crc kubenswrapper[4763]: I1205 12:03:12.933823 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e57f38fd-b06b-447e-ad03-2a6fb918470b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e57f38fd-b06b-447e-ad03-2a6fb918470b" (UID: "e57f38fd-b06b-447e-ad03-2a6fb918470b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:03:13 crc kubenswrapper[4763]: I1205 12:03:13.030475 4763 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e57f38fd-b06b-447e-ad03-2a6fb918470b-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 12:03:13 crc kubenswrapper[4763]: I1205 12:03:13.030534 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qqrj\" (UniqueName: \"kubernetes.io/projected/e57f38fd-b06b-447e-ad03-2a6fb918470b-kube-api-access-7qqrj\") on node \"crc\" DevicePath \"\"" Dec 05 12:03:13 crc kubenswrapper[4763]: I1205 12:03:13.030547 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e57f38fd-b06b-447e-ad03-2a6fb918470b-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:03:13 crc kubenswrapper[4763]: I1205 12:03:13.030559 4763 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e57f38fd-b06b-447e-ad03-2a6fb918470b-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 05 12:03:13 crc kubenswrapper[4763]: I1205 12:03:13.030572 4763 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e57f38fd-b06b-447e-ad03-2a6fb918470b-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 12:03:13 crc kubenswrapper[4763]: I1205 12:03:13.030584 4763 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e57f38fd-b06b-447e-ad03-2a6fb918470b-console-config\") on node \"crc\" DevicePath \"\"" Dec 05 12:03:13 crc kubenswrapper[4763]: I1205 12:03:13.030595 4763 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e57f38fd-b06b-447e-ad03-2a6fb918470b-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 12:03:13 crc kubenswrapper[4763]: I1205 12:03:13.488517 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw" event={"ID":"a6028549-2dc7-44a5-b84c-fb74585f3b85","Type":"ContainerStarted","Data":"a56c204c4122d002daea835040926e1a1ab539ddfcba6c37f09aec4984c55f58"} Dec 05 12:03:13 crc kubenswrapper[4763]: I1205 12:03:13.491463 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-rsm7h_e57f38fd-b06b-447e-ad03-2a6fb918470b/console/0.log" Dec 05 12:03:13 crc kubenswrapper[4763]: I1205 12:03:13.491509 4763 generic.go:334] "Generic (PLEG): container finished" podID="e57f38fd-b06b-447e-ad03-2a6fb918470b" containerID="0147495d916efe7d40133c81dcfaecc7e3c7b0539352d48cdd57e56e5297295d" exitCode=2 Dec 05 12:03:13 crc kubenswrapper[4763]: I1205 12:03:13.491538 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rsm7h" 
event={"ID":"e57f38fd-b06b-447e-ad03-2a6fb918470b","Type":"ContainerDied","Data":"0147495d916efe7d40133c81dcfaecc7e3c7b0539352d48cdd57e56e5297295d"} Dec 05 12:03:13 crc kubenswrapper[4763]: I1205 12:03:13.491558 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rsm7h" event={"ID":"e57f38fd-b06b-447e-ad03-2a6fb918470b","Type":"ContainerDied","Data":"5e3b37db8b165100d38c2503568bc57f155b85acdff68380372b697a6207b768"} Dec 05 12:03:13 crc kubenswrapper[4763]: I1205 12:03:13.491580 4763 scope.go:117] "RemoveContainer" containerID="0147495d916efe7d40133c81dcfaecc7e3c7b0539352d48cdd57e56e5297295d" Dec 05 12:03:13 crc kubenswrapper[4763]: I1205 12:03:13.491680 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rsm7h" Dec 05 12:03:13 crc kubenswrapper[4763]: I1205 12:03:13.533610 4763 scope.go:117] "RemoveContainer" containerID="0147495d916efe7d40133c81dcfaecc7e3c7b0539352d48cdd57e56e5297295d" Dec 05 12:03:13 crc kubenswrapper[4763]: E1205 12:03:13.536687 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0147495d916efe7d40133c81dcfaecc7e3c7b0539352d48cdd57e56e5297295d\": container with ID starting with 0147495d916efe7d40133c81dcfaecc7e3c7b0539352d48cdd57e56e5297295d not found: ID does not exist" containerID="0147495d916efe7d40133c81dcfaecc7e3c7b0539352d48cdd57e56e5297295d" Dec 05 12:03:13 crc kubenswrapper[4763]: I1205 12:03:13.536873 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0147495d916efe7d40133c81dcfaecc7e3c7b0539352d48cdd57e56e5297295d"} err="failed to get container status \"0147495d916efe7d40133c81dcfaecc7e3c7b0539352d48cdd57e56e5297295d\": rpc error: code = NotFound desc = could not find container \"0147495d916efe7d40133c81dcfaecc7e3c7b0539352d48cdd57e56e5297295d\": container with ID starting with 0147495d916efe7d40133c81dcfaecc7e3c7b0539352d48cdd57e56e5297295d not found: ID does not exist" Dec 05 12:03:13 crc kubenswrapper[4763]: I1205 12:03:13.542043 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-rsm7h"] Dec 05 12:03:13 crc kubenswrapper[4763]: I1205 12:03:13.545842 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-rsm7h"] Dec 05 12:03:13 crc kubenswrapper[4763]: I1205 12:03:13.791199 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2b0afcd-2dde-40e8-9d19-d769eb57f372" path="/var/lib/kubelet/pods/a2b0afcd-2dde-40e8-9d19-d769eb57f372/volumes" Dec 05 12:03:13 crc kubenswrapper[4763]: I1205 12:03:13.791851 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e57f38fd-b06b-447e-ad03-2a6fb918470b" path="/var/lib/kubelet/pods/e57f38fd-b06b-447e-ad03-2a6fb918470b/volumes" Dec 05 12:03:14 crc kubenswrapper[4763]: I1205 12:03:14.501705 4763 generic.go:334] "Generic (PLEG): container finished" podID="a6028549-2dc7-44a5-b84c-fb74585f3b85" containerID="a56c204c4122d002daea835040926e1a1ab539ddfcba6c37f09aec4984c55f58" exitCode=0 Dec 05 12:03:14 crc kubenswrapper[4763]: I1205 12:03:14.501751 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw" event={"ID":"a6028549-2dc7-44a5-b84c-fb74585f3b85","Type":"ContainerDied","Data":"a56c204c4122d002daea835040926e1a1ab539ddfcba6c37f09aec4984c55f58"} Dec 05 12:03:15 crc 
kubenswrapper[4763]: I1205 12:03:15.512443 4763 generic.go:334] "Generic (PLEG): container finished" podID="a6028549-2dc7-44a5-b84c-fb74585f3b85" containerID="3cafc3244c8a7f396ada0f38df07bf1a7d510ac1985fd89d40af882b633e73b9" exitCode=0 Dec 05 12:03:15 crc kubenswrapper[4763]: I1205 12:03:15.512505 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw" event={"ID":"a6028549-2dc7-44a5-b84c-fb74585f3b85","Type":"ContainerDied","Data":"3cafc3244c8a7f396ada0f38df07bf1a7d510ac1985fd89d40af882b633e73b9"} Dec 05 12:03:16 crc kubenswrapper[4763]: I1205 12:03:16.767369 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw" Dec 05 12:03:16 crc kubenswrapper[4763]: I1205 12:03:16.780746 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4h2d9\" (UniqueName: \"kubernetes.io/projected/a6028549-2dc7-44a5-b84c-fb74585f3b85-kube-api-access-4h2d9\") pod \"a6028549-2dc7-44a5-b84c-fb74585f3b85\" (UID: \"a6028549-2dc7-44a5-b84c-fb74585f3b85\") " Dec 05 12:03:16 crc kubenswrapper[4763]: I1205 12:03:16.782463 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6028549-2dc7-44a5-b84c-fb74585f3b85-bundle\") pod \"a6028549-2dc7-44a5-b84c-fb74585f3b85\" (UID: \"a6028549-2dc7-44a5-b84c-fb74585f3b85\") " Dec 05 12:03:16 crc kubenswrapper[4763]: I1205 12:03:16.782537 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a6028549-2dc7-44a5-b84c-fb74585f3b85-util\") pod \"a6028549-2dc7-44a5-b84c-fb74585f3b85\" (UID: \"a6028549-2dc7-44a5-b84c-fb74585f3b85\") " Dec 05 12:03:16 crc kubenswrapper[4763]: I1205 12:03:16.784552 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6028549-2dc7-44a5-b84c-fb74585f3b85-bundle" (OuterVolumeSpecName: "bundle") pod "a6028549-2dc7-44a5-b84c-fb74585f3b85" (UID: "a6028549-2dc7-44a5-b84c-fb74585f3b85"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:03:16 crc kubenswrapper[4763]: I1205 12:03:16.796422 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6028549-2dc7-44a5-b84c-fb74585f3b85-util" (OuterVolumeSpecName: "util") pod "a6028549-2dc7-44a5-b84c-fb74585f3b85" (UID: "a6028549-2dc7-44a5-b84c-fb74585f3b85"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:03:16 crc kubenswrapper[4763]: I1205 12:03:16.825023 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6028549-2dc7-44a5-b84c-fb74585f3b85-kube-api-access-4h2d9" (OuterVolumeSpecName: "kube-api-access-4h2d9") pod "a6028549-2dc7-44a5-b84c-fb74585f3b85" (UID: "a6028549-2dc7-44a5-b84c-fb74585f3b85"). InnerVolumeSpecName "kube-api-access-4h2d9". 
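
Annotation — teardown runs in the reverse order of setup: UnmountVolume started, then UnmountVolume.TearDown succeeded per volume, then "Volume detached", and finally the "Cleaned up orphaned pod volumes dir" housekeeping seen above at 12:03:13.791 removes /var/lib/kubelet/pods/<uid>/volumes once nothing is left inside it. A hedged check-then-remove sketch of that last step (the function and its exact checks are ours; the path layout matches the log):

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// cleanupOrphanedVolumes removes /var/lib/kubelet/pods/<uid>/volumes
// only when no volume directories remain populated; a non-empty
// subdirectory means some plugin has not finished TearDown yet.
func cleanupOrphanedVolumes(root, uid string) error {
	dir := filepath.Join(root, "pods", uid, "volumes")
	plugins, err := os.ReadDir(dir)
	if err != nil {
		return err
	}
	for _, p := range plugins {
		vols, err := os.ReadDir(filepath.Join(dir, p.Name()))
		if err != nil {
			return err
		}
		if len(vols) > 0 {
			return fmt.Errorf("volumes still present under %s", p.Name())
		}
	}
	return os.RemoveAll(dir)
}

func main() {
	fmt.Println(cleanupOrphanedVolumes("/var/lib/kubelet",
		"a2b0afcd-2dde-40e8-9d19-d769eb57f372"))
}
```
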
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:03:16 crc kubenswrapper[4763]: I1205 12:03:16.884044 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4h2d9\" (UniqueName: \"kubernetes.io/projected/a6028549-2dc7-44a5-b84c-fb74585f3b85-kube-api-access-4h2d9\") on node \"crc\" DevicePath \"\"" Dec 05 12:03:16 crc kubenswrapper[4763]: I1205 12:03:16.884104 4763 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6028549-2dc7-44a5-b84c-fb74585f3b85-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:03:16 crc kubenswrapper[4763]: I1205 12:03:16.884121 4763 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a6028549-2dc7-44a5-b84c-fb74585f3b85-util\") on node \"crc\" DevicePath \"\"" Dec 05 12:03:17 crc kubenswrapper[4763]: I1205 12:03:17.528453 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw" event={"ID":"a6028549-2dc7-44a5-b84c-fb74585f3b85","Type":"ContainerDied","Data":"392975daa42f991271049a80bec9c41e06b0cec9bc54ac1e4ab2e30fdb5765c8"} Dec 05 12:03:17 crc kubenswrapper[4763]: I1205 12:03:17.528797 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="392975daa42f991271049a80bec9c41e06b0cec9bc54ac1e4ab2e30fdb5765c8" Dec 05 12:03:17 crc kubenswrapper[4763]: I1205 12:03:17.528484 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw" Dec 05 12:03:20 crc kubenswrapper[4763]: I1205 12:03:20.168679 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nnthd"] Dec 05 12:03:20 crc kubenswrapper[4763]: E1205 12:03:20.169255 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2b0afcd-2dde-40e8-9d19-d769eb57f372" containerName="registry-server" Dec 05 12:03:20 crc kubenswrapper[4763]: I1205 12:03:20.169272 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2b0afcd-2dde-40e8-9d19-d769eb57f372" containerName="registry-server" Dec 05 12:03:20 crc kubenswrapper[4763]: E1205 12:03:20.169280 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6028549-2dc7-44a5-b84c-fb74585f3b85" containerName="pull" Dec 05 12:03:20 crc kubenswrapper[4763]: I1205 12:03:20.169287 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6028549-2dc7-44a5-b84c-fb74585f3b85" containerName="pull" Dec 05 12:03:20 crc kubenswrapper[4763]: E1205 12:03:20.169300 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2b0afcd-2dde-40e8-9d19-d769eb57f372" containerName="extract-content" Dec 05 12:03:20 crc kubenswrapper[4763]: I1205 12:03:20.169306 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2b0afcd-2dde-40e8-9d19-d769eb57f372" containerName="extract-content" Dec 05 12:03:20 crc kubenswrapper[4763]: E1205 12:03:20.169320 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e57f38fd-b06b-447e-ad03-2a6fb918470b" containerName="console" Dec 05 12:03:20 crc kubenswrapper[4763]: I1205 12:03:20.169327 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e57f38fd-b06b-447e-ad03-2a6fb918470b" containerName="console" Dec 05 12:03:20 crc kubenswrapper[4763]: E1205 12:03:20.169337 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2b0afcd-2dde-40e8-9d19-d769eb57f372" 
containerName="extract-utilities" Dec 05 12:03:20 crc kubenswrapper[4763]: I1205 12:03:20.169344 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2b0afcd-2dde-40e8-9d19-d769eb57f372" containerName="extract-utilities" Dec 05 12:03:20 crc kubenswrapper[4763]: E1205 12:03:20.169357 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6028549-2dc7-44a5-b84c-fb74585f3b85" containerName="util" Dec 05 12:03:20 crc kubenswrapper[4763]: I1205 12:03:20.169365 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6028549-2dc7-44a5-b84c-fb74585f3b85" containerName="util" Dec 05 12:03:20 crc kubenswrapper[4763]: E1205 12:03:20.169381 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6028549-2dc7-44a5-b84c-fb74585f3b85" containerName="extract" Dec 05 12:03:20 crc kubenswrapper[4763]: I1205 12:03:20.169389 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6028549-2dc7-44a5-b84c-fb74585f3b85" containerName="extract" Dec 05 12:03:20 crc kubenswrapper[4763]: I1205 12:03:20.169504 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e57f38fd-b06b-447e-ad03-2a6fb918470b" containerName="console" Dec 05 12:03:20 crc kubenswrapper[4763]: I1205 12:03:20.169520 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2b0afcd-2dde-40e8-9d19-d769eb57f372" containerName="registry-server" Dec 05 12:03:20 crc kubenswrapper[4763]: I1205 12:03:20.169529 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6028549-2dc7-44a5-b84c-fb74585f3b85" containerName="extract" Dec 05 12:03:20 crc kubenswrapper[4763]: I1205 12:03:20.170543 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nnthd" Dec 05 12:03:20 crc kubenswrapper[4763]: I1205 12:03:20.227017 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nnthd"] Dec 05 12:03:20 crc kubenswrapper[4763]: I1205 12:03:20.230474 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6c92224-ab9a-4803-8f97-d2cb231847b4-catalog-content\") pod \"community-operators-nnthd\" (UID: \"a6c92224-ab9a-4803-8f97-d2cb231847b4\") " pod="openshift-marketplace/community-operators-nnthd" Dec 05 12:03:20 crc kubenswrapper[4763]: I1205 12:03:20.230577 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptqzj\" (UniqueName: \"kubernetes.io/projected/a6c92224-ab9a-4803-8f97-d2cb231847b4-kube-api-access-ptqzj\") pod \"community-operators-nnthd\" (UID: \"a6c92224-ab9a-4803-8f97-d2cb231847b4\") " pod="openshift-marketplace/community-operators-nnthd" Dec 05 12:03:20 crc kubenswrapper[4763]: I1205 12:03:20.230633 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6c92224-ab9a-4803-8f97-d2cb231847b4-utilities\") pod \"community-operators-nnthd\" (UID: \"a6c92224-ab9a-4803-8f97-d2cb231847b4\") " pod="openshift-marketplace/community-operators-nnthd" Dec 05 12:03:20 crc kubenswrapper[4763]: I1205 12:03:20.332301 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6c92224-ab9a-4803-8f97-d2cb231847b4-utilities\") pod \"community-operators-nnthd\" (UID: \"a6c92224-ab9a-4803-8f97-d2cb231847b4\") " 
pod="openshift-marketplace/community-operators-nnthd" Dec 05 12:03:20 crc kubenswrapper[4763]: I1205 12:03:20.332430 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6c92224-ab9a-4803-8f97-d2cb231847b4-catalog-content\") pod \"community-operators-nnthd\" (UID: \"a6c92224-ab9a-4803-8f97-d2cb231847b4\") " pod="openshift-marketplace/community-operators-nnthd" Dec 05 12:03:20 crc kubenswrapper[4763]: I1205 12:03:20.332465 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptqzj\" (UniqueName: \"kubernetes.io/projected/a6c92224-ab9a-4803-8f97-d2cb231847b4-kube-api-access-ptqzj\") pod \"community-operators-nnthd\" (UID: \"a6c92224-ab9a-4803-8f97-d2cb231847b4\") " pod="openshift-marketplace/community-operators-nnthd" Dec 05 12:03:20 crc kubenswrapper[4763]: I1205 12:03:20.332983 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6c92224-ab9a-4803-8f97-d2cb231847b4-utilities\") pod \"community-operators-nnthd\" (UID: \"a6c92224-ab9a-4803-8f97-d2cb231847b4\") " pod="openshift-marketplace/community-operators-nnthd" Dec 05 12:03:20 crc kubenswrapper[4763]: I1205 12:03:20.332995 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6c92224-ab9a-4803-8f97-d2cb231847b4-catalog-content\") pod \"community-operators-nnthd\" (UID: \"a6c92224-ab9a-4803-8f97-d2cb231847b4\") " pod="openshift-marketplace/community-operators-nnthd" Dec 05 12:03:20 crc kubenswrapper[4763]: I1205 12:03:20.351804 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptqzj\" (UniqueName: \"kubernetes.io/projected/a6c92224-ab9a-4803-8f97-d2cb231847b4-kube-api-access-ptqzj\") pod \"community-operators-nnthd\" (UID: \"a6c92224-ab9a-4803-8f97-d2cb231847b4\") " pod="openshift-marketplace/community-operators-nnthd" Dec 05 12:03:20 crc kubenswrapper[4763]: I1205 12:03:20.483955 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nnthd" Dec 05 12:03:20 crc kubenswrapper[4763]: I1205 12:03:20.774199 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nnthd"] Dec 05 12:03:21 crc kubenswrapper[4763]: I1205 12:03:21.550199 4763 generic.go:334] "Generic (PLEG): container finished" podID="a6c92224-ab9a-4803-8f97-d2cb231847b4" containerID="8929dac8e1835bb874d5b3b3e32d9db8f80eecea0907e3dcc2bfa44643188c03" exitCode=0 Dec 05 12:03:21 crc kubenswrapper[4763]: I1205 12:03:21.550268 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nnthd" event={"ID":"a6c92224-ab9a-4803-8f97-d2cb231847b4","Type":"ContainerDied","Data":"8929dac8e1835bb874d5b3b3e32d9db8f80eecea0907e3dcc2bfa44643188c03"} Dec 05 12:03:21 crc kubenswrapper[4763]: I1205 12:03:21.550497 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nnthd" event={"ID":"a6c92224-ab9a-4803-8f97-d2cb231847b4","Type":"ContainerStarted","Data":"6bbe5c9c453ed3e5aa9b66e28287371c1ca61d17bac45c10c4d81b86a8581509"} Dec 05 12:03:23 crc kubenswrapper[4763]: I1205 12:03:23.564982 4763 generic.go:334] "Generic (PLEG): container finished" podID="a6c92224-ab9a-4803-8f97-d2cb231847b4" containerID="f80c82f3b3bff34ade121b631428a443243b86f95187da3b5b27a95411ca1034" exitCode=0 Dec 05 12:03:23 crc kubenswrapper[4763]: I1205 12:03:23.565079 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nnthd" event={"ID":"a6c92224-ab9a-4803-8f97-d2cb231847b4","Type":"ContainerDied","Data":"f80c82f3b3bff34ade121b631428a443243b86f95187da3b5b27a95411ca1034"} Dec 05 12:03:24 crc kubenswrapper[4763]: I1205 12:03:24.573262 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nnthd" event={"ID":"a6c92224-ab9a-4803-8f97-d2cb231847b4","Type":"ContainerStarted","Data":"c48166a80c98272667c473e3fd0a17a38a3b7234b0543b5759ed68dd6cde94ca"} Dec 05 12:03:25 crc kubenswrapper[4763]: I1205 12:03:25.444559 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-84d94bbc7d-rf87g"] Dec 05 12:03:25 crc kubenswrapper[4763]: I1205 12:03:25.445248 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-84d94bbc7d-rf87g" Dec 05 12:03:25 crc kubenswrapper[4763]: W1205 12:03:25.446981 4763 reflector.go:561] object-"metallb-system"/"metallb-operator-controller-manager-service-cert": failed to list *v1.Secret: secrets "metallb-operator-controller-manager-service-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Dec 05 12:03:25 crc kubenswrapper[4763]: E1205 12:03:25.447033 4763 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"metallb-operator-controller-manager-service-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"metallb-operator-controller-manager-service-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 05 12:03:25 crc kubenswrapper[4763]: W1205 12:03:25.447250 4763 reflector.go:561] object-"metallb-system"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Dec 05 12:03:25 crc kubenswrapper[4763]: W1205 12:03:25.447261 4763 reflector.go:561] object-"metallb-system"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Dec 05 12:03:25 crc kubenswrapper[4763]: E1205 12:03:25.447277 4763 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 05 12:03:25 crc kubenswrapper[4763]: E1205 12:03:25.447289 4763 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 05 12:03:25 crc kubenswrapper[4763]: W1205 12:03:25.447413 4763 reflector.go:561] object-"metallb-system"/"manager-account-dockercfg-2nxks": failed to list *v1.Secret: secrets "manager-account-dockercfg-2nxks" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Dec 05 12:03:25 crc kubenswrapper[4763]: E1205 12:03:25.447458 4763 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"manager-account-dockercfg-2nxks\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"manager-account-dockercfg-2nxks\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 
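
Annotation — the "is forbidden ... no relationship found between node 'crc' and this object" warnings come from the node authorizer: the kubelet may read a secret or configmap only once a pod referencing it is bound to its node, and for a freshly scheduled pod there is a short window before that binding propagates. The reflector retries, and the "Caches populated" lines roughly a second later show each failure was transient. A small Go filter that pairs the two from a journal dump on stdin (assumes the message formats shown above):

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

func main() {
	denied := regexp.MustCompile(`failed to list \*v1\.(Secret|ConfigMap): (?:secrets|configmaps) "([^"]+)" is forbidden`)
	populated := regexp.MustCompile(`Caches populated .* from object-"[^"]+"/"([^"]+)"`)
	pending := map[string]bool{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 1<<20), 1<<20)
	for sc.Scan() {
		line := sc.Text()
		if m := denied.FindStringSubmatch(line); m != nil {
			pending[m[2]] = true // node authorizer rejection seen
		} else if m := populated.FindStringSubmatch(line); m != nil && pending[m[1]] {
			fmt.Println("recovered:", m[1]) // cache later populated
			delete(pending, m[1])
		}
	}
	for name := range pending {
		fmt.Println("still denied at end of log:", name)
	}
}
```
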
Dec 05 12:03:25 crc kubenswrapper[4763]: E1205 12:03:25.447458 4763 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"manager-account-dockercfg-2nxks\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"manager-account-dockercfg-2nxks\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Dec 05 12:03:25 crc kubenswrapper[4763]: I1205 12:03:25.448176 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Dec 05 12:03:25 crc kubenswrapper[4763]: I1205 12:03:25.459501 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-84d94bbc7d-rf87g"]
Dec 05 12:03:25 crc kubenswrapper[4763]: I1205 12:03:25.500857 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d47b8a4e-ccc5-41e4-855b-86fee8fed449-webhook-cert\") pod \"metallb-operator-controller-manager-84d94bbc7d-rf87g\" (UID: \"d47b8a4e-ccc5-41e4-855b-86fee8fed449\") " pod="metallb-system/metallb-operator-controller-manager-84d94bbc7d-rf87g"
Dec 05 12:03:25 crc kubenswrapper[4763]: I1205 12:03:25.500917 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p782t\" (UniqueName: \"kubernetes.io/projected/d47b8a4e-ccc5-41e4-855b-86fee8fed449-kube-api-access-p782t\") pod \"metallb-operator-controller-manager-84d94bbc7d-rf87g\" (UID: \"d47b8a4e-ccc5-41e4-855b-86fee8fed449\") " pod="metallb-system/metallb-operator-controller-manager-84d94bbc7d-rf87g"
Dec 05 12:03:25 crc kubenswrapper[4763]: I1205 12:03:25.500945 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d47b8a4e-ccc5-41e4-855b-86fee8fed449-apiservice-cert\") pod \"metallb-operator-controller-manager-84d94bbc7d-rf87g\" (UID: \"d47b8a4e-ccc5-41e4-855b-86fee8fed449\") " pod="metallb-system/metallb-operator-controller-manager-84d94bbc7d-rf87g"
Dec 05 12:03:25 crc kubenswrapper[4763]: I1205 12:03:25.601770 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d47b8a4e-ccc5-41e4-855b-86fee8fed449-webhook-cert\") pod \"metallb-operator-controller-manager-84d94bbc7d-rf87g\" (UID: \"d47b8a4e-ccc5-41e4-855b-86fee8fed449\") " pod="metallb-system/metallb-operator-controller-manager-84d94bbc7d-rf87g"
Dec 05 12:03:25 crc kubenswrapper[4763]: I1205 12:03:25.601834 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p782t\" (UniqueName: \"kubernetes.io/projected/d47b8a4e-ccc5-41e4-855b-86fee8fed449-kube-api-access-p782t\") pod \"metallb-operator-controller-manager-84d94bbc7d-rf87g\" (UID: \"d47b8a4e-ccc5-41e4-855b-86fee8fed449\") " pod="metallb-system/metallb-operator-controller-manager-84d94bbc7d-rf87g"
Dec 05 12:03:25 crc kubenswrapper[4763]: I1205 12:03:25.601865 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d47b8a4e-ccc5-41e4-855b-86fee8fed449-apiservice-cert\") pod \"metallb-operator-controller-manager-84d94bbc7d-rf87g\" (UID: \"d47b8a4e-ccc5-41e4-855b-86fee8fed449\") " pod="metallb-system/metallb-operator-controller-manager-84d94bbc7d-rf87g"
Dec 05 12:03:25 crc kubenswrapper[4763]: I1205 12:03:25.607583 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nnthd" podStartSLOduration=3.168800899 podStartE2EDuration="5.607563959s" podCreationTimestamp="2025-12-05 12:03:20 +0000 UTC" firstStartedPulling="2025-12-05 12:03:21.551657097 +0000 UTC m=+886.044371820" lastFinishedPulling="2025-12-05 12:03:23.990420157 +0000 UTC m=+888.483134880" observedRunningTime="2025-12-05 12:03:25.60356602 +0000 UTC m=+890.096280743" watchObservedRunningTime="2025-12-05 12:03:25.607563959 +0000 UTC m=+890.100278682"
Dec 05 12:03:25 crc kubenswrapper[4763]: I1205 12:03:25.814906 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-b69b886bc-52sm5"]
Dec 05 12:03:25 crc kubenswrapper[4763]: I1205 12:03:25.815871 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-b69b886bc-52sm5"
Dec 05 12:03:25 crc kubenswrapper[4763]: I1205 12:03:25.818152 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Dec 05 12:03:25 crc kubenswrapper[4763]: I1205 12:03:25.818534 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-24p6c"
Dec 05 12:03:25 crc kubenswrapper[4763]: I1205 12:03:25.818699 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Dec 05 12:03:25 crc kubenswrapper[4763]: I1205 12:03:25.845229 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-b69b886bc-52sm5"]
Dec 05 12:03:25 crc kubenswrapper[4763]: I1205 12:03:25.904570 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e3b7dc32-b6b1-4087-9518-da66dd2c1839-webhook-cert\") pod \"metallb-operator-webhook-server-b69b886bc-52sm5\" (UID: \"e3b7dc32-b6b1-4087-9518-da66dd2c1839\") " pod="metallb-system/metallb-operator-webhook-server-b69b886bc-52sm5"
Dec 05 12:03:25 crc kubenswrapper[4763]: I1205 12:03:25.904616 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk2s2\" (UniqueName: \"kubernetes.io/projected/e3b7dc32-b6b1-4087-9518-da66dd2c1839-kube-api-access-vk2s2\") pod \"metallb-operator-webhook-server-b69b886bc-52sm5\" (UID: \"e3b7dc32-b6b1-4087-9518-da66dd2c1839\") " pod="metallb-system/metallb-operator-webhook-server-b69b886bc-52sm5"
Dec 05 12:03:25 crc kubenswrapper[4763]: I1205 12:03:25.904640 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e3b7dc32-b6b1-4087-9518-da66dd2c1839-apiservice-cert\") pod \"metallb-operator-webhook-server-b69b886bc-52sm5\" (UID: \"e3b7dc32-b6b1-4087-9518-da66dd2c1839\") " pod="metallb-system/metallb-operator-webhook-server-b69b886bc-52sm5"
Dec 05 12:03:26 crc kubenswrapper[4763]: I1205 12:03:26.006624 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e3b7dc32-b6b1-4087-9518-da66dd2c1839-webhook-cert\") pod \"metallb-operator-webhook-server-b69b886bc-52sm5\" (UID: \"e3b7dc32-b6b1-4087-9518-da66dd2c1839\") " pod="metallb-system/metallb-operator-webhook-server-b69b886bc-52sm5"
Dec 05 12:03:26 crc kubenswrapper[4763]: I1205 12:03:26.006674 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk2s2\" (UniqueName: \"kubernetes.io/projected/e3b7dc32-b6b1-4087-9518-da66dd2c1839-kube-api-access-vk2s2\") pod \"metallb-operator-webhook-server-b69b886bc-52sm5\" (UID: \"e3b7dc32-b6b1-4087-9518-da66dd2c1839\") " pod="metallb-system/metallb-operator-webhook-server-b69b886bc-52sm5"
Dec 05 12:03:26 crc kubenswrapper[4763]: I1205 12:03:26.006709 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e3b7dc32-b6b1-4087-9518-da66dd2c1839-apiservice-cert\") pod \"metallb-operator-webhook-server-b69b886bc-52sm5\" (UID: \"e3b7dc32-b6b1-4087-9518-da66dd2c1839\") " pod="metallb-system/metallb-operator-webhook-server-b69b886bc-52sm5"
Dec 05 12:03:26 crc kubenswrapper[4763]: I1205 12:03:26.013090 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e3b7dc32-b6b1-4087-9518-da66dd2c1839-webhook-cert\") pod \"metallb-operator-webhook-server-b69b886bc-52sm5\" (UID: \"e3b7dc32-b6b1-4087-9518-da66dd2c1839\") " pod="metallb-system/metallb-operator-webhook-server-b69b886bc-52sm5"
Dec 05 12:03:26 crc kubenswrapper[4763]: I1205 12:03:26.014447 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e3b7dc32-b6b1-4087-9518-da66dd2c1839-apiservice-cert\") pod \"metallb-operator-webhook-server-b69b886bc-52sm5\" (UID: \"e3b7dc32-b6b1-4087-9518-da66dd2c1839\") " pod="metallb-system/metallb-operator-webhook-server-b69b886bc-52sm5"
Dec 05 12:03:26 crc kubenswrapper[4763]: I1205 12:03:26.500514 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Dec 05 12:03:26 crc kubenswrapper[4763]: I1205 12:03:26.506075 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d47b8a4e-ccc5-41e4-855b-86fee8fed449-webhook-cert\") pod \"metallb-operator-controller-manager-84d94bbc7d-rf87g\" (UID: \"d47b8a4e-ccc5-41e4-855b-86fee8fed449\") " pod="metallb-system/metallb-operator-controller-manager-84d94bbc7d-rf87g"
Dec 05 12:03:26 crc kubenswrapper[4763]: I1205 12:03:26.506089 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d47b8a4e-ccc5-41e4-855b-86fee8fed449-apiservice-cert\") pod \"metallb-operator-controller-manager-84d94bbc7d-rf87g\" (UID: \"d47b8a4e-ccc5-41e4-855b-86fee8fed449\") " pod="metallb-system/metallb-operator-controller-manager-84d94bbc7d-rf87g"
Dec 05 12:03:26 crc kubenswrapper[4763]: I1205 12:03:26.525796 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Dec 05 12:03:26 crc kubenswrapper[4763]: I1205 12:03:26.843217 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-2nxks"
Dec 05 12:03:26 crc kubenswrapper[4763]: I1205 12:03:26.921355 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Dec 05 12:03:26 crc kubenswrapper[4763]: I1205 12:03:26.934043 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk2s2\" (UniqueName: \"kubernetes.io/projected/e3b7dc32-b6b1-4087-9518-da66dd2c1839-kube-api-access-vk2s2\") pod \"metallb-operator-webhook-server-b69b886bc-52sm5\" (UID: \"e3b7dc32-b6b1-4087-9518-da66dd2c1839\") " pod="metallb-system/metallb-operator-webhook-server-b69b886bc-52sm5"
\"kubernetes.io/projected/d47b8a4e-ccc5-41e4-855b-86fee8fed449-kube-api-access-p782t\") pod \"metallb-operator-controller-manager-84d94bbc7d-rf87g\" (UID: \"d47b8a4e-ccc5-41e4-855b-86fee8fed449\") " pod="metallb-system/metallb-operator-controller-manager-84d94bbc7d-rf87g" Dec 05 12:03:26 crc kubenswrapper[4763]: I1205 12:03:26.968890 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-84d94bbc7d-rf87g" Dec 05 12:03:27 crc kubenswrapper[4763]: I1205 12:03:27.029574 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-b69b886bc-52sm5" Dec 05 12:03:27 crc kubenswrapper[4763]: I1205 12:03:27.419784 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-84d94bbc7d-rf87g"] Dec 05 12:03:27 crc kubenswrapper[4763]: W1205 12:03:27.424607 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd47b8a4e_ccc5_41e4_855b_86fee8fed449.slice/crio-88d181c0b5fb6b96dc600e3f53f8987506ef3ace37c56800ee1bff9e568168fa WatchSource:0}: Error finding container 88d181c0b5fb6b96dc600e3f53f8987506ef3ace37c56800ee1bff9e568168fa: Status 404 returned error can't find the container with id 88d181c0b5fb6b96dc600e3f53f8987506ef3ace37c56800ee1bff9e568168fa Dec 05 12:03:27 crc kubenswrapper[4763]: I1205 12:03:27.467365 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-b69b886bc-52sm5"] Dec 05 12:03:27 crc kubenswrapper[4763]: I1205 12:03:27.606565 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-b69b886bc-52sm5" event={"ID":"e3b7dc32-b6b1-4087-9518-da66dd2c1839","Type":"ContainerStarted","Data":"81602bc69cacf7af73f1a7da5754f6d96fe529e85874e1b9108a46e0af80d711"} Dec 05 12:03:27 crc kubenswrapper[4763]: I1205 12:03:27.608030 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-84d94bbc7d-rf87g" event={"ID":"d47b8a4e-ccc5-41e4-855b-86fee8fed449","Type":"ContainerStarted","Data":"88d181c0b5fb6b96dc600e3f53f8987506ef3ace37c56800ee1bff9e568168fa"} Dec 05 12:03:30 crc kubenswrapper[4763]: I1205 12:03:30.484273 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nnthd" Dec 05 12:03:30 crc kubenswrapper[4763]: I1205 12:03:30.484513 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nnthd" Dec 05 12:03:30 crc kubenswrapper[4763]: I1205 12:03:30.524525 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nnthd" Dec 05 12:03:30 crc kubenswrapper[4763]: I1205 12:03:30.695087 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nnthd" Dec 05 12:03:32 crc kubenswrapper[4763]: I1205 12:03:32.160449 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nnthd"] Dec 05 12:03:32 crc kubenswrapper[4763]: I1205 12:03:32.641447 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nnthd" podUID="a6c92224-ab9a-4803-8f97-d2cb231847b4" containerName="registry-server" 
containerID="cri-o://c48166a80c98272667c473e3fd0a17a38a3b7234b0543b5759ed68dd6cde94ca" gracePeriod=2 Dec 05 12:03:33 crc kubenswrapper[4763]: I1205 12:03:33.654618 4763 generic.go:334] "Generic (PLEG): container finished" podID="a6c92224-ab9a-4803-8f97-d2cb231847b4" containerID="c48166a80c98272667c473e3fd0a17a38a3b7234b0543b5759ed68dd6cde94ca" exitCode=0 Dec 05 12:03:33 crc kubenswrapper[4763]: I1205 12:03:33.654694 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nnthd" event={"ID":"a6c92224-ab9a-4803-8f97-d2cb231847b4","Type":"ContainerDied","Data":"c48166a80c98272667c473e3fd0a17a38a3b7234b0543b5759ed68dd6cde94ca"} Dec 05 12:03:33 crc kubenswrapper[4763]: I1205 12:03:33.726090 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nnthd" Dec 05 12:03:33 crc kubenswrapper[4763]: I1205 12:03:33.740744 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6c92224-ab9a-4803-8f97-d2cb231847b4-catalog-content\") pod \"a6c92224-ab9a-4803-8f97-d2cb231847b4\" (UID: \"a6c92224-ab9a-4803-8f97-d2cb231847b4\") " Dec 05 12:03:33 crc kubenswrapper[4763]: I1205 12:03:33.741001 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6c92224-ab9a-4803-8f97-d2cb231847b4-utilities\") pod \"a6c92224-ab9a-4803-8f97-d2cb231847b4\" (UID: \"a6c92224-ab9a-4803-8f97-d2cb231847b4\") " Dec 05 12:03:33 crc kubenswrapper[4763]: I1205 12:03:33.741092 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptqzj\" (UniqueName: \"kubernetes.io/projected/a6c92224-ab9a-4803-8f97-d2cb231847b4-kube-api-access-ptqzj\") pod \"a6c92224-ab9a-4803-8f97-d2cb231847b4\" (UID: \"a6c92224-ab9a-4803-8f97-d2cb231847b4\") " Dec 05 12:03:33 crc kubenswrapper[4763]: I1205 12:03:33.742184 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6c92224-ab9a-4803-8f97-d2cb231847b4-utilities" (OuterVolumeSpecName: "utilities") pod "a6c92224-ab9a-4803-8f97-d2cb231847b4" (UID: "a6c92224-ab9a-4803-8f97-d2cb231847b4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:03:33 crc kubenswrapper[4763]: I1205 12:03:33.751352 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6c92224-ab9a-4803-8f97-d2cb231847b4-kube-api-access-ptqzj" (OuterVolumeSpecName: "kube-api-access-ptqzj") pod "a6c92224-ab9a-4803-8f97-d2cb231847b4" (UID: "a6c92224-ab9a-4803-8f97-d2cb231847b4"). InnerVolumeSpecName "kube-api-access-ptqzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:03:33 crc kubenswrapper[4763]: I1205 12:03:33.808984 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6c92224-ab9a-4803-8f97-d2cb231847b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a6c92224-ab9a-4803-8f97-d2cb231847b4" (UID: "a6c92224-ab9a-4803-8f97-d2cb231847b4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:03:33 crc kubenswrapper[4763]: I1205 12:03:33.842094 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6c92224-ab9a-4803-8f97-d2cb231847b4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 12:03:33 crc kubenswrapper[4763]: I1205 12:03:33.842163 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6c92224-ab9a-4803-8f97-d2cb231847b4-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 12:03:33 crc kubenswrapper[4763]: I1205 12:03:33.842174 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptqzj\" (UniqueName: \"kubernetes.io/projected/a6c92224-ab9a-4803-8f97-d2cb231847b4-kube-api-access-ptqzj\") on node \"crc\" DevicePath \"\"" Dec 05 12:03:34 crc kubenswrapper[4763]: I1205 12:03:34.663620 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nnthd" event={"ID":"a6c92224-ab9a-4803-8f97-d2cb231847b4","Type":"ContainerDied","Data":"6bbe5c9c453ed3e5aa9b66e28287371c1ca61d17bac45c10c4d81b86a8581509"} Dec 05 12:03:34 crc kubenswrapper[4763]: I1205 12:03:34.663682 4763 scope.go:117] "RemoveContainer" containerID="c48166a80c98272667c473e3fd0a17a38a3b7234b0543b5759ed68dd6cde94ca" Dec 05 12:03:34 crc kubenswrapper[4763]: I1205 12:03:34.663683 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nnthd" Dec 05 12:03:34 crc kubenswrapper[4763]: I1205 12:03:34.665108 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-84d94bbc7d-rf87g" event={"ID":"d47b8a4e-ccc5-41e4-855b-86fee8fed449","Type":"ContainerStarted","Data":"d99b18b36732aab3dc96736df0adce91a7994fd76be8e549b9ad98e0fde87aa7"} Dec 05 12:03:34 crc kubenswrapper[4763]: I1205 12:03:34.665238 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-84d94bbc7d-rf87g" Dec 05 12:03:34 crc kubenswrapper[4763]: I1205 12:03:34.666552 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-b69b886bc-52sm5" event={"ID":"e3b7dc32-b6b1-4087-9518-da66dd2c1839","Type":"ContainerStarted","Data":"eda4ba44ef1570d949ebd5fdc026303a71b5ef85efcd2c62d59f95ec2ba665e6"} Dec 05 12:03:34 crc kubenswrapper[4763]: I1205 12:03:34.666698 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-b69b886bc-52sm5" Dec 05 12:03:34 crc kubenswrapper[4763]: I1205 12:03:34.684374 4763 scope.go:117] "RemoveContainer" containerID="f80c82f3b3bff34ade121b631428a443243b86f95187da3b5b27a95411ca1034" Dec 05 12:03:34 crc kubenswrapper[4763]: I1205 12:03:34.705289 4763 scope.go:117] "RemoveContainer" containerID="8929dac8e1835bb874d5b3b3e32d9db8f80eecea0907e3dcc2bfa44643188c03" Dec 05 12:03:34 crc kubenswrapper[4763]: I1205 12:03:34.707061 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-84d94bbc7d-rf87g" podStartSLOduration=3.610805577 podStartE2EDuration="9.707047869s" podCreationTimestamp="2025-12-05 12:03:25 +0000 UTC" firstStartedPulling="2025-12-05 12:03:27.427464669 +0000 UTC m=+891.920179392" lastFinishedPulling="2025-12-05 12:03:33.523706961 +0000 UTC m=+898.016421684" observedRunningTime="2025-12-05 12:03:34.692388799 
+0000 UTC m=+899.185103532" watchObservedRunningTime="2025-12-05 12:03:34.707047869 +0000 UTC m=+899.199762602" Dec 05 12:03:34 crc kubenswrapper[4763]: I1205 12:03:34.732863 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-b69b886bc-52sm5" podStartSLOduration=3.6596547189999997 podStartE2EDuration="9.732843832s" podCreationTimestamp="2025-12-05 12:03:25 +0000 UTC" firstStartedPulling="2025-12-05 12:03:27.478262634 +0000 UTC m=+891.970977357" lastFinishedPulling="2025-12-05 12:03:33.551451747 +0000 UTC m=+898.044166470" observedRunningTime="2025-12-05 12:03:34.710666967 +0000 UTC m=+899.203381730" watchObservedRunningTime="2025-12-05 12:03:34.732843832 +0000 UTC m=+899.225558575" Dec 05 12:03:34 crc kubenswrapper[4763]: I1205 12:03:34.740083 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nnthd"] Dec 05 12:03:34 crc kubenswrapper[4763]: I1205 12:03:34.746335 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nnthd"] Dec 05 12:03:35 crc kubenswrapper[4763]: I1205 12:03:35.790792 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6c92224-ab9a-4803-8f97-d2cb231847b4" path="/var/lib/kubelet/pods/a6c92224-ab9a-4803-8f97-d2cb231847b4/volumes" Dec 05 12:03:39 crc kubenswrapper[4763]: I1205 12:03:39.572606 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hxk55"] Dec 05 12:03:39 crc kubenswrapper[4763]: E1205 12:03:39.573331 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6c92224-ab9a-4803-8f97-d2cb231847b4" containerName="extract-content" Dec 05 12:03:39 crc kubenswrapper[4763]: I1205 12:03:39.573344 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6c92224-ab9a-4803-8f97-d2cb231847b4" containerName="extract-content" Dec 05 12:03:39 crc kubenswrapper[4763]: E1205 12:03:39.573363 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6c92224-ab9a-4803-8f97-d2cb231847b4" containerName="extract-utilities" Dec 05 12:03:39 crc kubenswrapper[4763]: I1205 12:03:39.573369 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6c92224-ab9a-4803-8f97-d2cb231847b4" containerName="extract-utilities" Dec 05 12:03:39 crc kubenswrapper[4763]: E1205 12:03:39.573383 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6c92224-ab9a-4803-8f97-d2cb231847b4" containerName="registry-server" Dec 05 12:03:39 crc kubenswrapper[4763]: I1205 12:03:39.573390 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6c92224-ab9a-4803-8f97-d2cb231847b4" containerName="registry-server" Dec 05 12:03:39 crc kubenswrapper[4763]: I1205 12:03:39.573485 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6c92224-ab9a-4803-8f97-d2cb231847b4" containerName="registry-server" Dec 05 12:03:39 crc kubenswrapper[4763]: I1205 12:03:39.574552 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hxk55" Dec 05 12:03:39 crc kubenswrapper[4763]: I1205 12:03:39.589782 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hxk55"] Dec 05 12:03:39 crc kubenswrapper[4763]: I1205 12:03:39.643347 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16dc12c9-74a1-433b-b4b2-559e19b769bc-catalog-content\") pod \"certified-operators-hxk55\" (UID: \"16dc12c9-74a1-433b-b4b2-559e19b769bc\") " pod="openshift-marketplace/certified-operators-hxk55" Dec 05 12:03:39 crc kubenswrapper[4763]: I1205 12:03:39.643390 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86qtd\" (UniqueName: \"kubernetes.io/projected/16dc12c9-74a1-433b-b4b2-559e19b769bc-kube-api-access-86qtd\") pod \"certified-operators-hxk55\" (UID: \"16dc12c9-74a1-433b-b4b2-559e19b769bc\") " pod="openshift-marketplace/certified-operators-hxk55" Dec 05 12:03:39 crc kubenswrapper[4763]: I1205 12:03:39.643422 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16dc12c9-74a1-433b-b4b2-559e19b769bc-utilities\") pod \"certified-operators-hxk55\" (UID: \"16dc12c9-74a1-433b-b4b2-559e19b769bc\") " pod="openshift-marketplace/certified-operators-hxk55" Dec 05 12:03:39 crc kubenswrapper[4763]: I1205 12:03:39.745245 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16dc12c9-74a1-433b-b4b2-559e19b769bc-catalog-content\") pod \"certified-operators-hxk55\" (UID: \"16dc12c9-74a1-433b-b4b2-559e19b769bc\") " pod="openshift-marketplace/certified-operators-hxk55" Dec 05 12:03:39 crc kubenswrapper[4763]: I1205 12:03:39.745312 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86qtd\" (UniqueName: \"kubernetes.io/projected/16dc12c9-74a1-433b-b4b2-559e19b769bc-kube-api-access-86qtd\") pod \"certified-operators-hxk55\" (UID: \"16dc12c9-74a1-433b-b4b2-559e19b769bc\") " pod="openshift-marketplace/certified-operators-hxk55" Dec 05 12:03:39 crc kubenswrapper[4763]: I1205 12:03:39.745353 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16dc12c9-74a1-433b-b4b2-559e19b769bc-utilities\") pod \"certified-operators-hxk55\" (UID: \"16dc12c9-74a1-433b-b4b2-559e19b769bc\") " pod="openshift-marketplace/certified-operators-hxk55" Dec 05 12:03:39 crc kubenswrapper[4763]: I1205 12:03:39.745880 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16dc12c9-74a1-433b-b4b2-559e19b769bc-catalog-content\") pod \"certified-operators-hxk55\" (UID: \"16dc12c9-74a1-433b-b4b2-559e19b769bc\") " pod="openshift-marketplace/certified-operators-hxk55" Dec 05 12:03:39 crc kubenswrapper[4763]: I1205 12:03:39.745929 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16dc12c9-74a1-433b-b4b2-559e19b769bc-utilities\") pod \"certified-operators-hxk55\" (UID: \"16dc12c9-74a1-433b-b4b2-559e19b769bc\") " pod="openshift-marketplace/certified-operators-hxk55" Dec 05 12:03:39 crc kubenswrapper[4763]: I1205 12:03:39.771715 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-86qtd\" (UniqueName: \"kubernetes.io/projected/16dc12c9-74a1-433b-b4b2-559e19b769bc-kube-api-access-86qtd\") pod \"certified-operators-hxk55\" (UID: \"16dc12c9-74a1-433b-b4b2-559e19b769bc\") " pod="openshift-marketplace/certified-operators-hxk55" Dec 05 12:03:39 crc kubenswrapper[4763]: I1205 12:03:39.940574 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hxk55" Dec 05 12:03:40 crc kubenswrapper[4763]: W1205 12:03:40.393381 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16dc12c9_74a1_433b_b4b2_559e19b769bc.slice/crio-fab2bdd9fa05c2ed52db72a8e128a65c3c76da31dd484b0f577ba1b2e91e6ca8 WatchSource:0}: Error finding container fab2bdd9fa05c2ed52db72a8e128a65c3c76da31dd484b0f577ba1b2e91e6ca8: Status 404 returned error can't find the container with id fab2bdd9fa05c2ed52db72a8e128a65c3c76da31dd484b0f577ba1b2e91e6ca8 Dec 05 12:03:40 crc kubenswrapper[4763]: I1205 12:03:40.396474 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hxk55"] Dec 05 12:03:40 crc kubenswrapper[4763]: I1205 12:03:40.700882 4763 generic.go:334] "Generic (PLEG): container finished" podID="16dc12c9-74a1-433b-b4b2-559e19b769bc" containerID="9e97205bd429ceb65e38e1a63d19aa6d9857a1f3da0828cdde4f79bd4f1a06ff" exitCode=0 Dec 05 12:03:40 crc kubenswrapper[4763]: I1205 12:03:40.700924 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hxk55" event={"ID":"16dc12c9-74a1-433b-b4b2-559e19b769bc","Type":"ContainerDied","Data":"9e97205bd429ceb65e38e1a63d19aa6d9857a1f3da0828cdde4f79bd4f1a06ff"} Dec 05 12:03:40 crc kubenswrapper[4763]: I1205 12:03:40.700956 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hxk55" event={"ID":"16dc12c9-74a1-433b-b4b2-559e19b769bc","Type":"ContainerStarted","Data":"fab2bdd9fa05c2ed52db72a8e128a65c3c76da31dd484b0f577ba1b2e91e6ca8"} Dec 05 12:03:41 crc kubenswrapper[4763]: I1205 12:03:41.719325 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hxk55" event={"ID":"16dc12c9-74a1-433b-b4b2-559e19b769bc","Type":"ContainerStarted","Data":"f92f6d697794c3ed182f19d70c8945b72ccf9358f93ccf66c4bc60929c6f7929"} Dec 05 12:03:42 crc kubenswrapper[4763]: I1205 12:03:42.727528 4763 generic.go:334] "Generic (PLEG): container finished" podID="16dc12c9-74a1-433b-b4b2-559e19b769bc" containerID="f92f6d697794c3ed182f19d70c8945b72ccf9358f93ccf66c4bc60929c6f7929" exitCode=0 Dec 05 12:03:42 crc kubenswrapper[4763]: I1205 12:03:42.727575 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hxk55" event={"ID":"16dc12c9-74a1-433b-b4b2-559e19b769bc","Type":"ContainerDied","Data":"f92f6d697794c3ed182f19d70c8945b72ccf9358f93ccf66c4bc60929c6f7929"} Dec 05 12:03:43 crc kubenswrapper[4763]: I1205 12:03:43.736830 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hxk55" event={"ID":"16dc12c9-74a1-433b-b4b2-559e19b769bc","Type":"ContainerStarted","Data":"710faba8c16cdf9ed4e0bc131ecf0ec022efaaa3f502a569c30d7606410f3253"} Dec 05 12:03:43 crc kubenswrapper[4763]: I1205 12:03:43.756647 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hxk55" 
podStartSLOduration=2.2968515050000002 podStartE2EDuration="4.756631177s" podCreationTimestamp="2025-12-05 12:03:39 +0000 UTC" firstStartedPulling="2025-12-05 12:03:40.702253506 +0000 UTC m=+905.194968229" lastFinishedPulling="2025-12-05 12:03:43.162033168 +0000 UTC m=+907.654747901" observedRunningTime="2025-12-05 12:03:43.753143742 +0000 UTC m=+908.245858475" watchObservedRunningTime="2025-12-05 12:03:43.756631177 +0000 UTC m=+908.249345900" Dec 05 12:03:47 crc kubenswrapper[4763]: I1205 12:03:47.036649 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-b69b886bc-52sm5" Dec 05 12:03:49 crc kubenswrapper[4763]: I1205 12:03:49.940822 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hxk55" Dec 05 12:03:49 crc kubenswrapper[4763]: I1205 12:03:49.942148 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hxk55" Dec 05 12:03:49 crc kubenswrapper[4763]: I1205 12:03:49.981215 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hxk55" Dec 05 12:03:50 crc kubenswrapper[4763]: I1205 12:03:50.826746 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hxk55" Dec 05 12:03:52 crc kubenswrapper[4763]: I1205 12:03:52.363892 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hxk55"] Dec 05 12:03:52 crc kubenswrapper[4763]: I1205 12:03:52.788624 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hxk55" podUID="16dc12c9-74a1-433b-b4b2-559e19b769bc" containerName="registry-server" containerID="cri-o://710faba8c16cdf9ed4e0bc131ecf0ec022efaaa3f502a569c30d7606410f3253" gracePeriod=2 Dec 05 12:03:53 crc kubenswrapper[4763]: I1205 12:03:53.657191 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hxk55" Dec 05 12:03:53 crc kubenswrapper[4763]: I1205 12:03:53.719504 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16dc12c9-74a1-433b-b4b2-559e19b769bc-utilities\") pod \"16dc12c9-74a1-433b-b4b2-559e19b769bc\" (UID: \"16dc12c9-74a1-433b-b4b2-559e19b769bc\") " Dec 05 12:03:53 crc kubenswrapper[4763]: I1205 12:03:53.719590 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86qtd\" (UniqueName: \"kubernetes.io/projected/16dc12c9-74a1-433b-b4b2-559e19b769bc-kube-api-access-86qtd\") pod \"16dc12c9-74a1-433b-b4b2-559e19b769bc\" (UID: \"16dc12c9-74a1-433b-b4b2-559e19b769bc\") " Dec 05 12:03:53 crc kubenswrapper[4763]: I1205 12:03:53.719635 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16dc12c9-74a1-433b-b4b2-559e19b769bc-catalog-content\") pod \"16dc12c9-74a1-433b-b4b2-559e19b769bc\" (UID: \"16dc12c9-74a1-433b-b4b2-559e19b769bc\") " Dec 05 12:03:53 crc kubenswrapper[4763]: I1205 12:03:53.720545 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16dc12c9-74a1-433b-b4b2-559e19b769bc-utilities" (OuterVolumeSpecName: "utilities") pod "16dc12c9-74a1-433b-b4b2-559e19b769bc" (UID: "16dc12c9-74a1-433b-b4b2-559e19b769bc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:03:53 crc kubenswrapper[4763]: I1205 12:03:53.727931 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16dc12c9-74a1-433b-b4b2-559e19b769bc-kube-api-access-86qtd" (OuterVolumeSpecName: "kube-api-access-86qtd") pod "16dc12c9-74a1-433b-b4b2-559e19b769bc" (UID: "16dc12c9-74a1-433b-b4b2-559e19b769bc"). InnerVolumeSpecName "kube-api-access-86qtd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:03:53 crc kubenswrapper[4763]: I1205 12:03:53.766027 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16dc12c9-74a1-433b-b4b2-559e19b769bc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "16dc12c9-74a1-433b-b4b2-559e19b769bc" (UID: "16dc12c9-74a1-433b-b4b2-559e19b769bc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:03:53 crc kubenswrapper[4763]: I1205 12:03:53.797260 4763 generic.go:334] "Generic (PLEG): container finished" podID="16dc12c9-74a1-433b-b4b2-559e19b769bc" containerID="710faba8c16cdf9ed4e0bc131ecf0ec022efaaa3f502a569c30d7606410f3253" exitCode=0 Dec 05 12:03:53 crc kubenswrapper[4763]: I1205 12:03:53.797309 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hxk55" event={"ID":"16dc12c9-74a1-433b-b4b2-559e19b769bc","Type":"ContainerDied","Data":"710faba8c16cdf9ed4e0bc131ecf0ec022efaaa3f502a569c30d7606410f3253"} Dec 05 12:03:53 crc kubenswrapper[4763]: I1205 12:03:53.797343 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hxk55" event={"ID":"16dc12c9-74a1-433b-b4b2-559e19b769bc","Type":"ContainerDied","Data":"fab2bdd9fa05c2ed52db72a8e128a65c3c76da31dd484b0f577ba1b2e91e6ca8"} Dec 05 12:03:53 crc kubenswrapper[4763]: I1205 12:03:53.797363 4763 scope.go:117] "RemoveContainer" containerID="710faba8c16cdf9ed4e0bc131ecf0ec022efaaa3f502a569c30d7606410f3253" Dec 05 12:03:53 crc kubenswrapper[4763]: I1205 12:03:53.797400 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hxk55" Dec 05 12:03:53 crc kubenswrapper[4763]: I1205 12:03:53.815868 4763 scope.go:117] "RemoveContainer" containerID="f92f6d697794c3ed182f19d70c8945b72ccf9358f93ccf66c4bc60929c6f7929" Dec 05 12:03:53 crc kubenswrapper[4763]: I1205 12:03:53.829290 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16dc12c9-74a1-433b-b4b2-559e19b769bc-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 12:03:53 crc kubenswrapper[4763]: I1205 12:03:53.829328 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16dc12c9-74a1-433b-b4b2-559e19b769bc-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 12:03:53 crc kubenswrapper[4763]: I1205 12:03:53.829344 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86qtd\" (UniqueName: \"kubernetes.io/projected/16dc12c9-74a1-433b-b4b2-559e19b769bc-kube-api-access-86qtd\") on node \"crc\" DevicePath \"\"" Dec 05 12:03:53 crc kubenswrapper[4763]: I1205 12:03:53.833064 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hxk55"] Dec 05 12:03:53 crc kubenswrapper[4763]: I1205 12:03:53.837802 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hxk55"] Dec 05 12:03:53 crc kubenswrapper[4763]: I1205 12:03:53.856625 4763 scope.go:117] "RemoveContainer" containerID="9e97205bd429ceb65e38e1a63d19aa6d9857a1f3da0828cdde4f79bd4f1a06ff" Dec 05 12:03:53 crc kubenswrapper[4763]: I1205 12:03:53.878581 4763 scope.go:117] "RemoveContainer" containerID="710faba8c16cdf9ed4e0bc131ecf0ec022efaaa3f502a569c30d7606410f3253" Dec 05 12:03:53 crc kubenswrapper[4763]: E1205 12:03:53.879157 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"710faba8c16cdf9ed4e0bc131ecf0ec022efaaa3f502a569c30d7606410f3253\": container with ID starting with 710faba8c16cdf9ed4e0bc131ecf0ec022efaaa3f502a569c30d7606410f3253 not found: ID does not exist" containerID="710faba8c16cdf9ed4e0bc131ecf0ec022efaaa3f502a569c30d7606410f3253" Dec 05 12:03:53 crc kubenswrapper[4763]: I1205 12:03:53.879198 
4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"710faba8c16cdf9ed4e0bc131ecf0ec022efaaa3f502a569c30d7606410f3253"} err="failed to get container status \"710faba8c16cdf9ed4e0bc131ecf0ec022efaaa3f502a569c30d7606410f3253\": rpc error: code = NotFound desc = could not find container \"710faba8c16cdf9ed4e0bc131ecf0ec022efaaa3f502a569c30d7606410f3253\": container with ID starting with 710faba8c16cdf9ed4e0bc131ecf0ec022efaaa3f502a569c30d7606410f3253 not found: ID does not exist" Dec 05 12:03:53 crc kubenswrapper[4763]: I1205 12:03:53.879229 4763 scope.go:117] "RemoveContainer" containerID="f92f6d697794c3ed182f19d70c8945b72ccf9358f93ccf66c4bc60929c6f7929" Dec 05 12:03:53 crc kubenswrapper[4763]: E1205 12:03:53.879902 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f92f6d697794c3ed182f19d70c8945b72ccf9358f93ccf66c4bc60929c6f7929\": container with ID starting with f92f6d697794c3ed182f19d70c8945b72ccf9358f93ccf66c4bc60929c6f7929 not found: ID does not exist" containerID="f92f6d697794c3ed182f19d70c8945b72ccf9358f93ccf66c4bc60929c6f7929" Dec 05 12:03:53 crc kubenswrapper[4763]: I1205 12:03:53.879931 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f92f6d697794c3ed182f19d70c8945b72ccf9358f93ccf66c4bc60929c6f7929"} err="failed to get container status \"f92f6d697794c3ed182f19d70c8945b72ccf9358f93ccf66c4bc60929c6f7929\": rpc error: code = NotFound desc = could not find container \"f92f6d697794c3ed182f19d70c8945b72ccf9358f93ccf66c4bc60929c6f7929\": container with ID starting with f92f6d697794c3ed182f19d70c8945b72ccf9358f93ccf66c4bc60929c6f7929 not found: ID does not exist" Dec 05 12:03:53 crc kubenswrapper[4763]: I1205 12:03:53.879953 4763 scope.go:117] "RemoveContainer" containerID="9e97205bd429ceb65e38e1a63d19aa6d9857a1f3da0828cdde4f79bd4f1a06ff" Dec 05 12:03:53 crc kubenswrapper[4763]: E1205 12:03:53.880428 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e97205bd429ceb65e38e1a63d19aa6d9857a1f3da0828cdde4f79bd4f1a06ff\": container with ID starting with 9e97205bd429ceb65e38e1a63d19aa6d9857a1f3da0828cdde4f79bd4f1a06ff not found: ID does not exist" containerID="9e97205bd429ceb65e38e1a63d19aa6d9857a1f3da0828cdde4f79bd4f1a06ff" Dec 05 12:03:53 crc kubenswrapper[4763]: I1205 12:03:53.880493 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e97205bd429ceb65e38e1a63d19aa6d9857a1f3da0828cdde4f79bd4f1a06ff"} err="failed to get container status \"9e97205bd429ceb65e38e1a63d19aa6d9857a1f3da0828cdde4f79bd4f1a06ff\": rpc error: code = NotFound desc = could not find container \"9e97205bd429ceb65e38e1a63d19aa6d9857a1f3da0828cdde4f79bd4f1a06ff\": container with ID starting with 9e97205bd429ceb65e38e1a63d19aa6d9857a1f3da0828cdde4f79bd4f1a06ff not found: ID does not exist" Dec 05 12:03:55 crc kubenswrapper[4763]: I1205 12:03:55.792261 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16dc12c9-74a1-433b-b4b2-559e19b769bc" path="/var/lib/kubelet/pods/16dc12c9-74a1-433b-b4b2-559e19b769bc/volumes" Dec 05 12:04:06 crc kubenswrapper[4763]: I1205 12:04:06.972226 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-84d94bbc7d-rf87g" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.544096 4763 patch_prober.go:28] 
interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.544990 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.743286 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-ms42j"] Dec 05 12:04:07 crc kubenswrapper[4763]: E1205 12:04:07.743847 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16dc12c9-74a1-433b-b4b2-559e19b769bc" containerName="registry-server" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.743955 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="16dc12c9-74a1-433b-b4b2-559e19b769bc" containerName="registry-server" Dec 05 12:04:07 crc kubenswrapper[4763]: E1205 12:04:07.744051 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16dc12c9-74a1-433b-b4b2-559e19b769bc" containerName="extract-utilities" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.744132 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="16dc12c9-74a1-433b-b4b2-559e19b769bc" containerName="extract-utilities" Dec 05 12:04:07 crc kubenswrapper[4763]: E1205 12:04:07.744226 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16dc12c9-74a1-433b-b4b2-559e19b769bc" containerName="extract-content" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.744319 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="16dc12c9-74a1-433b-b4b2-559e19b769bc" containerName="extract-content" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.744528 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="16dc12c9-74a1-433b-b4b2-559e19b769bc" containerName="registry-server" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.747683 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-qv5hl"] Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.747902 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-ms42j" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.748559 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qv5hl" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.751006 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.751217 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.751498 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.752406 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-ngprh" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.769486 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-qv5hl"] Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.803346 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/537c61d5-e548-4c96-b7ed-24fcc061e9ac-reloader\") pod \"frr-k8s-ms42j\" (UID: \"537c61d5-e548-4c96-b7ed-24fcc061e9ac\") " pod="metallb-system/frr-k8s-ms42j" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.804051 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/537c61d5-e548-4c96-b7ed-24fcc061e9ac-frr-startup\") pod \"frr-k8s-ms42j\" (UID: \"537c61d5-e548-4c96-b7ed-24fcc061e9ac\") " pod="metallb-system/frr-k8s-ms42j" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.804110 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbf2f\" (UniqueName: \"kubernetes.io/projected/cfa0736f-2856-4cfd-810f-d8fcd2bea7f6-kube-api-access-vbf2f\") pod \"frr-k8s-webhook-server-7fcb986d4-qv5hl\" (UID: \"cfa0736f-2856-4cfd-810f-d8fcd2bea7f6\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qv5hl" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.804190 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/537c61d5-e548-4c96-b7ed-24fcc061e9ac-metrics-certs\") pod \"frr-k8s-ms42j\" (UID: \"537c61d5-e548-4c96-b7ed-24fcc061e9ac\") " pod="metallb-system/frr-k8s-ms42j" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.804296 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfa0736f-2856-4cfd-810f-d8fcd2bea7f6-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-qv5hl\" (UID: \"cfa0736f-2856-4cfd-810f-d8fcd2bea7f6\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qv5hl" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.804320 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/537c61d5-e548-4c96-b7ed-24fcc061e9ac-frr-sockets\") pod \"frr-k8s-ms42j\" (UID: \"537c61d5-e548-4c96-b7ed-24fcc061e9ac\") " pod="metallb-system/frr-k8s-ms42j" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.804705 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: 
\"kubernetes.io/empty-dir/537c61d5-e548-4c96-b7ed-24fcc061e9ac-metrics\") pod \"frr-k8s-ms42j\" (UID: \"537c61d5-e548-4c96-b7ed-24fcc061e9ac\") " pod="metallb-system/frr-k8s-ms42j" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.804742 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/537c61d5-e548-4c96-b7ed-24fcc061e9ac-frr-conf\") pod \"frr-k8s-ms42j\" (UID: \"537c61d5-e548-4c96-b7ed-24fcc061e9ac\") " pod="metallb-system/frr-k8s-ms42j" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.804922 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n96jb\" (UniqueName: \"kubernetes.io/projected/537c61d5-e548-4c96-b7ed-24fcc061e9ac-kube-api-access-n96jb\") pod \"frr-k8s-ms42j\" (UID: \"537c61d5-e548-4c96-b7ed-24fcc061e9ac\") " pod="metallb-system/frr-k8s-ms42j" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.837336 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-2k2k4"] Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.838262 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-2k2k4" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.840231 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.840825 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-fqnql" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.840900 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.844072 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.850960 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-scmzk"] Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.851929 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-scmzk" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.854417 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.878132 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-scmzk"] Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.906109 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfa0736f-2856-4cfd-810f-d8fcd2bea7f6-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-qv5hl\" (UID: \"cfa0736f-2856-4cfd-810f-d8fcd2bea7f6\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qv5hl" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.906153 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/537c61d5-e548-4c96-b7ed-24fcc061e9ac-frr-sockets\") pod \"frr-k8s-ms42j\" (UID: \"537c61d5-e548-4c96-b7ed-24fcc061e9ac\") " pod="metallb-system/frr-k8s-ms42j" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.906171 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/537c61d5-e548-4c96-b7ed-24fcc061e9ac-metrics\") pod \"frr-k8s-ms42j\" (UID: \"537c61d5-e548-4c96-b7ed-24fcc061e9ac\") " pod="metallb-system/frr-k8s-ms42j" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.906187 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/537c61d5-e548-4c96-b7ed-24fcc061e9ac-frr-conf\") pod \"frr-k8s-ms42j\" (UID: \"537c61d5-e548-4c96-b7ed-24fcc061e9ac\") " pod="metallb-system/frr-k8s-ms42j" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.906209 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9bsq\" (UniqueName: \"kubernetes.io/projected/6af0da26-fcd3-4eb1-97a2-e5beedf81d5b-kube-api-access-k9bsq\") pod \"speaker-2k2k4\" (UID: \"6af0da26-fcd3-4eb1-97a2-e5beedf81d5b\") " pod="metallb-system/speaker-2k2k4" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.906228 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6af0da26-fcd3-4eb1-97a2-e5beedf81d5b-memberlist\") pod \"speaker-2k2k4\" (UID: \"6af0da26-fcd3-4eb1-97a2-e5beedf81d5b\") " pod="metallb-system/speaker-2k2k4" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.906251 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n96jb\" (UniqueName: \"kubernetes.io/projected/537c61d5-e548-4c96-b7ed-24fcc061e9ac-kube-api-access-n96jb\") pod \"frr-k8s-ms42j\" (UID: \"537c61d5-e548-4c96-b7ed-24fcc061e9ac\") " pod="metallb-system/frr-k8s-ms42j" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.906269 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl6h9\" (UniqueName: \"kubernetes.io/projected/a34bf611-cb4c-44b4-bdf2-45a656edadc9-kube-api-access-zl6h9\") pod \"controller-f8648f98b-scmzk\" (UID: \"a34bf611-cb4c-44b4-bdf2-45a656edadc9\") " pod="metallb-system/controller-f8648f98b-scmzk" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.906288 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/537c61d5-e548-4c96-b7ed-24fcc061e9ac-reloader\") pod \"frr-k8s-ms42j\" (UID: \"537c61d5-e548-4c96-b7ed-24fcc061e9ac\") " pod="metallb-system/frr-k8s-ms42j" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.906325 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/537c61d5-e548-4c96-b7ed-24fcc061e9ac-frr-startup\") pod \"frr-k8s-ms42j\" (UID: \"537c61d5-e548-4c96-b7ed-24fcc061e9ac\") " pod="metallb-system/frr-k8s-ms42j" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.906368 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6af0da26-fcd3-4eb1-97a2-e5beedf81d5b-metallb-excludel2\") pod \"speaker-2k2k4\" (UID: \"6af0da26-fcd3-4eb1-97a2-e5beedf81d5b\") " pod="metallb-system/speaker-2k2k4" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.906387 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a34bf611-cb4c-44b4-bdf2-45a656edadc9-metrics-certs\") pod \"controller-f8648f98b-scmzk\" (UID: \"a34bf611-cb4c-44b4-bdf2-45a656edadc9\") " pod="metallb-system/controller-f8648f98b-scmzk" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.906405 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a34bf611-cb4c-44b4-bdf2-45a656edadc9-cert\") pod \"controller-f8648f98b-scmzk\" (UID: \"a34bf611-cb4c-44b4-bdf2-45a656edadc9\") " pod="metallb-system/controller-f8648f98b-scmzk" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.906426 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbf2f\" (UniqueName: \"kubernetes.io/projected/cfa0736f-2856-4cfd-810f-d8fcd2bea7f6-kube-api-access-vbf2f\") pod \"frr-k8s-webhook-server-7fcb986d4-qv5hl\" (UID: \"cfa0736f-2856-4cfd-810f-d8fcd2bea7f6\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qv5hl" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.906458 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6af0da26-fcd3-4eb1-97a2-e5beedf81d5b-metrics-certs\") pod \"speaker-2k2k4\" (UID: \"6af0da26-fcd3-4eb1-97a2-e5beedf81d5b\") " pod="metallb-system/speaker-2k2k4" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.906482 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/537c61d5-e548-4c96-b7ed-24fcc061e9ac-metrics-certs\") pod \"frr-k8s-ms42j\" (UID: \"537c61d5-e548-4c96-b7ed-24fcc061e9ac\") " pod="metallb-system/frr-k8s-ms42j" Dec 05 12:04:07 crc kubenswrapper[4763]: E1205 12:04:07.906601 4763 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.906639 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/537c61d5-e548-4c96-b7ed-24fcc061e9ac-frr-sockets\") pod \"frr-k8s-ms42j\" (UID: \"537c61d5-e548-4c96-b7ed-24fcc061e9ac\") " pod="metallb-system/frr-k8s-ms42j" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.907039 
4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/537c61d5-e548-4c96-b7ed-24fcc061e9ac-metrics\") pod \"frr-k8s-ms42j\" (UID: \"537c61d5-e548-4c96-b7ed-24fcc061e9ac\") " pod="metallb-system/frr-k8s-ms42j" Dec 05 12:04:07 crc kubenswrapper[4763]: E1205 12:04:07.906650 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/537c61d5-e548-4c96-b7ed-24fcc061e9ac-metrics-certs podName:537c61d5-e548-4c96-b7ed-24fcc061e9ac nodeName:}" failed. No retries permitted until 2025-12-05 12:04:08.40663491 +0000 UTC m=+932.899349633 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/537c61d5-e548-4c96-b7ed-24fcc061e9ac-metrics-certs") pod "frr-k8s-ms42j" (UID: "537c61d5-e548-4c96-b7ed-24fcc061e9ac") : secret "frr-k8s-certs-secret" not found Dec 05 12:04:07 crc kubenswrapper[4763]: E1205 12:04:07.907106 4763 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Dec 05 12:04:07 crc kubenswrapper[4763]: E1205 12:04:07.907150 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfa0736f-2856-4cfd-810f-d8fcd2bea7f6-cert podName:cfa0736f-2856-4cfd-810f-d8fcd2bea7f6 nodeName:}" failed. No retries permitted until 2025-12-05 12:04:08.407139003 +0000 UTC m=+932.899853736 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cfa0736f-2856-4cfd-810f-d8fcd2bea7f6-cert") pod "frr-k8s-webhook-server-7fcb986d4-qv5hl" (UID: "cfa0736f-2856-4cfd-810f-d8fcd2bea7f6") : secret "frr-k8s-webhook-server-cert" not found Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.907118 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/537c61d5-e548-4c96-b7ed-24fcc061e9ac-frr-conf\") pod \"frr-k8s-ms42j\" (UID: \"537c61d5-e548-4c96-b7ed-24fcc061e9ac\") " pod="metallb-system/frr-k8s-ms42j" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.907469 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/537c61d5-e548-4c96-b7ed-24fcc061e9ac-reloader\") pod \"frr-k8s-ms42j\" (UID: \"537c61d5-e548-4c96-b7ed-24fcc061e9ac\") " pod="metallb-system/frr-k8s-ms42j" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.908118 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/537c61d5-e548-4c96-b7ed-24fcc061e9ac-frr-startup\") pod \"frr-k8s-ms42j\" (UID: \"537c61d5-e548-4c96-b7ed-24fcc061e9ac\") " pod="metallb-system/frr-k8s-ms42j" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.927522 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n96jb\" (UniqueName: \"kubernetes.io/projected/537c61d5-e548-4c96-b7ed-24fcc061e9ac-kube-api-access-n96jb\") pod \"frr-k8s-ms42j\" (UID: \"537c61d5-e548-4c96-b7ed-24fcc061e9ac\") " pod="metallb-system/frr-k8s-ms42j" Dec 05 12:04:07 crc kubenswrapper[4763]: I1205 12:04:07.943428 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbf2f\" (UniqueName: \"kubernetes.io/projected/cfa0736f-2856-4cfd-810f-d8fcd2bea7f6-kube-api-access-vbf2f\") pod \"frr-k8s-webhook-server-7fcb986d4-qv5hl\" (UID: \"cfa0736f-2856-4cfd-810f-d8fcd2bea7f6\") " 
pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qv5hl" Dec 05 12:04:08 crc kubenswrapper[4763]: I1205 12:04:08.007488 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9bsq\" (UniqueName: \"kubernetes.io/projected/6af0da26-fcd3-4eb1-97a2-e5beedf81d5b-kube-api-access-k9bsq\") pod \"speaker-2k2k4\" (UID: \"6af0da26-fcd3-4eb1-97a2-e5beedf81d5b\") " pod="metallb-system/speaker-2k2k4" Dec 05 12:04:08 crc kubenswrapper[4763]: I1205 12:04:08.007533 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6af0da26-fcd3-4eb1-97a2-e5beedf81d5b-memberlist\") pod \"speaker-2k2k4\" (UID: \"6af0da26-fcd3-4eb1-97a2-e5beedf81d5b\") " pod="metallb-system/speaker-2k2k4" Dec 05 12:04:08 crc kubenswrapper[4763]: I1205 12:04:08.007565 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl6h9\" (UniqueName: \"kubernetes.io/projected/a34bf611-cb4c-44b4-bdf2-45a656edadc9-kube-api-access-zl6h9\") pod \"controller-f8648f98b-scmzk\" (UID: \"a34bf611-cb4c-44b4-bdf2-45a656edadc9\") " pod="metallb-system/controller-f8648f98b-scmzk" Dec 05 12:04:08 crc kubenswrapper[4763]: I1205 12:04:08.007627 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6af0da26-fcd3-4eb1-97a2-e5beedf81d5b-metallb-excludel2\") pod \"speaker-2k2k4\" (UID: \"6af0da26-fcd3-4eb1-97a2-e5beedf81d5b\") " pod="metallb-system/speaker-2k2k4" Dec 05 12:04:08 crc kubenswrapper[4763]: I1205 12:04:08.007657 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a34bf611-cb4c-44b4-bdf2-45a656edadc9-metrics-certs\") pod \"controller-f8648f98b-scmzk\" (UID: \"a34bf611-cb4c-44b4-bdf2-45a656edadc9\") " pod="metallb-system/controller-f8648f98b-scmzk" Dec 05 12:04:08 crc kubenswrapper[4763]: I1205 12:04:08.007683 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a34bf611-cb4c-44b4-bdf2-45a656edadc9-cert\") pod \"controller-f8648f98b-scmzk\" (UID: \"a34bf611-cb4c-44b4-bdf2-45a656edadc9\") " pod="metallb-system/controller-f8648f98b-scmzk" Dec 05 12:04:08 crc kubenswrapper[4763]: I1205 12:04:08.007724 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6af0da26-fcd3-4eb1-97a2-e5beedf81d5b-metrics-certs\") pod \"speaker-2k2k4\" (UID: \"6af0da26-fcd3-4eb1-97a2-e5beedf81d5b\") " pod="metallb-system/speaker-2k2k4" Dec 05 12:04:08 crc kubenswrapper[4763]: E1205 12:04:08.008552 4763 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Dec 05 12:04:08 crc kubenswrapper[4763]: E1205 12:04:08.008622 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a34bf611-cb4c-44b4-bdf2-45a656edadc9-metrics-certs podName:a34bf611-cb4c-44b4-bdf2-45a656edadc9 nodeName:}" failed. No retries permitted until 2025-12-05 12:04:08.508603429 +0000 UTC m=+933.001318152 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a34bf611-cb4c-44b4-bdf2-45a656edadc9-metrics-certs") pod "controller-f8648f98b-scmzk" (UID: "a34bf611-cb4c-44b4-bdf2-45a656edadc9") : secret "controller-certs-secret" not found Dec 05 12:04:08 crc kubenswrapper[4763]: E1205 12:04:08.008841 4763 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 05 12:04:08 crc kubenswrapper[4763]: E1205 12:04:08.008985 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6af0da26-fcd3-4eb1-97a2-e5beedf81d5b-memberlist podName:6af0da26-fcd3-4eb1-97a2-e5beedf81d5b nodeName:}" failed. No retries permitted until 2025-12-05 12:04:08.508948249 +0000 UTC m=+933.001662972 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6af0da26-fcd3-4eb1-97a2-e5beedf81d5b-memberlist") pod "speaker-2k2k4" (UID: "6af0da26-fcd3-4eb1-97a2-e5beedf81d5b") : secret "metallb-memberlist" not found Dec 05 12:04:08 crc kubenswrapper[4763]: I1205 12:04:08.009256 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6af0da26-fcd3-4eb1-97a2-e5beedf81d5b-metallb-excludel2\") pod \"speaker-2k2k4\" (UID: \"6af0da26-fcd3-4eb1-97a2-e5beedf81d5b\") " pod="metallb-system/speaker-2k2k4" Dec 05 12:04:08 crc kubenswrapper[4763]: I1205 12:04:08.010716 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 05 12:04:08 crc kubenswrapper[4763]: I1205 12:04:08.011947 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6af0da26-fcd3-4eb1-97a2-e5beedf81d5b-metrics-certs\") pod \"speaker-2k2k4\" (UID: \"6af0da26-fcd3-4eb1-97a2-e5beedf81d5b\") " pod="metallb-system/speaker-2k2k4" Dec 05 12:04:08 crc kubenswrapper[4763]: I1205 12:04:08.025944 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a34bf611-cb4c-44b4-bdf2-45a656edadc9-cert\") pod \"controller-f8648f98b-scmzk\" (UID: \"a34bf611-cb4c-44b4-bdf2-45a656edadc9\") " pod="metallb-system/controller-f8648f98b-scmzk" Dec 05 12:04:08 crc kubenswrapper[4763]: I1205 12:04:08.030543 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl6h9\" (UniqueName: \"kubernetes.io/projected/a34bf611-cb4c-44b4-bdf2-45a656edadc9-kube-api-access-zl6h9\") pod \"controller-f8648f98b-scmzk\" (UID: \"a34bf611-cb4c-44b4-bdf2-45a656edadc9\") " pod="metallb-system/controller-f8648f98b-scmzk" Dec 05 12:04:08 crc kubenswrapper[4763]: I1205 12:04:08.031503 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9bsq\" (UniqueName: \"kubernetes.io/projected/6af0da26-fcd3-4eb1-97a2-e5beedf81d5b-kube-api-access-k9bsq\") pod \"speaker-2k2k4\" (UID: \"6af0da26-fcd3-4eb1-97a2-e5beedf81d5b\") " pod="metallb-system/speaker-2k2k4" Dec 05 12:04:08 crc kubenswrapper[4763]: I1205 12:04:08.419795 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/537c61d5-e548-4c96-b7ed-24fcc061e9ac-metrics-certs\") pod \"frr-k8s-ms42j\" (UID: \"537c61d5-e548-4c96-b7ed-24fcc061e9ac\") " pod="metallb-system/frr-k8s-ms42j" Dec 05 12:04:08 crc kubenswrapper[4763]: I1205 12:04:08.420103 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfa0736f-2856-4cfd-810f-d8fcd2bea7f6-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-qv5hl\" (UID: \"cfa0736f-2856-4cfd-810f-d8fcd2bea7f6\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qv5hl" Dec 05 12:04:08 crc kubenswrapper[4763]: I1205 12:04:08.424646 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/537c61d5-e548-4c96-b7ed-24fcc061e9ac-metrics-certs\") pod \"frr-k8s-ms42j\" (UID: \"537c61d5-e548-4c96-b7ed-24fcc061e9ac\") " pod="metallb-system/frr-k8s-ms42j" Dec 05 12:04:08 crc kubenswrapper[4763]: I1205 12:04:08.425205 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfa0736f-2856-4cfd-810f-d8fcd2bea7f6-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-qv5hl\" (UID: \"cfa0736f-2856-4cfd-810f-d8fcd2bea7f6\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qv5hl" Dec 05 12:04:08 crc kubenswrapper[4763]: I1205 12:04:08.521616 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a34bf611-cb4c-44b4-bdf2-45a656edadc9-metrics-certs\") pod \"controller-f8648f98b-scmzk\" (UID: \"a34bf611-cb4c-44b4-bdf2-45a656edadc9\") " pod="metallb-system/controller-f8648f98b-scmzk" Dec 05 12:04:08 crc kubenswrapper[4763]: I1205 12:04:08.521698 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6af0da26-fcd3-4eb1-97a2-e5beedf81d5b-memberlist\") pod \"speaker-2k2k4\" (UID: \"6af0da26-fcd3-4eb1-97a2-e5beedf81d5b\") " pod="metallb-system/speaker-2k2k4" Dec 05 12:04:08 crc kubenswrapper[4763]: E1205 12:04:08.522022 4763 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 05 12:04:08 crc kubenswrapper[4763]: E1205 12:04:08.522064 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6af0da26-fcd3-4eb1-97a2-e5beedf81d5b-memberlist podName:6af0da26-fcd3-4eb1-97a2-e5beedf81d5b nodeName:}" failed. No retries permitted until 2025-12-05 12:04:09.522050015 +0000 UTC m=+934.014764738 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6af0da26-fcd3-4eb1-97a2-e5beedf81d5b-memberlist") pod "speaker-2k2k4" (UID: "6af0da26-fcd3-4eb1-97a2-e5beedf81d5b") : secret "metallb-memberlist" not found Dec 05 12:04:08 crc kubenswrapper[4763]: I1205 12:04:08.525592 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a34bf611-cb4c-44b4-bdf2-45a656edadc9-metrics-certs\") pod \"controller-f8648f98b-scmzk\" (UID: \"a34bf611-cb4c-44b4-bdf2-45a656edadc9\") " pod="metallb-system/controller-f8648f98b-scmzk" Dec 05 12:04:08 crc kubenswrapper[4763]: I1205 12:04:08.669634 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-ms42j" Dec 05 12:04:08 crc kubenswrapper[4763]: I1205 12:04:08.680054 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qv5hl" Dec 05 12:04:08 crc kubenswrapper[4763]: I1205 12:04:08.766060 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-scmzk" Dec 05 12:04:09 crc kubenswrapper[4763]: I1205 12:04:09.019484 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-scmzk"] Dec 05 12:04:09 crc kubenswrapper[4763]: I1205 12:04:09.161448 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-qv5hl"] Dec 05 12:04:09 crc kubenswrapper[4763]: I1205 12:04:09.538625 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6af0da26-fcd3-4eb1-97a2-e5beedf81d5b-memberlist\") pod \"speaker-2k2k4\" (UID: \"6af0da26-fcd3-4eb1-97a2-e5beedf81d5b\") " pod="metallb-system/speaker-2k2k4" Dec 05 12:04:09 crc kubenswrapper[4763]: E1205 12:04:09.538808 4763 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 05 12:04:09 crc kubenswrapper[4763]: E1205 12:04:09.538877 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6af0da26-fcd3-4eb1-97a2-e5beedf81d5b-memberlist podName:6af0da26-fcd3-4eb1-97a2-e5beedf81d5b nodeName:}" failed. No retries permitted until 2025-12-05 12:04:11.538859813 +0000 UTC m=+936.031574536 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6af0da26-fcd3-4eb1-97a2-e5beedf81d5b-memberlist") pod "speaker-2k2k4" (UID: "6af0da26-fcd3-4eb1-97a2-e5beedf81d5b") : secret "metallb-memberlist" not found Dec 05 12:04:09 crc kubenswrapper[4763]: I1205 12:04:09.894370 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-scmzk" event={"ID":"a34bf611-cb4c-44b4-bdf2-45a656edadc9","Type":"ContainerStarted","Data":"3b1b51b5bb2b441c7d74ba85a258c626582b8d515fe5fe694a056c828cc322d1"} Dec 05 12:04:09 crc kubenswrapper[4763]: I1205 12:04:09.894440 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-scmzk" Dec 05 12:04:09 crc kubenswrapper[4763]: I1205 12:04:09.894452 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-scmzk" event={"ID":"a34bf611-cb4c-44b4-bdf2-45a656edadc9","Type":"ContainerStarted","Data":"773d893b5a89be9710c601ce511aef2a2a1d6ccce32195da47670e329d8c5888"} Dec 05 12:04:09 crc kubenswrapper[4763]: I1205 12:04:09.894462 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-scmzk" event={"ID":"a34bf611-cb4c-44b4-bdf2-45a656edadc9","Type":"ContainerStarted","Data":"7682b36da3c54b8ab8457d50d019f759b67a00f7ad98605d2696d72c9944fb9c"} Dec 05 12:04:09 crc kubenswrapper[4763]: I1205 12:04:09.896745 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qv5hl" event={"ID":"cfa0736f-2856-4cfd-810f-d8fcd2bea7f6","Type":"ContainerStarted","Data":"ba4309e1cf5de5c9bcee13ac6ad0823a7f28385497283079c928c436bc19f2fc"} Dec 05 12:04:09 crc kubenswrapper[4763]: I1205 12:04:09.898238 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ms42j" event={"ID":"537c61d5-e548-4c96-b7ed-24fcc061e9ac","Type":"ContainerStarted","Data":"9a865fe96c5637b892e9000b6ac1439d40d096215c482d810b5e3381cc6ad394"}
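
The failed "memberlist" mounts above show the kubelet's per-operation retry backoff doubling: durationBeforeRetry goes 500ms, then 1s, then 2s, and each attempt fails only because the Secret metallb-system/metallb-memberlist does not exist yet; the mount succeeds at 12:04:11 below, once the secret has evidently been created. A minimal client-go sketch of the same check-and-double loop (an illustration, not kubelet source; the kubeconfig path and the backoff cap are assumptions):

    package main

    import (
        "context"
        "fmt"
        "time"

        apierrors "k8s.io/apimachinery/pkg/api/errors"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        // Assumed kubeconfig location; the kubelet itself does not go through
        // a kubeconfig file like this.
        cfg, err := clientcmd.BuildConfigFromFlags("", "/root/.kube/config")
        if err != nil {
            panic(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }
        delay := 500 * time.Millisecond           // first durationBeforeRetry in the log
        const maxDelay = 2*time.Minute + 2*time.Second // assumed cap, not shown above
        for {
            _, err := cs.CoreV1().Secrets("metallb-system").Get(
                context.TODO(), "metallb-memberlist", metav1.GetOptions{})
            if err == nil {
                fmt.Println("secret exists; MountVolume.SetUp can succeed")
                return
            }
            if !apierrors.IsNotFound(err) {
                panic(err) // only the not-found case is retried this way
            }
            fmt.Printf("secret not found; retrying in %v\n", delay)
            time.Sleep(delay)
            if delay *= 2; delay > maxDelay { // 500ms, 1s, 2s, ...
                delay = maxDelay
            }
        }
    }
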
Dec 05 12:04:09 crc kubenswrapper[4763]: I1205 12:04:09.918155 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-scmzk" podStartSLOduration=2.918139412 podStartE2EDuration="2.918139412s" podCreationTimestamp="2025-12-05 12:04:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:04:09.914166944 +0000 UTC m=+934.406881677" watchObservedRunningTime="2025-12-05 12:04:09.918139412 +0000 UTC m=+934.410854135" Dec 05 12:04:11 crc kubenswrapper[4763]: I1205 12:04:11.569418 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6af0da26-fcd3-4eb1-97a2-e5beedf81d5b-memberlist\") pod \"speaker-2k2k4\" (UID: \"6af0da26-fcd3-4eb1-97a2-e5beedf81d5b\") " pod="metallb-system/speaker-2k2k4" Dec 05 12:04:11 crc kubenswrapper[4763]: I1205 12:04:11.581115 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6af0da26-fcd3-4eb1-97a2-e5beedf81d5b-memberlist\") pod \"speaker-2k2k4\" (UID: \"6af0da26-fcd3-4eb1-97a2-e5beedf81d5b\") " pod="metallb-system/speaker-2k2k4" Dec 05 12:04:11 crc kubenswrapper[4763]: I1205 12:04:11.753458 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-2k2k4" Dec 05 12:04:11 crc kubenswrapper[4763]: I1205 12:04:11.916196 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-2k2k4" event={"ID":"6af0da26-fcd3-4eb1-97a2-e5beedf81d5b","Type":"ContainerStarted","Data":"c09d963ff39dcf8dbb2440f256953e6bc6720e7d9ce36ecb52c32445a93a8729"} Dec 05 12:04:12 crc kubenswrapper[4763]: I1205 12:04:12.934522 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-2k2k4" event={"ID":"6af0da26-fcd3-4eb1-97a2-e5beedf81d5b","Type":"ContainerStarted","Data":"ea34be81b17b7e13ced96292c9403a15f77ce49cbba7943f1355eb9610eb6e68"} Dec 05 12:04:12 crc kubenswrapper[4763]: I1205 12:04:12.934589 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-2k2k4" event={"ID":"6af0da26-fcd3-4eb1-97a2-e5beedf81d5b","Type":"ContainerStarted","Data":"8d46c7a79fe5ffd9c35cd1c41f03a4022cb31b2f4da074a15b6e52ff5804a336"} Dec 05 12:04:12 crc kubenswrapper[4763]: I1205 12:04:12.934945 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-2k2k4" Dec 05 12:04:12 crc kubenswrapper[4763]: I1205 12:04:12.955721 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-2k2k4" podStartSLOduration=5.955702985 podStartE2EDuration="5.955702985s" podCreationTimestamp="2025-12-05 12:04:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:04:12.952166579 +0000 UTC m=+937.444881302" watchObservedRunningTime="2025-12-05 12:04:12.955702985 +0000 UTC m=+937.448417708" Dec 05 12:04:16 crc kubenswrapper[4763]: I1205 12:04:16.974419 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qv5hl" event={"ID":"cfa0736f-2856-4cfd-810f-d8fcd2bea7f6","Type":"ContainerStarted","Data":"7cfc2f9886cb5dd4645e7409311126956757f0925331703ec5fca70742567698"} Dec 05 12:04:16 crc kubenswrapper[4763]: I1205 12:04:16.975069 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qv5hl" Dec 05 12:04:16 crc kubenswrapper[4763]: I1205 12:04:16.977876 4763 generic.go:334] "Generic (PLEG): container finished" podID="537c61d5-e548-4c96-b7ed-24fcc061e9ac"
containerID="5f7a47033e5110d5d7cdba43bfc422c2fe35c24f45231f627f9a8f1cccf9549d" exitCode=0 Dec 05 12:04:16 crc kubenswrapper[4763]: I1205 12:04:16.977943 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ms42j" event={"ID":"537c61d5-e548-4c96-b7ed-24fcc061e9ac","Type":"ContainerDied","Data":"5f7a47033e5110d5d7cdba43bfc422c2fe35c24f45231f627f9a8f1cccf9549d"} Dec 05 12:04:16 crc kubenswrapper[4763]: I1205 12:04:16.995281 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qv5hl" podStartSLOduration=2.433300255 podStartE2EDuration="9.9952593s" podCreationTimestamp="2025-12-05 12:04:07 +0000 UTC" firstStartedPulling="2025-12-05 12:04:09.183097235 +0000 UTC m=+933.675811958" lastFinishedPulling="2025-12-05 12:04:16.74505628 +0000 UTC m=+941.237771003" observedRunningTime="2025-12-05 12:04:16.990503751 +0000 UTC m=+941.483218474" watchObservedRunningTime="2025-12-05 12:04:16.9952593 +0000 UTC m=+941.487974033" Dec 05 12:04:17 crc kubenswrapper[4763]: I1205 12:04:17.988953 4763 generic.go:334] "Generic (PLEG): container finished" podID="537c61d5-e548-4c96-b7ed-24fcc061e9ac" containerID="e5f4185beff1346a5af13ae7baaccd2119add627688de15c03c70249309c8ec5" exitCode=0 Dec 05 12:04:17 crc kubenswrapper[4763]: I1205 12:04:17.989076 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ms42j" event={"ID":"537c61d5-e548-4c96-b7ed-24fcc061e9ac","Type":"ContainerDied","Data":"e5f4185beff1346a5af13ae7baaccd2119add627688de15c03c70249309c8ec5"} Dec 05 12:04:19 crc kubenswrapper[4763]: I1205 12:04:18.999856 4763 generic.go:334] "Generic (PLEG): container finished" podID="537c61d5-e548-4c96-b7ed-24fcc061e9ac" containerID="3e0caa01ab3c8e22dc02502321bc6e1a7d81a2b19d1697d47d04b1e0ebe7d31f" exitCode=0 Dec 05 12:04:19 crc kubenswrapper[4763]: I1205 12:04:18.999998 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ms42j" event={"ID":"537c61d5-e548-4c96-b7ed-24fcc061e9ac","Type":"ContainerDied","Data":"3e0caa01ab3c8e22dc02502321bc6e1a7d81a2b19d1697d47d04b1e0ebe7d31f"} Dec 05 12:04:20 crc kubenswrapper[4763]: I1205 12:04:20.018559 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ms42j" event={"ID":"537c61d5-e548-4c96-b7ed-24fcc061e9ac","Type":"ContainerStarted","Data":"7e95eef5433fa29bca1ccf86e9af3483fadc2b17aaea40890c4f19c7831b9877"} Dec 05 12:04:20 crc kubenswrapper[4763]: I1205 12:04:20.019081 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ms42j" event={"ID":"537c61d5-e548-4c96-b7ed-24fcc061e9ac","Type":"ContainerStarted","Data":"1ec0db56d82371faf61bb1ded305b2ac8c25b75dc920b307a7cfad1d6105b5e7"} Dec 05 12:04:20 crc kubenswrapper[4763]: I1205 12:04:20.019092 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ms42j" event={"ID":"537c61d5-e548-4c96-b7ed-24fcc061e9ac","Type":"ContainerStarted","Data":"51433e929e0420fd25ef780b48cca1ad2bb703b42a42c387c012061071ce7081"} Dec 05 12:04:20 crc kubenswrapper[4763]: I1205 12:04:20.019101 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ms42j" event={"ID":"537c61d5-e548-4c96-b7ed-24fcc061e9ac","Type":"ContainerStarted","Data":"3b333eb875e3efe3d38845c6233c7b1d64fca6f52f0096c34fea13728b2a0847"} Dec 05 12:04:20 crc kubenswrapper[4763]: I1205 12:04:20.019111 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ms42j" 
event={"ID":"537c61d5-e548-4c96-b7ed-24fcc061e9ac","Type":"ContainerStarted","Data":"68d40ee24237f2acea4184393abd678ddbf1fa00f40f03aeafd91a1d85aba984"} Dec 05 12:04:21 crc kubenswrapper[4763]: I1205 12:04:21.036464 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ms42j" event={"ID":"537c61d5-e548-4c96-b7ed-24fcc061e9ac","Type":"ContainerStarted","Data":"7d1f51f1d09bb6fc870c99521a2490a2b6258649b6ae754ed9d29e9d3e1d9394"} Dec 05 12:04:21 crc kubenswrapper[4763]: I1205 12:04:21.038859 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-ms42j" Dec 05 12:04:21 crc kubenswrapper[4763]: I1205 12:04:21.081124 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-ms42j" podStartSLOduration=6.6647386619999995 podStartE2EDuration="14.081106399s" podCreationTimestamp="2025-12-05 12:04:07 +0000 UTC" firstStartedPulling="2025-12-05 12:04:09.364369826 +0000 UTC m=+933.857084569" lastFinishedPulling="2025-12-05 12:04:16.780737583 +0000 UTC m=+941.273452306" observedRunningTime="2025-12-05 12:04:21.078844498 +0000 UTC m=+945.571559231" watchObservedRunningTime="2025-12-05 12:04:21.081106399 +0000 UTC m=+945.573821122" Dec 05 12:04:23 crc kubenswrapper[4763]: I1205 12:04:23.670569 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-ms42j" Dec 05 12:04:23 crc kubenswrapper[4763]: I1205 12:04:23.710280 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-ms42j" Dec 05 12:04:28 crc kubenswrapper[4763]: I1205 12:04:28.684263 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qv5hl" Dec 05 12:04:28 crc kubenswrapper[4763]: I1205 12:04:28.770146 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-scmzk" Dec 05 12:04:31 crc kubenswrapper[4763]: I1205 12:04:31.759369 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-2k2k4" Dec 05 12:04:37 crc kubenswrapper[4763]: I1205 12:04:37.543888 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 12:04:37 crc kubenswrapper[4763]: I1205 12:04:37.544510 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 12:04:38 crc kubenswrapper[4763]: I1205 12:04:38.373150 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-mgzr5"] Dec 05 12:04:38 crc kubenswrapper[4763]: I1205 12:04:38.374548 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-mgzr5" Dec 05 12:04:38 crc kubenswrapper[4763]: I1205 12:04:38.380659 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-27kq4" Dec 05 12:04:38 crc kubenswrapper[4763]: I1205 12:04:38.380988 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 05 12:04:38 crc kubenswrapper[4763]: I1205 12:04:38.383156 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 05 12:04:38 crc kubenswrapper[4763]: I1205 12:04:38.388194 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mgzr5"] Dec 05 12:04:38 crc kubenswrapper[4763]: I1205 12:04:38.439290 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7csk9\" (UniqueName: \"kubernetes.io/projected/b7b8a187-cdfd-4966-92ba-9d535e9f365e-kube-api-access-7csk9\") pod \"openstack-operator-index-mgzr5\" (UID: \"b7b8a187-cdfd-4966-92ba-9d535e9f365e\") " pod="openstack-operators/openstack-operator-index-mgzr5" Dec 05 12:04:38 crc kubenswrapper[4763]: I1205 12:04:38.540985 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7csk9\" (UniqueName: \"kubernetes.io/projected/b7b8a187-cdfd-4966-92ba-9d535e9f365e-kube-api-access-7csk9\") pod \"openstack-operator-index-mgzr5\" (UID: \"b7b8a187-cdfd-4966-92ba-9d535e9f365e\") " pod="openstack-operators/openstack-operator-index-mgzr5" Dec 05 12:04:38 crc kubenswrapper[4763]: I1205 12:04:38.563173 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7csk9\" (UniqueName: \"kubernetes.io/projected/b7b8a187-cdfd-4966-92ba-9d535e9f365e-kube-api-access-7csk9\") pod \"openstack-operator-index-mgzr5\" (UID: \"b7b8a187-cdfd-4966-92ba-9d535e9f365e\") " pod="openstack-operators/openstack-operator-index-mgzr5" Dec 05 12:04:38 crc kubenswrapper[4763]: I1205 12:04:38.673999 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-ms42j" Dec 05 12:04:38 crc kubenswrapper[4763]: I1205 12:04:38.703097 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-mgzr5" Dec 05 12:04:39 crc kubenswrapper[4763]: I1205 12:04:39.109386 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mgzr5"] Dec 05 12:04:39 crc kubenswrapper[4763]: I1205 12:04:39.160607 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mgzr5" event={"ID":"b7b8a187-cdfd-4966-92ba-9d535e9f365e","Type":"ContainerStarted","Data":"c1c0d74f4c1fd0234ce0d7a288e0fd224d4f6484b5a063eb342404a75a43ab73"} Dec 05 12:04:43 crc kubenswrapper[4763]: I1205 12:04:43.191503 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mgzr5" event={"ID":"b7b8a187-cdfd-4966-92ba-9d535e9f365e","Type":"ContainerStarted","Data":"5fed6851504beb5620445c6b5f69e085692ab38ad7b669e5f3ff1ed6ac40c78f"} Dec 05 12:04:43 crc kubenswrapper[4763]: I1205 12:04:43.218080 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-mgzr5" podStartSLOduration=1.564672545 podStartE2EDuration="5.218052285s" podCreationTimestamp="2025-12-05 12:04:38 +0000 UTC" firstStartedPulling="2025-12-05 12:04:39.119654734 +0000 UTC m=+963.612369477" lastFinishedPulling="2025-12-05 12:04:42.773034494 +0000 UTC m=+967.265749217" observedRunningTime="2025-12-05 12:04:43.214424556 +0000 UTC m=+967.707139289" watchObservedRunningTime="2025-12-05 12:04:43.218052285 +0000 UTC m=+967.710767008" Dec 05 12:04:43 crc kubenswrapper[4763]: I1205 12:04:43.563239 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-mgzr5"] Dec 05 12:04:44 crc kubenswrapper[4763]: I1205 12:04:44.170210 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-jp2ck"] Dec 05 12:04:44 crc kubenswrapper[4763]: I1205 12:04:44.170934 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-jp2ck" Dec 05 12:04:44 crc kubenswrapper[4763]: I1205 12:04:44.180234 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jp2ck"] Dec 05 12:04:44 crc kubenswrapper[4763]: I1205 12:04:44.236374 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8hmn\" (UniqueName: \"kubernetes.io/projected/522dda98-66e1-4ced-b504-e957eb00cda2-kube-api-access-c8hmn\") pod \"openstack-operator-index-jp2ck\" (UID: \"522dda98-66e1-4ced-b504-e957eb00cda2\") " pod="openstack-operators/openstack-operator-index-jp2ck" Dec 05 12:04:44 crc kubenswrapper[4763]: I1205 12:04:44.337563 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8hmn\" (UniqueName: \"kubernetes.io/projected/522dda98-66e1-4ced-b504-e957eb00cda2-kube-api-access-c8hmn\") pod \"openstack-operator-index-jp2ck\" (UID: \"522dda98-66e1-4ced-b504-e957eb00cda2\") " pod="openstack-operators/openstack-operator-index-jp2ck" Dec 05 12:04:44 crc kubenswrapper[4763]: I1205 12:04:44.357246 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8hmn\" (UniqueName: \"kubernetes.io/projected/522dda98-66e1-4ced-b504-e957eb00cda2-kube-api-access-c8hmn\") pod \"openstack-operator-index-jp2ck\" (UID: \"522dda98-66e1-4ced-b504-e957eb00cda2\") " pod="openstack-operators/openstack-operator-index-jp2ck" Dec 05 12:04:44 crc kubenswrapper[4763]: I1205 12:04:44.489842 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-jp2ck" Dec 05 12:04:44 crc kubenswrapper[4763]: I1205 12:04:44.903196 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jp2ck"] Dec 05 12:04:45 crc kubenswrapper[4763]: I1205 12:04:45.207947 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jp2ck" event={"ID":"522dda98-66e1-4ced-b504-e957eb00cda2","Type":"ContainerStarted","Data":"e24d9b3cb906f3d1b8e7117e02d8e8f3ea4e0d89e96ea48e27c268fbdc288d5a"} Dec 05 12:04:45 crc kubenswrapper[4763]: I1205 12:04:45.208038 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jp2ck" event={"ID":"522dda98-66e1-4ced-b504-e957eb00cda2","Type":"ContainerStarted","Data":"851634de57f959d791ed35d9baec00ab29a1aa600b286c6f97633e4ed4ff2567"} Dec 05 12:04:45 crc kubenswrapper[4763]: I1205 12:04:45.208128 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-mgzr5" podUID="b7b8a187-cdfd-4966-92ba-9d535e9f365e" containerName="registry-server" containerID="cri-o://5fed6851504beb5620445c6b5f69e085692ab38ad7b669e5f3ff1ed6ac40c78f" gracePeriod=2 Dec 05 12:04:45 crc kubenswrapper[4763]: I1205 12:04:45.305722 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-jp2ck" podStartSLOduration=1.246913142 podStartE2EDuration="1.305696074s" podCreationTimestamp="2025-12-05 12:04:44 +0000 UTC" firstStartedPulling="2025-12-05 12:04:44.924712638 +0000 UTC m=+969.417427361" lastFinishedPulling="2025-12-05 12:04:44.98349557 +0000 UTC m=+969.476210293" observedRunningTime="2025-12-05 12:04:45.301859229 +0000 UTC m=+969.794573962" watchObservedRunningTime="2025-12-05 12:04:45.305696074 +0000 UTC m=+969.798410797" 
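
The startup-latency entries in this log encode a simple relation: podStartSLOduration is podStartE2EDuration minus the image-pull window (firstStartedPulling to lastFinishedPulling), which is why pods that pulled nothing (the zero "0001-01-01" timestamps earlier) report an SLO duration equal to the E2E duration. A small sketch checking the arithmetic with the values copied from the openstack-operator-index-jp2ck entry just above:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
        first, _ := time.Parse(layout, "2025-12-05 12:04:44.924712638 +0000 UTC")
        last, _ := time.Parse(layout, "2025-12-05 12:04:44.98349557 +0000 UTC")
        e2e := 1305696074 * time.Nanosecond // podStartE2EDuration = 1.305696074s
        slo := e2e - last.Sub(first)        // subtract the image-pull window
        fmt.Println(slo)                    // 1.246913142s = podStartSLOduration
    }
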
Dec 05 12:04:45 crc kubenswrapper[4763]: I1205 12:04:45.682945 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-mgzr5" Dec 05 12:04:45 crc kubenswrapper[4763]: I1205 12:04:45.759310 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7csk9\" (UniqueName: \"kubernetes.io/projected/b7b8a187-cdfd-4966-92ba-9d535e9f365e-kube-api-access-7csk9\") pod \"b7b8a187-cdfd-4966-92ba-9d535e9f365e\" (UID: \"b7b8a187-cdfd-4966-92ba-9d535e9f365e\") " Dec 05 12:04:45 crc kubenswrapper[4763]: I1205 12:04:45.765840 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7b8a187-cdfd-4966-92ba-9d535e9f365e-kube-api-access-7csk9" (OuterVolumeSpecName: "kube-api-access-7csk9") pod "b7b8a187-cdfd-4966-92ba-9d535e9f365e" (UID: "b7b8a187-cdfd-4966-92ba-9d535e9f365e"). InnerVolumeSpecName "kube-api-access-7csk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:04:45 crc kubenswrapper[4763]: I1205 12:04:45.861613 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7csk9\" (UniqueName: \"kubernetes.io/projected/b7b8a187-cdfd-4966-92ba-9d535e9f365e-kube-api-access-7csk9\") on node \"crc\" DevicePath \"\"" Dec 05 12:04:46 crc kubenswrapper[4763]: I1205 12:04:46.216302 4763 generic.go:334] "Generic (PLEG): container finished" podID="b7b8a187-cdfd-4966-92ba-9d535e9f365e" containerID="5fed6851504beb5620445c6b5f69e085692ab38ad7b669e5f3ff1ed6ac40c78f" exitCode=0 Dec 05 12:04:46 crc kubenswrapper[4763]: I1205 12:04:46.216367 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-mgzr5" Dec 05 12:04:46 crc kubenswrapper[4763]: I1205 12:04:46.216427 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mgzr5" event={"ID":"b7b8a187-cdfd-4966-92ba-9d535e9f365e","Type":"ContainerDied","Data":"5fed6851504beb5620445c6b5f69e085692ab38ad7b669e5f3ff1ed6ac40c78f"} Dec 05 12:04:46 crc kubenswrapper[4763]: I1205 12:04:46.216486 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mgzr5" event={"ID":"b7b8a187-cdfd-4966-92ba-9d535e9f365e","Type":"ContainerDied","Data":"c1c0d74f4c1fd0234ce0d7a288e0fd224d4f6484b5a063eb342404a75a43ab73"} Dec 05 12:04:46 crc kubenswrapper[4763]: I1205 12:04:46.216505 4763 scope.go:117] "RemoveContainer" containerID="5fed6851504beb5620445c6b5f69e085692ab38ad7b669e5f3ff1ed6ac40c78f" Dec 05 12:04:46 crc kubenswrapper[4763]: I1205 12:04:46.239923 4763 scope.go:117] "RemoveContainer" containerID="5fed6851504beb5620445c6b5f69e085692ab38ad7b669e5f3ff1ed6ac40c78f" Dec 05 12:04:46 crc kubenswrapper[4763]: E1205 12:04:46.240795 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fed6851504beb5620445c6b5f69e085692ab38ad7b669e5f3ff1ed6ac40c78f\": container with ID starting with 5fed6851504beb5620445c6b5f69e085692ab38ad7b669e5f3ff1ed6ac40c78f not found: ID does not exist" containerID="5fed6851504beb5620445c6b5f69e085692ab38ad7b669e5f3ff1ed6ac40c78f" Dec 05 12:04:46 crc kubenswrapper[4763]: I1205 12:04:46.240838 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fed6851504beb5620445c6b5f69e085692ab38ad7b669e5f3ff1ed6ac40c78f"} err="failed to get container status \"5fed6851504beb5620445c6b5f69e085692ab38ad7b669e5f3ff1ed6ac40c78f\": rpc error: code = NotFound desc = could not find container \"5fed6851504beb5620445c6b5f69e085692ab38ad7b669e5f3ff1ed6ac40c78f\": container with ID starting with 5fed6851504beb5620445c6b5f69e085692ab38ad7b669e5f3ff1ed6ac40c78f not found: ID does not exist"
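
The NotFound errors above are a benign race, not a real failure: the second RemoveContainer and its status lookup ran after cri-o had already deleted the container, so the kubelet logs the error and moves on. A sketch of the usual defensive pattern (a hypothetical wrapper, not kubelet source) that treats a gRPC NotFound on delete as success, keeping the operation idempotent:

    package main

    import (
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // removeContainer wraps a CRI-style RemoveContainer call (signature assumed
    // for illustration) and maps NotFound to success: already gone is fine.
    func removeContainer(remove func(id string) error, id string) error {
        err := remove(id)
        if c := status.Code(err); c == codes.OK || c == codes.NotFound {
            return nil
        }
        return err
    }

    func main() {
        // Fake runtime that reports the container as already gone, like cri-o above.
        gone := func(id string) error {
            return status.Error(codes.NotFound, "could not find container "+id)
        }
        fmt.Println(removeContainer(gone, "example-container-id")) // <nil>
    }
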
\"5fed6851504beb5620445c6b5f69e085692ab38ad7b669e5f3ff1ed6ac40c78f\": rpc error: code = NotFound desc = could not find container \"5fed6851504beb5620445c6b5f69e085692ab38ad7b669e5f3ff1ed6ac40c78f\": container with ID starting with 5fed6851504beb5620445c6b5f69e085692ab38ad7b669e5f3ff1ed6ac40c78f not found: ID does not exist" Dec 05 12:04:46 crc kubenswrapper[4763]: I1205 12:04:46.240998 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-mgzr5"] Dec 05 12:04:46 crc kubenswrapper[4763]: I1205 12:04:46.245935 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-mgzr5"] Dec 05 12:04:47 crc kubenswrapper[4763]: I1205 12:04:47.790503 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7b8a187-cdfd-4966-92ba-9d535e9f365e" path="/var/lib/kubelet/pods/b7b8a187-cdfd-4966-92ba-9d535e9f365e/volumes" Dec 05 12:04:54 crc kubenswrapper[4763]: I1205 12:04:54.490672 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-jp2ck" Dec 05 12:04:54 crc kubenswrapper[4763]: I1205 12:04:54.491274 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-jp2ck" Dec 05 12:04:54 crc kubenswrapper[4763]: I1205 12:04:54.512987 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-jp2ck" Dec 05 12:04:55 crc kubenswrapper[4763]: I1205 12:04:55.313266 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-jp2ck" Dec 05 12:05:07 crc kubenswrapper[4763]: I1205 12:05:07.543693 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 12:05:07 crc kubenswrapper[4763]: I1205 12:05:07.553389 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 12:05:07 crc kubenswrapper[4763]: I1205 12:05:07.553461 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" Dec 05 12:05:07 crc kubenswrapper[4763]: I1205 12:05:07.554639 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d9f5daa13f390f2b68ce52c3ddbc0360f2ce72002e23d581fe40bd421b3cff77"} pod="openshift-machine-config-operator/machine-config-daemon-xpgln" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 12:05:07 crc kubenswrapper[4763]: I1205 12:05:07.554708 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" containerID="cri-o://d9f5daa13f390f2b68ce52c3ddbc0360f2ce72002e23d581fe40bd421b3cff77" gracePeriod=600 Dec 05 12:05:08 crc kubenswrapper[4763]: E1205 12:05:08.167548 4763 
Dec 05 12:05:08 crc kubenswrapper[4763]: E1205 12:05:08.167548 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96338136_6831_49d0_9eb9_77d1205c6afb.slice/crio-conmon-d9f5daa13f390f2b68ce52c3ddbc0360f2ce72002e23d581fe40bd421b3cff77.scope\": RecentStats: unable to find data in memory cache]" Dec 05 12:05:08 crc kubenswrapper[4763]: I1205 12:05:08.367581 4763 generic.go:334] "Generic (PLEG): container finished" podID="96338136-6831-49d0-9eb9-77d1205c6afb" containerID="d9f5daa13f390f2b68ce52c3ddbc0360f2ce72002e23d581fe40bd421b3cff77" exitCode=0 Dec 05 12:05:08 crc kubenswrapper[4763]: I1205 12:05:08.367637 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerDied","Data":"d9f5daa13f390f2b68ce52c3ddbc0360f2ce72002e23d581fe40bd421b3cff77"} Dec 05 12:05:08 crc kubenswrapper[4763]: I1205 12:05:08.367675 4763 scope.go:117] "RemoveContainer" containerID="0c4feb1d4ef447a6746967b0703364de0643a9adc93a83a07473a63b712b66d6" Dec 05 12:05:08 crc kubenswrapper[4763]: I1205 12:05:08.427562 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx"] Dec 05 12:05:08 crc kubenswrapper[4763]: E1205 12:05:08.428017 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7b8a187-cdfd-4966-92ba-9d535e9f365e" containerName="registry-server" Dec 05 12:05:08 crc kubenswrapper[4763]: I1205 12:05:08.428114 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7b8a187-cdfd-4966-92ba-9d535e9f365e" containerName="registry-server" Dec 05 12:05:08 crc kubenswrapper[4763]: I1205 12:05:08.428269 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7b8a187-cdfd-4966-92ba-9d535e9f365e" containerName="registry-server" Dec 05 12:05:08 crc kubenswrapper[4763]: I1205 12:05:08.429136 4763 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx" Dec 05 12:05:08 crc kubenswrapper[4763]: I1205 12:05:08.436998 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx"] Dec 05 12:05:08 crc kubenswrapper[4763]: I1205 12:05:08.439661 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-4xbqg" Dec 05 12:05:08 crc kubenswrapper[4763]: I1205 12:05:08.451671 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/262113c6-3029-4c0d-8279-e1454a535c24-bundle\") pod \"10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx\" (UID: \"262113c6-3029-4c0d-8279-e1454a535c24\") " pod="openstack-operators/10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx" Dec 05 12:05:08 crc kubenswrapper[4763]: I1205 12:05:08.451919 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/262113c6-3029-4c0d-8279-e1454a535c24-util\") pod \"10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx\" (UID: \"262113c6-3029-4c0d-8279-e1454a535c24\") " pod="openstack-operators/10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx" Dec 05 12:05:08 crc kubenswrapper[4763]: I1205 12:05:08.452075 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc2fh\" (UniqueName: \"kubernetes.io/projected/262113c6-3029-4c0d-8279-e1454a535c24-kube-api-access-xc2fh\") pod \"10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx\" (UID: \"262113c6-3029-4c0d-8279-e1454a535c24\") " pod="openstack-operators/10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx" Dec 05 12:05:08 crc kubenswrapper[4763]: I1205 12:05:08.552880 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc2fh\" (UniqueName: \"kubernetes.io/projected/262113c6-3029-4c0d-8279-e1454a535c24-kube-api-access-xc2fh\") pod \"10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx\" (UID: \"262113c6-3029-4c0d-8279-e1454a535c24\") " pod="openstack-operators/10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx" Dec 05 12:05:08 crc kubenswrapper[4763]: I1205 12:05:08.552983 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/262113c6-3029-4c0d-8279-e1454a535c24-bundle\") pod \"10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx\" (UID: \"262113c6-3029-4c0d-8279-e1454a535c24\") " pod="openstack-operators/10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx" Dec 05 12:05:08 crc kubenswrapper[4763]: I1205 12:05:08.553005 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/262113c6-3029-4c0d-8279-e1454a535c24-util\") pod \"10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx\" (UID: \"262113c6-3029-4c0d-8279-e1454a535c24\") " pod="openstack-operators/10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx" Dec 05 12:05:08 crc kubenswrapper[4763]: I1205 12:05:08.553596 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/262113c6-3029-4c0d-8279-e1454a535c24-bundle\") pod \"10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx\" (UID: \"262113c6-3029-4c0d-8279-e1454a535c24\") " pod="openstack-operators/10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx" Dec 05 12:05:08 crc kubenswrapper[4763]: I1205 12:05:08.553633 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/262113c6-3029-4c0d-8279-e1454a535c24-util\") pod \"10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx\" (UID: \"262113c6-3029-4c0d-8279-e1454a535c24\") " pod="openstack-operators/10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx" Dec 05 12:05:08 crc kubenswrapper[4763]: I1205 12:05:08.579873 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc2fh\" (UniqueName: \"kubernetes.io/projected/262113c6-3029-4c0d-8279-e1454a535c24-kube-api-access-xc2fh\") pod \"10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx\" (UID: \"262113c6-3029-4c0d-8279-e1454a535c24\") " pod="openstack-operators/10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx" Dec 05 12:05:08 crc kubenswrapper[4763]: I1205 12:05:08.750619 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx" Dec 05 12:05:09 crc kubenswrapper[4763]: I1205 12:05:09.267814 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx"] Dec 05 12:05:09 crc kubenswrapper[4763]: I1205 12:05:09.375914 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerStarted","Data":"6e6e3cfeab8af452b7eac351a2125ef9c911ea4fcd52b1f8631b40c9322e72b2"} Dec 05 12:05:09 crc kubenswrapper[4763]: I1205 12:05:09.377941 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx" event={"ID":"262113c6-3029-4c0d-8279-e1454a535c24","Type":"ContainerStarted","Data":"8f572b720f37921a753fdcef4d2e5c18349f531b0ee8640db524dbf28f1d3bde"} Dec 05 12:05:10 crc kubenswrapper[4763]: I1205 12:05:10.385991 4763 generic.go:334] "Generic (PLEG): container finished" podID="262113c6-3029-4c0d-8279-e1454a535c24" containerID="af495730a323edb94b08c512e40192ac376eab37444b2a70a21a17b872218bf8" exitCode=0 Dec 05 12:05:10 crc kubenswrapper[4763]: I1205 12:05:10.386069 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx" event={"ID":"262113c6-3029-4c0d-8279-e1454a535c24","Type":"ContainerDied","Data":"af495730a323edb94b08c512e40192ac376eab37444b2a70a21a17b872218bf8"} Dec 05 12:05:12 crc kubenswrapper[4763]: I1205 12:05:12.401994 4763 generic.go:334] "Generic (PLEG): container finished" podID="262113c6-3029-4c0d-8279-e1454a535c24" containerID="fca79fe5a3ab02a31b5ba2221d5c6c8d4b1db9b7ee70182709dc3fd4f94d2e6e" exitCode=0 Dec 05 12:05:12 crc kubenswrapper[4763]: I1205 12:05:12.402167 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx" 
event={"ID":"262113c6-3029-4c0d-8279-e1454a535c24","Type":"ContainerDied","Data":"fca79fe5a3ab02a31b5ba2221d5c6c8d4b1db9b7ee70182709dc3fd4f94d2e6e"} Dec 05 12:05:13 crc kubenswrapper[4763]: I1205 12:05:13.412539 4763 generic.go:334] "Generic (PLEG): container finished" podID="262113c6-3029-4c0d-8279-e1454a535c24" containerID="7b354f18d2e82eb863d6a02fa17f33af009e2d54500160217cdd7e2b91bcf466" exitCode=0 Dec 05 12:05:13 crc kubenswrapper[4763]: I1205 12:05:13.412585 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx" event={"ID":"262113c6-3029-4c0d-8279-e1454a535c24","Type":"ContainerDied","Data":"7b354f18d2e82eb863d6a02fa17f33af009e2d54500160217cdd7e2b91bcf466"} Dec 05 12:05:14 crc kubenswrapper[4763]: I1205 12:05:14.717204 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx" Dec 05 12:05:14 crc kubenswrapper[4763]: I1205 12:05:14.839120 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/262113c6-3029-4c0d-8279-e1454a535c24-bundle\") pod \"262113c6-3029-4c0d-8279-e1454a535c24\" (UID: \"262113c6-3029-4c0d-8279-e1454a535c24\") " Dec 05 12:05:14 crc kubenswrapper[4763]: I1205 12:05:14.839191 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xc2fh\" (UniqueName: \"kubernetes.io/projected/262113c6-3029-4c0d-8279-e1454a535c24-kube-api-access-xc2fh\") pod \"262113c6-3029-4c0d-8279-e1454a535c24\" (UID: \"262113c6-3029-4c0d-8279-e1454a535c24\") " Dec 05 12:05:14 crc kubenswrapper[4763]: I1205 12:05:14.839235 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/262113c6-3029-4c0d-8279-e1454a535c24-util\") pod \"262113c6-3029-4c0d-8279-e1454a535c24\" (UID: \"262113c6-3029-4c0d-8279-e1454a535c24\") " Dec 05 12:05:14 crc kubenswrapper[4763]: I1205 12:05:14.840363 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/262113c6-3029-4c0d-8279-e1454a535c24-bundle" (OuterVolumeSpecName: "bundle") pod "262113c6-3029-4c0d-8279-e1454a535c24" (UID: "262113c6-3029-4c0d-8279-e1454a535c24"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:05:14 crc kubenswrapper[4763]: I1205 12:05:14.844727 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/262113c6-3029-4c0d-8279-e1454a535c24-kube-api-access-xc2fh" (OuterVolumeSpecName: "kube-api-access-xc2fh") pod "262113c6-3029-4c0d-8279-e1454a535c24" (UID: "262113c6-3029-4c0d-8279-e1454a535c24"). InnerVolumeSpecName "kube-api-access-xc2fh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:05:14 crc kubenswrapper[4763]: I1205 12:05:14.929189 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/262113c6-3029-4c0d-8279-e1454a535c24-util" (OuterVolumeSpecName: "util") pod "262113c6-3029-4c0d-8279-e1454a535c24" (UID: "262113c6-3029-4c0d-8279-e1454a535c24"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:05:14 crc kubenswrapper[4763]: I1205 12:05:14.940687 4763 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/262113c6-3029-4c0d-8279-e1454a535c24-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:05:14 crc kubenswrapper[4763]: I1205 12:05:14.940733 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xc2fh\" (UniqueName: \"kubernetes.io/projected/262113c6-3029-4c0d-8279-e1454a535c24-kube-api-access-xc2fh\") on node \"crc\" DevicePath \"\"" Dec 05 12:05:14 crc kubenswrapper[4763]: I1205 12:05:14.940743 4763 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/262113c6-3029-4c0d-8279-e1454a535c24-util\") on node \"crc\" DevicePath \"\"" Dec 05 12:05:15 crc kubenswrapper[4763]: I1205 12:05:15.440983 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx" event={"ID":"262113c6-3029-4c0d-8279-e1454a535c24","Type":"ContainerDied","Data":"8f572b720f37921a753fdcef4d2e5c18349f531b0ee8640db524dbf28f1d3bde"} Dec 05 12:05:15 crc kubenswrapper[4763]: I1205 12:05:15.441022 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx" Dec 05 12:05:15 crc kubenswrapper[4763]: I1205 12:05:15.441034 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f572b720f37921a753fdcef4d2e5c18349f531b0ee8640db524dbf28f1d3bde" Dec 05 12:05:16 crc kubenswrapper[4763]: I1205 12:05:16.787317 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-654b7bd4cc-79gh5"] Dec 05 12:05:16 crc kubenswrapper[4763]: E1205 12:05:16.787567 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="262113c6-3029-4c0d-8279-e1454a535c24" containerName="util" Dec 05 12:05:16 crc kubenswrapper[4763]: I1205 12:05:16.787580 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="262113c6-3029-4c0d-8279-e1454a535c24" containerName="util" Dec 05 12:05:16 crc kubenswrapper[4763]: E1205 12:05:16.787596 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="262113c6-3029-4c0d-8279-e1454a535c24" containerName="extract" Dec 05 12:05:16 crc kubenswrapper[4763]: I1205 12:05:16.787602 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="262113c6-3029-4c0d-8279-e1454a535c24" containerName="extract" Dec 05 12:05:16 crc kubenswrapper[4763]: E1205 12:05:16.787611 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="262113c6-3029-4c0d-8279-e1454a535c24" containerName="pull" Dec 05 12:05:16 crc kubenswrapper[4763]: I1205 12:05:16.787617 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="262113c6-3029-4c0d-8279-e1454a535c24" containerName="pull" Dec 05 12:05:16 crc kubenswrapper[4763]: I1205 12:05:16.787720 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="262113c6-3029-4c0d-8279-e1454a535c24" containerName="extract" Dec 05 12:05:16 crc kubenswrapper[4763]: I1205 12:05:16.788193 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-654b7bd4cc-79gh5" Dec 05 12:05:16 crc kubenswrapper[4763]: I1205 12:05:16.791852 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-tvgth" Dec 05 12:05:16 crc kubenswrapper[4763]: I1205 12:05:16.870065 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzsmc\" (UniqueName: \"kubernetes.io/projected/9b87ef42-73e9-40c4-a64b-381de978398c-kube-api-access-tzsmc\") pod \"openstack-operator-controller-operator-654b7bd4cc-79gh5\" (UID: \"9b87ef42-73e9-40c4-a64b-381de978398c\") " pod="openstack-operators/openstack-operator-controller-operator-654b7bd4cc-79gh5" Dec 05 12:05:16 crc kubenswrapper[4763]: I1205 12:05:16.871783 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-654b7bd4cc-79gh5"] Dec 05 12:05:16 crc kubenswrapper[4763]: I1205 12:05:16.971127 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzsmc\" (UniqueName: \"kubernetes.io/projected/9b87ef42-73e9-40c4-a64b-381de978398c-kube-api-access-tzsmc\") pod \"openstack-operator-controller-operator-654b7bd4cc-79gh5\" (UID: \"9b87ef42-73e9-40c4-a64b-381de978398c\") " pod="openstack-operators/openstack-operator-controller-operator-654b7bd4cc-79gh5" Dec 05 12:05:16 crc kubenswrapper[4763]: I1205 12:05:16.987997 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzsmc\" (UniqueName: \"kubernetes.io/projected/9b87ef42-73e9-40c4-a64b-381de978398c-kube-api-access-tzsmc\") pod \"openstack-operator-controller-operator-654b7bd4cc-79gh5\" (UID: \"9b87ef42-73e9-40c4-a64b-381de978398c\") " pod="openstack-operators/openstack-operator-controller-operator-654b7bd4cc-79gh5" Dec 05 12:05:17 crc kubenswrapper[4763]: I1205 12:05:17.104626 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-654b7bd4cc-79gh5" Dec 05 12:05:17 crc kubenswrapper[4763]: I1205 12:05:17.414062 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-654b7bd4cc-79gh5"] Dec 05 12:05:17 crc kubenswrapper[4763]: W1205 12:05:17.419464 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b87ef42_73e9_40c4_a64b_381de978398c.slice/crio-c5d784afe978bf9f18c8c59f815ce7d8d5905f2fb017fac62dcc983cc9108cfd WatchSource:0}: Error finding container c5d784afe978bf9f18c8c59f815ce7d8d5905f2fb017fac62dcc983cc9108cfd: Status 404 returned error can't find the container with id c5d784afe978bf9f18c8c59f815ce7d8d5905f2fb017fac62dcc983cc9108cfd Dec 05 12:05:17 crc kubenswrapper[4763]: I1205 12:05:17.452318 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-654b7bd4cc-79gh5" event={"ID":"9b87ef42-73e9-40c4-a64b-381de978398c","Type":"ContainerStarted","Data":"c5d784afe978bf9f18c8c59f815ce7d8d5905f2fb017fac62dcc983cc9108cfd"} Dec 05 12:05:24 crc kubenswrapper[4763]: I1205 12:05:24.499037 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-654b7bd4cc-79gh5" event={"ID":"9b87ef42-73e9-40c4-a64b-381de978398c","Type":"ContainerStarted","Data":"b43a0bc566e29aba0da924938f6f392a4d58041f4535bcfdf0c5708b93d785e9"} Dec 05 12:05:24 crc kubenswrapper[4763]: I1205 12:05:24.499803 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-654b7bd4cc-79gh5" Dec 05 12:05:24 crc kubenswrapper[4763]: I1205 12:05:24.527404 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-654b7bd4cc-79gh5" podStartSLOduration=2.57361314 podStartE2EDuration="8.527389117s" podCreationTimestamp="2025-12-05 12:05:16 +0000 UTC" firstStartedPulling="2025-12-05 12:05:17.422044718 +0000 UTC m=+1001.914759451" lastFinishedPulling="2025-12-05 12:05:23.375820705 +0000 UTC m=+1007.868535428" observedRunningTime="2025-12-05 12:05:24.526232378 +0000 UTC m=+1009.018947111" watchObservedRunningTime="2025-12-05 12:05:24.527389117 +0000 UTC m=+1009.020103840" Dec 05 12:05:37 crc kubenswrapper[4763]: I1205 12:05:37.108974 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-654b7bd4cc-79gh5" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.138673 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-f92rg"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.142599 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-f92rg" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.147467 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-f92rg"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.148202 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-ccjhp" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.156753 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-w8w7f"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.158166 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w8w7f" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.159701 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-w8w7f"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.164425 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-qxwqk" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.194649 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-77g97"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.195747 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-77g97" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.200633 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-fxk26" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.202532 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-8gw8h"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.203833 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-8gw8h" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.205514 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-xjxqw" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.243834 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mq9f4"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.245135 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mq9f4" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.249241 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-cwnk9" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.251366 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-77g97"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.264585 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gbf44"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.265921 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gbf44" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.269587 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-n48tz" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.274050 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-8gw8h"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.282677 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mq9f4"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.304300 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh8fn\" (UniqueName: \"kubernetes.io/projected/f9a5212c-2ddb-4e82-818e-5102fb3c5ee2-kube-api-access-hh8fn\") pod \"designate-operator-controller-manager-78b4bc895b-8gw8h\" (UID: \"f9a5212c-2ddb-4e82-818e-5102fb3c5ee2\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-8gw8h" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.305016 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vv9f\" (UniqueName: \"kubernetes.io/projected/94ae68ae-93ae-43a6-89fa-5b2301808793-kube-api-access-6vv9f\") pod \"glance-operator-controller-manager-77987cd8cd-77g97\" (UID: \"94ae68ae-93ae-43a6-89fa-5b2301808793\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-77g97" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.311187 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnmlk\" (UniqueName: \"kubernetes.io/projected/e97d9ee8-0c07-486a-84f1-dabddb037a8b-kube-api-access-dnmlk\") pod \"cinder-operator-controller-manager-859b6ccc6-w8w7f\" (UID: \"e97d9ee8-0c07-486a-84f1-dabddb037a8b\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w8w7f" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.311404 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gtjx\" (UniqueName: \"kubernetes.io/projected/d7dd9586-7cc5-42f0-87a8-3a8c54557b21-kube-api-access-9gtjx\") pod \"barbican-operator-controller-manager-7d9dfd778-f92rg\" (UID: \"d7dd9586-7cc5-42f0-87a8-3a8c54557b21\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-f92rg" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.306681 4763 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/infra-operator-controller-manager-57548d458d-p5jgv"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.313488 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-p5jgv" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.323157 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-gpj4x" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.323350 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.327725 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gbf44"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.332141 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-p5jgv"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.361717 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-gx4vn"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.362871 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gx4vn" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.369078 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-wdzkl" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.380830 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-gx4vn"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.398468 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-2bgt8"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.399862 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-2bgt8" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.406288 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-sq9wf"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.407385 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-sq9wf" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.407592 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-wrflq" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.415011 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9s76\" (UniqueName: \"kubernetes.io/projected/f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6-kube-api-access-x9s76\") pod \"infra-operator-controller-manager-57548d458d-p5jgv\" (UID: \"f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-p5jgv" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.415063 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6-cert\") pod \"infra-operator-controller-manager-57548d458d-p5jgv\" (UID: \"f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-p5jgv" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.415098 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh8fn\" (UniqueName: \"kubernetes.io/projected/f9a5212c-2ddb-4e82-818e-5102fb3c5ee2-kube-api-access-hh8fn\") pod \"designate-operator-controller-manager-78b4bc895b-8gw8h\" (UID: \"f9a5212c-2ddb-4e82-818e-5102fb3c5ee2\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-8gw8h" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.415146 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjgck\" (UniqueName: \"kubernetes.io/projected/42f31714-9ede-4c48-b611-028a79374fad-kube-api-access-cjgck\") pod \"heat-operator-controller-manager-5f64f6f8bb-mq9f4\" (UID: \"42f31714-9ede-4c48-b611-028a79374fad\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mq9f4" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.415169 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vv9f\" (UniqueName: \"kubernetes.io/projected/94ae68ae-93ae-43a6-89fa-5b2301808793-kube-api-access-6vv9f\") pod \"glance-operator-controller-manager-77987cd8cd-77g97\" (UID: \"94ae68ae-93ae-43a6-89fa-5b2301808793\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-77g97" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.415196 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnmlk\" (UniqueName: \"kubernetes.io/projected/e97d9ee8-0c07-486a-84f1-dabddb037a8b-kube-api-access-dnmlk\") pod \"cinder-operator-controller-manager-859b6ccc6-w8w7f\" (UID: \"e97d9ee8-0c07-486a-84f1-dabddb037a8b\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w8w7f" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.415229 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc88h\" (UniqueName: \"kubernetes.io/projected/2bed16d5-ec79-4ad7-8984-b965fa568dc6-kube-api-access-qc88h\") pod \"horizon-operator-controller-manager-68c6d99b8f-gbf44\" (UID: \"2bed16d5-ec79-4ad7-8984-b965fa568dc6\") " 
pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gbf44" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.415254 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gtjx\" (UniqueName: \"kubernetes.io/projected/d7dd9586-7cc5-42f0-87a8-3a8c54557b21-kube-api-access-9gtjx\") pod \"barbican-operator-controller-manager-7d9dfd778-f92rg\" (UID: \"d7dd9586-7cc5-42f0-87a8-3a8c54557b21\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-f92rg" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.415299 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-9qtsj" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.435826 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-2bgt8"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.443825 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vv9f\" (UniqueName: \"kubernetes.io/projected/94ae68ae-93ae-43a6-89fa-5b2301808793-kube-api-access-6vv9f\") pod \"glance-operator-controller-manager-77987cd8cd-77g97\" (UID: \"94ae68ae-93ae-43a6-89fa-5b2301808793\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-77g97" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.448511 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh8fn\" (UniqueName: \"kubernetes.io/projected/f9a5212c-2ddb-4e82-818e-5102fb3c5ee2-kube-api-access-hh8fn\") pod \"designate-operator-controller-manager-78b4bc895b-8gw8h\" (UID: \"f9a5212c-2ddb-4e82-818e-5102fb3c5ee2\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-8gw8h" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.448578 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-sq9wf"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.453448 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnmlk\" (UniqueName: \"kubernetes.io/projected/e97d9ee8-0c07-486a-84f1-dabddb037a8b-kube-api-access-dnmlk\") pod \"cinder-operator-controller-manager-859b6ccc6-w8w7f\" (UID: \"e97d9ee8-0c07-486a-84f1-dabddb037a8b\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w8w7f" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.480399 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-b9w6n"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.481591 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gtjx\" (UniqueName: \"kubernetes.io/projected/d7dd9586-7cc5-42f0-87a8-3a8c54557b21-kube-api-access-9gtjx\") pod \"barbican-operator-controller-manager-7d9dfd778-f92rg\" (UID: \"d7dd9586-7cc5-42f0-87a8-3a8c54557b21\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-f92rg" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.481984 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-b9w6n" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.484419 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-gx56c" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.497857 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w8w7f" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.503897 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-b9w6n"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.513902 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vqmk4"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.515023 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vqmk4" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.518849 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-hhfrb" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.519843 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-77g97" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.519896 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc88h\" (UniqueName: \"kubernetes.io/projected/2bed16d5-ec79-4ad7-8984-b965fa568dc6-kube-api-access-qc88h\") pod \"horizon-operator-controller-manager-68c6d99b8f-gbf44\" (UID: \"2bed16d5-ec79-4ad7-8984-b965fa568dc6\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gbf44" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.519945 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsg8d\" (UniqueName: \"kubernetes.io/projected/6520d187-9c6c-4b0e-b0c9-27e23db84f4c-kube-api-access-fsg8d\") pod \"keystone-operator-controller-manager-7765d96ddf-gx4vn\" (UID: \"6520d187-9c6c-4b0e-b0c9-27e23db84f4c\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gx4vn" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.519989 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcjlw\" (UniqueName: \"kubernetes.io/projected/ed1b8d49-d742-4493-bb7e-856b4108fb88-kube-api-access-hcjlw\") pod \"ironic-operator-controller-manager-6c548fd776-2bgt8\" (UID: \"ed1b8d49-d742-4493-bb7e-856b4108fb88\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-2bgt8" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.520013 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqfmw\" (UniqueName: \"kubernetes.io/projected/4faed118-8b9d-4adb-8f86-6a6be8061bce-kube-api-access-vqfmw\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-vqmk4\" (UID: \"4faed118-8b9d-4adb-8f86-6a6be8061bce\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vqmk4" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.520041 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-x9s76\" (UniqueName: \"kubernetes.io/projected/f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6-kube-api-access-x9s76\") pod \"infra-operator-controller-manager-57548d458d-p5jgv\" (UID: \"f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-p5jgv" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.520060 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6-cert\") pod \"infra-operator-controller-manager-57548d458d-p5jgv\" (UID: \"f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-p5jgv" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.520081 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gtn5\" (UniqueName: \"kubernetes.io/projected/eb3c8b38-a863-42d0-b7d8-03231971e4ce-kube-api-access-2gtn5\") pod \"manila-operator-controller-manager-7c79b5df47-sq9wf\" (UID: \"eb3c8b38-a863-42d0-b7d8-03231971e4ce\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-sq9wf" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.520133 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjgck\" (UniqueName: \"kubernetes.io/projected/42f31714-9ede-4c48-b611-028a79374fad-kube-api-access-cjgck\") pod \"heat-operator-controller-manager-5f64f6f8bb-mq9f4\" (UID: \"42f31714-9ede-4c48-b611-028a79374fad\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mq9f4" Dec 05 12:05:58 crc kubenswrapper[4763]: E1205 12:05:58.520401 4763 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 12:05:58 crc kubenswrapper[4763]: E1205 12:05:58.520457 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6-cert podName:f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6 nodeName:}" failed. No retries permitted until 2025-12-05 12:05:59.020441302 +0000 UTC m=+1043.513156025 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6-cert") pod "infra-operator-controller-manager-57548d458d-p5jgv" (UID: "f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6") : secret "infra-operator-webhook-server-cert" not found Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.520586 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2nz4\" (UniqueName: \"kubernetes.io/projected/cd42325e-d26d-4cb6-b8dd-f75dc86e7568-kube-api-access-g2nz4\") pod \"mariadb-operator-controller-manager-56bbcc9d85-b9w6n\" (UID: \"cd42325e-d26d-4cb6-b8dd-f75dc86e7568\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-b9w6n" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.523837 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-8gw8h" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.543778 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc88h\" (UniqueName: \"kubernetes.io/projected/2bed16d5-ec79-4ad7-8984-b965fa568dc6-kube-api-access-qc88h\") pod \"horizon-operator-controller-manager-68c6d99b8f-gbf44\" (UID: \"2bed16d5-ec79-4ad7-8984-b965fa568dc6\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gbf44" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.546374 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjgck\" (UniqueName: \"kubernetes.io/projected/42f31714-9ede-4c48-b611-028a79374fad-kube-api-access-cjgck\") pod \"heat-operator-controller-manager-5f64f6f8bb-mq9f4\" (UID: \"42f31714-9ede-4c48-b611-028a79374fad\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mq9f4" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.546651 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vqmk4"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.548705 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9s76\" (UniqueName: \"kubernetes.io/projected/f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6-kube-api-access-x9s76\") pod \"infra-operator-controller-manager-57548d458d-p5jgv\" (UID: \"f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-p5jgv" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.560887 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-mnthh"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.562787 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-mnthh" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.566162 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-5c2tn" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.571637 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mq9f4" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.577751 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-6zk6f"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.579454 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-6zk6f" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.604840 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gbf44" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.618652 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-mnthh"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.624326 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gtn5\" (UniqueName: \"kubernetes.io/projected/eb3c8b38-a863-42d0-b7d8-03231971e4ce-kube-api-access-2gtn5\") pod \"manila-operator-controller-manager-7c79b5df47-sq9wf\" (UID: \"eb3c8b38-a863-42d0-b7d8-03231971e4ce\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-sq9wf" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.624498 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2nz4\" (UniqueName: \"kubernetes.io/projected/cd42325e-d26d-4cb6-b8dd-f75dc86e7568-kube-api-access-g2nz4\") pod \"mariadb-operator-controller-manager-56bbcc9d85-b9w6n\" (UID: \"cd42325e-d26d-4cb6-b8dd-f75dc86e7568\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-b9w6n" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.624669 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsg8d\" (UniqueName: \"kubernetes.io/projected/6520d187-9c6c-4b0e-b0c9-27e23db84f4c-kube-api-access-fsg8d\") pod \"keystone-operator-controller-manager-7765d96ddf-gx4vn\" (UID: \"6520d187-9c6c-4b0e-b0c9-27e23db84f4c\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gx4vn" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.624856 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcjlw\" (UniqueName: \"kubernetes.io/projected/ed1b8d49-d742-4493-bb7e-856b4108fb88-kube-api-access-hcjlw\") pod \"ironic-operator-controller-manager-6c548fd776-2bgt8\" (UID: \"ed1b8d49-d742-4493-bb7e-856b4108fb88\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-2bgt8" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.624919 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqfmw\" (UniqueName: \"kubernetes.io/projected/4faed118-8b9d-4adb-8f86-6a6be8061bce-kube-api-access-vqfmw\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-vqmk4\" (UID: \"4faed118-8b9d-4adb-8f86-6a6be8061bce\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vqmk4" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.630084 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-b8wf9" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.644370 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-6zk6f"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.656711 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcjlw\" (UniqueName: \"kubernetes.io/projected/ed1b8d49-d742-4493-bb7e-856b4108fb88-kube-api-access-hcjlw\") pod \"ironic-operator-controller-manager-6c548fd776-2bgt8\" (UID: \"ed1b8d49-d742-4493-bb7e-856b4108fb88\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-2bgt8" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.656727 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-fsg8d\" (UniqueName: \"kubernetes.io/projected/6520d187-9c6c-4b0e-b0c9-27e23db84f4c-kube-api-access-fsg8d\") pod \"keystone-operator-controller-manager-7765d96ddf-gx4vn\" (UID: \"6520d187-9c6c-4b0e-b0c9-27e23db84f4c\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gx4vn" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.657458 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqfmw\" (UniqueName: \"kubernetes.io/projected/4faed118-8b9d-4adb-8f86-6a6be8061bce-kube-api-access-vqfmw\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-vqmk4\" (UID: \"4faed118-8b9d-4adb-8f86-6a6be8061bce\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vqmk4" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.659123 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gtn5\" (UniqueName: \"kubernetes.io/projected/eb3c8b38-a863-42d0-b7d8-03231971e4ce-kube-api-access-2gtn5\") pod \"manila-operator-controller-manager-7c79b5df47-sq9wf\" (UID: \"eb3c8b38-a863-42d0-b7d8-03231971e4ce\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-sq9wf" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.659905 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2nz4\" (UniqueName: \"kubernetes.io/projected/cd42325e-d26d-4cb6-b8dd-f75dc86e7568-kube-api-access-g2nz4\") pod \"mariadb-operator-controller-manager-56bbcc9d85-b9w6n\" (UID: \"cd42325e-d26d-4cb6-b8dd-f75dc86e7568\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-b9w6n" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.660090 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-mch9f"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.661917 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mch9f" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.664865 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.666259 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-mch9f"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.666494 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.666721 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-x59bh" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.669835 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-v94rm" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.670374 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.678018 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-gzx9n"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.690863 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-gzx9n" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.694672 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-4xnz8" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.700330 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.708345 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gx4vn" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.719807 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-gzx9n"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.725357 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-2bgt8" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.735236 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bh97\" (UniqueName: \"kubernetes.io/projected/fcc46489-05d5-4219-9e45-6ca25f25900f-kube-api-access-9bh97\") pod \"octavia-operator-controller-manager-998648c74-6zk6f\" (UID: \"fcc46489-05d5-4219-9e45-6ca25f25900f\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-6zk6f" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.735664 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scncw\" (UniqueName: \"kubernetes.io/projected/63e1e64f-8414-4da8-8a32-5f0a0041c5ff-kube-api-access-scncw\") pod \"ovn-operator-controller-manager-b6456fdb6-mch9f\" (UID: \"63e1e64f-8414-4da8-8a32-5f0a0041c5ff\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mch9f" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.735744 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78jv5\" (UniqueName: \"kubernetes.io/projected/378cb9d9-8010-4dcf-9297-5e4f0679086e-kube-api-access-78jv5\") pod \"nova-operator-controller-manager-697bc559fc-mnthh\" (UID: \"378cb9d9-8010-4dcf-9297-5e4f0679086e\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-mnthh" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.735856 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fdf0ecb-215d-4a02-8053-169fcbfefa50-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs\" (UID: \"3fdf0ecb-215d-4a02-8053-169fcbfefa50\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.735898 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whb44\" (UniqueName: \"kubernetes.io/projected/a05e3d8d-f58a-44f0-b3c9-e212cdcec438-kube-api-access-whb44\") pod \"placement-operator-controller-manager-78f8948974-gzx9n\" (UID: \"a05e3d8d-f58a-44f0-b3c9-e212cdcec438\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-gzx9n" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.735956 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l98c\" (UniqueName: \"kubernetes.io/projected/3fdf0ecb-215d-4a02-8053-169fcbfefa50-kube-api-access-9l98c\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs\" (UID: \"3fdf0ecb-215d-4a02-8053-169fcbfefa50\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.742407 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-phnl7"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.743785 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-phnl7" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.744130 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-sq9wf" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.746666 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-c9sm4" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.779150 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-phnl7"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.780362 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-f92rg" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.790222 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-xkl6w"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.792583 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-xkl6w" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.808528 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-xgcbn" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.812103 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-xkl6w"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.834285 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-b9w6n" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.843621 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fdf0ecb-215d-4a02-8053-169fcbfefa50-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs\" (UID: \"3fdf0ecb-215d-4a02-8053-169fcbfefa50\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.843678 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whb44\" (UniqueName: \"kubernetes.io/projected/a05e3d8d-f58a-44f0-b3c9-e212cdcec438-kube-api-access-whb44\") pod \"placement-operator-controller-manager-78f8948974-gzx9n\" (UID: \"a05e3d8d-f58a-44f0-b3c9-e212cdcec438\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-gzx9n" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.843723 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l98c\" (UniqueName: \"kubernetes.io/projected/3fdf0ecb-215d-4a02-8053-169fcbfefa50-kube-api-access-9l98c\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs\" (UID: \"3fdf0ecb-215d-4a02-8053-169fcbfefa50\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.843788 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j97ls\" (UniqueName: \"kubernetes.io/projected/b64b19c9-3601-4790-addf-c9a32f6c29fe-kube-api-access-j97ls\") pod \"telemetry-operator-controller-manager-76cc84c6bb-xkl6w\" (UID: 
\"b64b19c9-3601-4790-addf-c9a32f6c29fe\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-xkl6w" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.843822 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdqgf\" (UniqueName: \"kubernetes.io/projected/2024cb36-8175-4993-bd5b-a57a8fb8416c-kube-api-access-cdqgf\") pod \"swift-operator-controller-manager-5f8c65bbfc-phnl7\" (UID: \"2024cb36-8175-4993-bd5b-a57a8fb8416c\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-phnl7" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.843869 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bh97\" (UniqueName: \"kubernetes.io/projected/fcc46489-05d5-4219-9e45-6ca25f25900f-kube-api-access-9bh97\") pod \"octavia-operator-controller-manager-998648c74-6zk6f\" (UID: \"fcc46489-05d5-4219-9e45-6ca25f25900f\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-6zk6f" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.843900 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scncw\" (UniqueName: \"kubernetes.io/projected/63e1e64f-8414-4da8-8a32-5f0a0041c5ff-kube-api-access-scncw\") pod \"ovn-operator-controller-manager-b6456fdb6-mch9f\" (UID: \"63e1e64f-8414-4da8-8a32-5f0a0041c5ff\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mch9f" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.843938 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78jv5\" (UniqueName: \"kubernetes.io/projected/378cb9d9-8010-4dcf-9297-5e4f0679086e-kube-api-access-78jv5\") pod \"nova-operator-controller-manager-697bc559fc-mnthh\" (UID: \"378cb9d9-8010-4dcf-9297-5e4f0679086e\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-mnthh" Dec 05 12:05:58 crc kubenswrapper[4763]: E1205 12:05:58.844171 4763 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 12:05:58 crc kubenswrapper[4763]: E1205 12:05:58.844220 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fdf0ecb-215d-4a02-8053-169fcbfefa50-cert podName:3fdf0ecb-215d-4a02-8053-169fcbfefa50 nodeName:}" failed. No retries permitted until 2025-12-05 12:05:59.344206429 +0000 UTC m=+1043.836921152 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3fdf0ecb-215d-4a02-8053-169fcbfefa50-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs" (UID: "3fdf0ecb-215d-4a02-8053-169fcbfefa50") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.850021 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-ht96c"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.851223 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-ht96c" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.857786 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-ht96c"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.861210 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-zjc5k" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.879441 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-66974974bb-mjwrw"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.879502 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l98c\" (UniqueName: \"kubernetes.io/projected/3fdf0ecb-215d-4a02-8053-169fcbfefa50-kube-api-access-9l98c\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs\" (UID: \"3fdf0ecb-215d-4a02-8053-169fcbfefa50\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.881132 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-66974974bb-mjwrw" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.881191 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78jv5\" (UniqueName: \"kubernetes.io/projected/378cb9d9-8010-4dcf-9297-5e4f0679086e-kube-api-access-78jv5\") pod \"nova-operator-controller-manager-697bc559fc-mnthh\" (UID: \"378cb9d9-8010-4dcf-9297-5e4f0679086e\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-mnthh" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.881657 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whb44\" (UniqueName: \"kubernetes.io/projected/a05e3d8d-f58a-44f0-b3c9-e212cdcec438-kube-api-access-whb44\") pod \"placement-operator-controller-manager-78f8948974-gzx9n\" (UID: \"a05e3d8d-f58a-44f0-b3c9-e212cdcec438\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-gzx9n" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.884793 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scncw\" (UniqueName: \"kubernetes.io/projected/63e1e64f-8414-4da8-8a32-5f0a0041c5ff-kube-api-access-scncw\") pod \"ovn-operator-controller-manager-b6456fdb6-mch9f\" (UID: \"63e1e64f-8414-4da8-8a32-5f0a0041c5ff\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mch9f" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.884811 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-crqw8" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.887963 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bh97\" (UniqueName: \"kubernetes.io/projected/fcc46489-05d5-4219-9e45-6ca25f25900f-kube-api-access-9bh97\") pod \"octavia-operator-controller-manager-998648c74-6zk6f\" (UID: \"fcc46489-05d5-4219-9e45-6ca25f25900f\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-6zk6f" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.893880 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/watcher-operator-controller-manager-66974974bb-mjwrw"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.921896 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vqmk4" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.931546 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-mnthh" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.939858 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7d6cc4d8dc-wf9nm"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.940728 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.944822 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.944906 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j97ls\" (UniqueName: \"kubernetes.io/projected/b64b19c9-3601-4790-addf-c9a32f6c29fe-kube-api-access-j97ls\") pod \"telemetry-operator-controller-manager-76cc84c6bb-xkl6w\" (UID: \"b64b19c9-3601-4790-addf-c9a32f6c29fe\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-xkl6w" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.944951 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwn78\" (UniqueName: \"kubernetes.io/projected/0a36f8ad-7e41-4005-a42e-47b9a30af62f-kube-api-access-jwn78\") pod \"test-operator-controller-manager-5854674fcc-ht96c\" (UID: \"0a36f8ad-7e41-4005-a42e-47b9a30af62f\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-ht96c" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.944985 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdqgf\" (UniqueName: \"kubernetes.io/projected/2024cb36-8175-4993-bd5b-a57a8fb8416c-kube-api-access-cdqgf\") pod \"swift-operator-controller-manager-5f8c65bbfc-phnl7\" (UID: \"2024cb36-8175-4993-bd5b-a57a8fb8416c\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-phnl7" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.945013 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.945017 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8jr9\" (UniqueName: \"kubernetes.io/projected/36e19ef2-df0d-43ca-8477-f1cec2182b45-kube-api-access-m8jr9\") pod \"watcher-operator-controller-manager-66974974bb-mjwrw\" (UID: \"36e19ef2-df0d-43ca-8477-f1cec2182b45\") " pod="openstack-operators/watcher-operator-controller-manager-66974974bb-mjwrw" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.945166 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-4gbv2" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.945702 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-6zk6f" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.974981 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7d6cc4d8dc-wf9nm"] Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.998055 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mch9f" Dec 05 12:05:58 crc kubenswrapper[4763]: I1205 12:05:58.998666 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j97ls\" (UniqueName: \"kubernetes.io/projected/b64b19c9-3601-4790-addf-c9a32f6c29fe-kube-api-access-j97ls\") pod \"telemetry-operator-controller-manager-76cc84c6bb-xkl6w\" (UID: \"b64b19c9-3601-4790-addf-c9a32f6c29fe\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-xkl6w" Dec 05 12:05:59 crc kubenswrapper[4763]: I1205 12:05:59.000845 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdqgf\" (UniqueName: \"kubernetes.io/projected/2024cb36-8175-4993-bd5b-a57a8fb8416c-kube-api-access-cdqgf\") pod \"swift-operator-controller-manager-5f8c65bbfc-phnl7\" (UID: \"2024cb36-8175-4993-bd5b-a57a8fb8416c\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-phnl7" Dec 05 12:05:59 crc kubenswrapper[4763]: I1205 12:05:59.026484 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n8fzm"] Dec 05 12:05:59 crc kubenswrapper[4763]: I1205 12:05:59.041454 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n8fzm" Dec 05 12:05:59 crc kubenswrapper[4763]: I1205 12:05:59.048005 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-fx2m6" Dec 05 12:05:59 crc kubenswrapper[4763]: I1205 12:05:59.049528 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-metrics-certs\") pod \"openstack-operator-controller-manager-7d6cc4d8dc-wf9nm\" (UID: \"43f191ee-e0a3-4d9e-a63a-c9b7a626806f\") " pod="openstack-operators/openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" Dec 05 12:05:59 crc kubenswrapper[4763]: I1205 12:05:59.049555 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-webhook-certs\") pod \"openstack-operator-controller-manager-7d6cc4d8dc-wf9nm\" (UID: \"43f191ee-e0a3-4d9e-a63a-c9b7a626806f\") " pod="openstack-operators/openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" Dec 05 12:05:59 crc kubenswrapper[4763]: I1205 12:05:59.049586 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8jr9\" (UniqueName: \"kubernetes.io/projected/36e19ef2-df0d-43ca-8477-f1cec2182b45-kube-api-access-m8jr9\") pod \"watcher-operator-controller-manager-66974974bb-mjwrw\" (UID: \"36e19ef2-df0d-43ca-8477-f1cec2182b45\") " pod="openstack-operators/watcher-operator-controller-manager-66974974bb-mjwrw" Dec 05 12:05:59 crc kubenswrapper[4763]: I1205 12:05:59.049637 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r49dc\" (UniqueName: \"kubernetes.io/projected/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-kube-api-access-r49dc\") pod \"openstack-operator-controller-manager-7d6cc4d8dc-wf9nm\" (UID: \"43f191ee-e0a3-4d9e-a63a-c9b7a626806f\") " pod="openstack-operators/openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" Dec 05 12:05:59 crc kubenswrapper[4763]: I1205 12:05:59.049703 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6-cert\") pod \"infra-operator-controller-manager-57548d458d-p5jgv\" (UID: \"f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-p5jgv" Dec 05 12:05:59 crc kubenswrapper[4763]: I1205 12:05:59.049732 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwn78\" (UniqueName: \"kubernetes.io/projected/0a36f8ad-7e41-4005-a42e-47b9a30af62f-kube-api-access-jwn78\") pod \"test-operator-controller-manager-5854674fcc-ht96c\" (UID: \"0a36f8ad-7e41-4005-a42e-47b9a30af62f\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-ht96c" Dec 05 12:05:59 crc kubenswrapper[4763]: E1205 12:05:59.050198 4763 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 12:05:59 crc kubenswrapper[4763]: E1205 12:05:59.050239 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6-cert podName:f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6 nodeName:}" failed. No retries permitted until 2025-12-05 12:06:00.050225019 +0000 UTC m=+1044.542939742 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6-cert") pod "infra-operator-controller-manager-57548d458d-p5jgv" (UID: "f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6") : secret "infra-operator-webhook-server-cert" not found Dec 05 12:05:59 crc kubenswrapper[4763]: I1205 12:05:59.050649 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-gzx9n" Dec 05 12:05:59 crc kubenswrapper[4763]: I1205 12:05:59.067411 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8jr9\" (UniqueName: \"kubernetes.io/projected/36e19ef2-df0d-43ca-8477-f1cec2182b45-kube-api-access-m8jr9\") pod \"watcher-operator-controller-manager-66974974bb-mjwrw\" (UID: \"36e19ef2-df0d-43ca-8477-f1cec2182b45\") " pod="openstack-operators/watcher-operator-controller-manager-66974974bb-mjwrw" Dec 05 12:05:59 crc kubenswrapper[4763]: I1205 12:05:59.067682 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwn78\" (UniqueName: \"kubernetes.io/projected/0a36f8ad-7e41-4005-a42e-47b9a30af62f-kube-api-access-jwn78\") pod \"test-operator-controller-manager-5854674fcc-ht96c\" (UID: \"0a36f8ad-7e41-4005-a42e-47b9a30af62f\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-ht96c" Dec 05 12:05:59 crc kubenswrapper[4763]: I1205 12:05:59.071640 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n8fzm"] Dec 05 12:05:59 crc kubenswrapper[4763]: I1205 12:05:59.096934 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-phnl7" Dec 05 12:05:59 crc kubenswrapper[4763]: I1205 12:05:59.151130 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-metrics-certs\") pod \"openstack-operator-controller-manager-7d6cc4d8dc-wf9nm\" (UID: \"43f191ee-e0a3-4d9e-a63a-c9b7a626806f\") " pod="openstack-operators/openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" Dec 05 12:05:59 crc kubenswrapper[4763]: I1205 12:05:59.151174 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-webhook-certs\") pod \"openstack-operator-controller-manager-7d6cc4d8dc-wf9nm\" (UID: \"43f191ee-e0a3-4d9e-a63a-c9b7a626806f\") " pod="openstack-operators/openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" Dec 05 12:05:59 crc kubenswrapper[4763]: I1205 12:05:59.151209 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9vv8\" (UniqueName: \"kubernetes.io/projected/01d1c35a-adc3-4945-92b5-5921600cb826-kube-api-access-q9vv8\") pod \"rabbitmq-cluster-operator-manager-668c99d594-n8fzm\" (UID: \"01d1c35a-adc3-4945-92b5-5921600cb826\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n8fzm" Dec 05 12:05:59 crc kubenswrapper[4763]: I1205 12:05:59.151277 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r49dc\" (UniqueName: \"kubernetes.io/projected/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-kube-api-access-r49dc\") pod \"openstack-operator-controller-manager-7d6cc4d8dc-wf9nm\" (UID: \"43f191ee-e0a3-4d9e-a63a-c9b7a626806f\") " pod="openstack-operators/openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" Dec 05 12:05:59 crc kubenswrapper[4763]: E1205 12:05:59.151310 4763 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 12:05:59 crc kubenswrapper[4763]: E1205 12:05:59.151450 4763 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-metrics-certs podName:43f191ee-e0a3-4d9e-a63a-c9b7a626806f nodeName:}" failed. No retries permitted until 2025-12-05 12:05:59.651415628 +0000 UTC m=+1044.144130351 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-metrics-certs") pod "openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" (UID: "43f191ee-e0a3-4d9e-a63a-c9b7a626806f") : secret "metrics-server-cert" not found Dec 05 12:05:59 crc kubenswrapper[4763]: E1205 12:05:59.151716 4763 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 12:05:59 crc kubenswrapper[4763]: E1205 12:05:59.151742 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-webhook-certs podName:43f191ee-e0a3-4d9e-a63a-c9b7a626806f nodeName:}" failed. No retries permitted until 2025-12-05 12:05:59.65173499 +0000 UTC m=+1044.144449713 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-webhook-certs") pod "openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" (UID: "43f191ee-e0a3-4d9e-a63a-c9b7a626806f") : secret "webhook-server-cert" not found Dec 05 12:05:59 crc kubenswrapper[4763]: I1205 12:05:59.183578 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r49dc\" (UniqueName: \"kubernetes.io/projected/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-kube-api-access-r49dc\") pod \"openstack-operator-controller-manager-7d6cc4d8dc-wf9nm\" (UID: \"43f191ee-e0a3-4d9e-a63a-c9b7a626806f\") " pod="openstack-operators/openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" Dec 05 12:05:59 crc kubenswrapper[4763]: I1205 12:05:59.226034 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-xkl6w" Dec 05 12:05:59 crc kubenswrapper[4763]: I1205 12:05:59.252871 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9vv8\" (UniqueName: \"kubernetes.io/projected/01d1c35a-adc3-4945-92b5-5921600cb826-kube-api-access-q9vv8\") pod \"rabbitmq-cluster-operator-manager-668c99d594-n8fzm\" (UID: \"01d1c35a-adc3-4945-92b5-5921600cb826\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n8fzm" Dec 05 12:05:59 crc kubenswrapper[4763]: I1205 12:05:59.253543 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-ht96c" Dec 05 12:05:59 crc kubenswrapper[4763]: I1205 12:05:59.271521 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-66974974bb-mjwrw" Dec 05 12:05:59 crc kubenswrapper[4763]: I1205 12:05:59.274431 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9vv8\" (UniqueName: \"kubernetes.io/projected/01d1c35a-adc3-4945-92b5-5921600cb826-kube-api-access-q9vv8\") pod \"rabbitmq-cluster-operator-manager-668c99d594-n8fzm\" (UID: \"01d1c35a-adc3-4945-92b5-5921600cb826\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n8fzm" Dec 05 12:05:59 crc kubenswrapper[4763]: I1205 12:05:59.301677 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mq9f4"] Dec 05 12:05:59 crc kubenswrapper[4763]: I1205 12:05:59.306950 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-w8w7f"] Dec 05 12:05:59 crc kubenswrapper[4763]: I1205 12:05:59.353886 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fdf0ecb-215d-4a02-8053-169fcbfefa50-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs\" (UID: \"3fdf0ecb-215d-4a02-8053-169fcbfefa50\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs" Dec 05 12:05:59 crc kubenswrapper[4763]: E1205 12:05:59.354122 4763 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 12:05:59 crc kubenswrapper[4763]: E1205 12:05:59.354177 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fdf0ecb-215d-4a02-8053-169fcbfefa50-cert podName:3fdf0ecb-215d-4a02-8053-169fcbfefa50 nodeName:}" failed. No retries permitted until 2025-12-05 12:06:00.354160922 +0000 UTC m=+1044.846875645 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3fdf0ecb-215d-4a02-8053-169fcbfefa50-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs" (UID: "3fdf0ecb-215d-4a02-8053-169fcbfefa50") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 12:05:59 crc kubenswrapper[4763]: I1205 12:05:59.428134 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n8fzm" Dec 05 12:05:59 crc kubenswrapper[4763]: I1205 12:05:59.533232 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 12:05:59 crc kubenswrapper[4763]: I1205 12:05:59.660919 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-metrics-certs\") pod \"openstack-operator-controller-manager-7d6cc4d8dc-wf9nm\" (UID: \"43f191ee-e0a3-4d9e-a63a-c9b7a626806f\") " pod="openstack-operators/openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" Dec 05 12:05:59 crc kubenswrapper[4763]: I1205 12:05:59.660968 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-webhook-certs\") pod \"openstack-operator-controller-manager-7d6cc4d8dc-wf9nm\" (UID: \"43f191ee-e0a3-4d9e-a63a-c9b7a626806f\") " pod="openstack-operators/openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" Dec 05 12:05:59 crc kubenswrapper[4763]: E1205 12:05:59.661146 4763 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 12:05:59 crc kubenswrapper[4763]: E1205 12:05:59.661203 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-webhook-certs podName:43f191ee-e0a3-4d9e-a63a-c9b7a626806f nodeName:}" failed. No retries permitted until 2025-12-05 12:06:00.661185559 +0000 UTC m=+1045.153900282 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-webhook-certs") pod "openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" (UID: "43f191ee-e0a3-4d9e-a63a-c9b7a626806f") : secret "webhook-server-cert" not found Dec 05 12:05:59 crc kubenswrapper[4763]: E1205 12:05:59.661514 4763 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 12:05:59 crc kubenswrapper[4763]: E1205 12:05:59.661551 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-metrics-certs podName:43f191ee-e0a3-4d9e-a63a-c9b7a626806f nodeName:}" failed. No retries permitted until 2025-12-05 12:06:00.661541433 +0000 UTC m=+1045.154256156 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-metrics-certs") pod "openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" (UID: "43f191ee-e0a3-4d9e-a63a-c9b7a626806f") : secret "metrics-server-cert" not found Dec 05 12:05:59 crc kubenswrapper[4763]: I1205 12:05:59.762433 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mq9f4" event={"ID":"42f31714-9ede-4c48-b611-028a79374fad","Type":"ContainerStarted","Data":"6bc372fe289dab117f51503752eb87b9f35d7dcaab035ac29d34e752bf96a17c"} Dec 05 12:05:59 crc kubenswrapper[4763]: I1205 12:05:59.763232 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w8w7f" event={"ID":"e97d9ee8-0c07-486a-84f1-dabddb037a8b","Type":"ContainerStarted","Data":"e106d352bd21fdbcbecd1e35ff2f8708c96fdbb1b4b27d5efdea6ba3766ee99f"} Dec 05 12:05:59 crc kubenswrapper[4763]: I1205 12:05:59.960713 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-77g97"] Dec 05 12:06:00 crc kubenswrapper[4763]: I1205 12:06:00.067334 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6-cert\") pod \"infra-operator-controller-manager-57548d458d-p5jgv\" (UID: \"f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-p5jgv" Dec 05 12:06:00 crc kubenswrapper[4763]: E1205 12:06:00.067516 4763 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 12:06:00 crc kubenswrapper[4763]: E1205 12:06:00.067585 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6-cert podName:f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6 nodeName:}" failed. No retries permitted until 2025-12-05 12:06:02.067566489 +0000 UTC m=+1046.560281212 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6-cert") pod "infra-operator-controller-manager-57548d458d-p5jgv" (UID: "f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6") : secret "infra-operator-webhook-server-cert" not found Dec 05 12:06:00 crc kubenswrapper[4763]: I1205 12:06:00.374918 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fdf0ecb-215d-4a02-8053-169fcbfefa50-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs\" (UID: \"3fdf0ecb-215d-4a02-8053-169fcbfefa50\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs" Dec 05 12:06:00 crc kubenswrapper[4763]: E1205 12:06:00.375090 4763 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 12:06:00 crc kubenswrapper[4763]: E1205 12:06:00.375158 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fdf0ecb-215d-4a02-8053-169fcbfefa50-cert podName:3fdf0ecb-215d-4a02-8053-169fcbfefa50 nodeName:}" failed. No retries permitted until 2025-12-05 12:06:02.375139573 +0000 UTC m=+1046.867854296 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3fdf0ecb-215d-4a02-8053-169fcbfefa50-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs" (UID: "3fdf0ecb-215d-4a02-8053-169fcbfefa50") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 12:06:00 crc kubenswrapper[4763]: I1205 12:06:00.437055 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-2bgt8"] Dec 05 12:06:00 crc kubenswrapper[4763]: I1205 12:06:00.498300 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-8gw8h"] Dec 05 12:06:00 crc kubenswrapper[4763]: I1205 12:06:00.523646 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-6zk6f"] Dec 05 12:06:00 crc kubenswrapper[4763]: I1205 12:06:00.557367 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-b9w6n"] Dec 05 12:06:00 crc kubenswrapper[4763]: I1205 12:06:00.627694 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-mch9f"] Dec 05 12:06:00 crc kubenswrapper[4763]: I1205 12:06:00.640149 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gbf44"] Dec 05 12:06:00 crc kubenswrapper[4763]: I1205 12:06:00.659323 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-f92rg"] Dec 05 12:06:00 crc kubenswrapper[4763]: I1205 12:06:00.670307 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-gx4vn"] Dec 05 12:06:00 crc kubenswrapper[4763]: I1205 12:06:00.686152 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-sq9wf"] Dec 05 12:06:00 crc kubenswrapper[4763]: I1205 12:06:00.688526 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-webhook-certs\") pod \"openstack-operator-controller-manager-7d6cc4d8dc-wf9nm\" (UID: \"43f191ee-e0a3-4d9e-a63a-c9b7a626806f\") " pod="openstack-operators/openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" Dec 05 12:06:00 crc kubenswrapper[4763]: I1205 12:06:00.688594 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-metrics-certs\") pod \"openstack-operator-controller-manager-7d6cc4d8dc-wf9nm\" (UID: \"43f191ee-e0a3-4d9e-a63a-c9b7a626806f\") " pod="openstack-operators/openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" Dec 05 12:06:00 crc kubenswrapper[4763]: E1205 12:06:00.688941 4763 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 12:06:00 crc kubenswrapper[4763]: E1205 12:06:00.689015 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-webhook-certs podName:43f191ee-e0a3-4d9e-a63a-c9b7a626806f nodeName:}" failed. No retries permitted until 2025-12-05 12:06:02.688996649 +0000 UTC m=+1047.181711372 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-webhook-certs") pod "openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" (UID: "43f191ee-e0a3-4d9e-a63a-c9b7a626806f") : secret "webhook-server-cert" not found Dec 05 12:06:00 crc kubenswrapper[4763]: E1205 12:06:00.689105 4763 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 12:06:00 crc kubenswrapper[4763]: E1205 12:06:00.689232 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-metrics-certs podName:43f191ee-e0a3-4d9e-a63a-c9b7a626806f nodeName:}" failed. No retries permitted until 2025-12-05 12:06:02.689209514 +0000 UTC m=+1047.181924227 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-metrics-certs") pod "openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" (UID: "43f191ee-e0a3-4d9e-a63a-c9b7a626806f") : secret "metrics-server-cert" not found Dec 05 12:06:00 crc kubenswrapper[4763]: I1205 12:06:00.695093 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vqmk4"] Dec 05 12:06:00 crc kubenswrapper[4763]: I1205 12:06:00.705161 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-gzx9n"] Dec 05 12:06:00 crc kubenswrapper[4763]: I1205 12:06:00.711900 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-mnthh"] Dec 05 12:06:00 crc kubenswrapper[4763]: E1205 12:06:00.712663 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-whb44,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-gzx9n_openstack-operators(a05e3d8d-f58a-44f0-b3c9-e212cdcec438): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 12:06:00 crc kubenswrapper[4763]: E1205 12:06:00.717923 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jwn78,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-ht96c_openstack-operators(0a36f8ad-7e41-4005-a42e-47b9a30af62f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 12:06:00 crc 
kubenswrapper[4763]: E1205 12:06:00.718086 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-whb44,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-gzx9n_openstack-operators(a05e3d8d-f58a-44f0-b3c9-e212cdcec438): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 12:06:00 crc kubenswrapper[4763]: E1205 12:06:00.719852 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q9vv8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-n8fzm_openstack-operators(01d1c35a-adc3-4945-92b5-5921600cb826): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 12:06:00 crc kubenswrapper[4763]: E1205 12:06:00.720263 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fsg8d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-gx4vn_openstack-operators(6520d187-9c6c-4b0e-b0c9-27e23db84f4c): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 12:06:00 crc kubenswrapper[4763]: E1205 12:06:00.720709 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-gzx9n" podUID="a05e3d8d-f58a-44f0-b3c9-e212cdcec438" Dec 05 12:06:00 crc kubenswrapper[4763]: E1205 12:06:00.721580 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n8fzm" podUID="01d1c35a-adc3-4945-92b5-5921600cb826" Dec 05 12:06:00 crc kubenswrapper[4763]: E1205 12:06:00.722097 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.70:5001/openstack-k8s-operators/watcher-operator:d23b8876e1bcf18983498fca8ec9314bc8124a8c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-m8jr9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-66974974bb-mjwrw_openstack-operators(36e19ef2-df0d-43ca-8477-f1cec2182b45): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 12:06:00 crc kubenswrapper[4763]: E1205 12:06:00.722258 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qc88h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-gbf44_openstack-operators(2bed16d5-ec79-4ad7-8984-b965fa568dc6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 12:06:00 crc kubenswrapper[4763]: E1205 12:06:00.726602 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qc88h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-gbf44_openstack-operators(2bed16d5-ec79-4ad7-8984-b965fa568dc6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 12:06:00 crc kubenswrapper[4763]: E1205 12:06:00.726691 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jwn78,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-ht96c_openstack-operators(0a36f8ad-7e41-4005-a42e-47b9a30af62f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 12:06:00 crc kubenswrapper[4763]: E1205 12:06:00.726753 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-m8jr9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-66974974bb-mjwrw_openstack-operators(36e19ef2-df0d-43ca-8477-f1cec2182b45): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 12:06:00 crc kubenswrapper[4763]: E1205 12:06:00.726827 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fsg8d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-gx4vn_openstack-operators(6520d187-9c6c-4b0e-b0c9-27e23db84f4c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 12:06:00 crc kubenswrapper[4763]: E1205 12:06:00.728533 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gx4vn" podUID="6520d187-9c6c-4b0e-b0c9-27e23db84f4c" Dec 05 12:06:00 crc kubenswrapper[4763]: E1205 12:06:00.728599 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" 
pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gbf44" podUID="2bed16d5-ec79-4ad7-8984-b965fa568dc6" Dec 05 12:06:00 crc kubenswrapper[4763]: E1205 12:06:00.728625 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-ht96c" podUID="0a36f8ad-7e41-4005-a42e-47b9a30af62f" Dec 05 12:06:00 crc kubenswrapper[4763]: E1205 12:06:00.728644 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-66974974bb-mjwrw" podUID="36e19ef2-df0d-43ca-8477-f1cec2182b45" Dec 05 12:06:00 crc kubenswrapper[4763]: I1205 12:06:00.734771 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-66974974bb-mjwrw"] Dec 05 12:06:00 crc kubenswrapper[4763]: I1205 12:06:00.743819 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-phnl7"] Dec 05 12:06:00 crc kubenswrapper[4763]: I1205 12:06:00.747000 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-ht96c"] Dec 05 12:06:00 crc kubenswrapper[4763]: I1205 12:06:00.769538 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-xkl6w"] Dec 05 12:06:00 crc kubenswrapper[4763]: I1205 12:06:00.784554 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n8fzm"] Dec 05 12:06:00 crc kubenswrapper[4763]: I1205 12:06:00.789645 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-b9w6n" event={"ID":"cd42325e-d26d-4cb6-b8dd-f75dc86e7568","Type":"ContainerStarted","Data":"7a13c5bf877ffef05ee9c5d0d80ae71f0c80d4b4d574ffcb303566e21f9ecdef"} Dec 05 12:06:00 crc kubenswrapper[4763]: I1205 12:06:00.807872 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-ht96c" event={"ID":"0a36f8ad-7e41-4005-a42e-47b9a30af62f","Type":"ContainerStarted","Data":"52e3afb348c77f88575132e3173b43ec9cefec67465ca024278c34b1db8c2bc6"} Dec 05 12:06:00 crc kubenswrapper[4763]: I1205 12:06:00.809515 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-77g97" event={"ID":"94ae68ae-93ae-43a6-89fa-5b2301808793","Type":"ContainerStarted","Data":"44dec782890d244ffafa7f0ecacd8a74bbc6588d0cad773355be5c896b6c8ab5"} Dec 05 12:06:00 crc kubenswrapper[4763]: E1205 12:06:00.810292 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" 
pod="openstack-operators/test-operator-controller-manager-5854674fcc-ht96c" podUID="0a36f8ad-7e41-4005-a42e-47b9a30af62f" Dec 05 12:06:00 crc kubenswrapper[4763]: I1205 12:06:00.815479 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-phnl7" event={"ID":"2024cb36-8175-4993-bd5b-a57a8fb8416c","Type":"ContainerStarted","Data":"3f0698768ae5847c4f3afb816b77412c80d7271e67f885df069c909cb0cd1997"} Dec 05 12:06:00 crc kubenswrapper[4763]: I1205 12:06:00.816444 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-gzx9n" event={"ID":"a05e3d8d-f58a-44f0-b3c9-e212cdcec438","Type":"ContainerStarted","Data":"32a1644787799603df4500f66507e7844809369ffaa3df67c49812356a632b21"} Dec 05 12:06:00 crc kubenswrapper[4763]: E1205 12:06:00.818109 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-gzx9n" podUID="a05e3d8d-f58a-44f0-b3c9-e212cdcec438" Dec 05 12:06:00 crc kubenswrapper[4763]: I1205 12:06:00.818469 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mch9f" event={"ID":"63e1e64f-8414-4da8-8a32-5f0a0041c5ff","Type":"ContainerStarted","Data":"2cf896775ad380853e3a3b6c94dbe59724b046e60c7bf2a8980bcc94fdc59a3f"} Dec 05 12:06:00 crc kubenswrapper[4763]: I1205 12:06:00.819847 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vqmk4" event={"ID":"4faed118-8b9d-4adb-8f86-6a6be8061bce","Type":"ContainerStarted","Data":"a5adf3f40e3a03392a4fb165da830b59797ee3f3073eeebf1476165d1c71ef75"} Dec 05 12:06:00 crc kubenswrapper[4763]: I1205 12:06:00.823817 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-xkl6w" event={"ID":"b64b19c9-3601-4790-addf-c9a32f6c29fe","Type":"ContainerStarted","Data":"5a86258f54f27920d9c1a9b7e94380298c7a5453b05b82cdd8bd62ed6c9edeb3"} Dec 05 12:06:00 crc kubenswrapper[4763]: I1205 12:06:00.826090 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-66974974bb-mjwrw" event={"ID":"36e19ef2-df0d-43ca-8477-f1cec2182b45","Type":"ContainerStarted","Data":"84fa096281f51776eb2db57cf48695da156fff23d6b73a6f79845bd167291775"} Dec 05 12:06:00 crc kubenswrapper[4763]: I1205 12:06:00.828206 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-sq9wf" event={"ID":"eb3c8b38-a863-42d0-b7d8-03231971e4ce","Type":"ContainerStarted","Data":"e3404e4f0ae55f3491614b902c11385ed844d4cf497c1e5e1ef3ab0fef443a7b"} Dec 05 12:06:00 crc kubenswrapper[4763]: I1205 12:06:00.829904 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gx4vn" 
event={"ID":"6520d187-9c6c-4b0e-b0c9-27e23db84f4c","Type":"ContainerStarted","Data":"fac658f4ff9ae1187ed12a4895a440e60cfa4425519131b765f7e744ce462b8f"} Dec 05 12:06:00 crc kubenswrapper[4763]: E1205 12:06:00.831970 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.70:5001/openstack-k8s-operators/watcher-operator:d23b8876e1bcf18983498fca8ec9314bc8124a8c\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-66974974bb-mjwrw" podUID="36e19ef2-df0d-43ca-8477-f1cec2182b45" Dec 05 12:06:00 crc kubenswrapper[4763]: E1205 12:06:00.834589 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gx4vn" podUID="6520d187-9c6c-4b0e-b0c9-27e23db84f4c" Dec 05 12:06:00 crc kubenswrapper[4763]: I1205 12:06:00.835749 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-8gw8h" event={"ID":"f9a5212c-2ddb-4e82-818e-5102fb3c5ee2","Type":"ContainerStarted","Data":"a0279d759afdbf0c3e4cce9aa807a62b204748f29c513120a14e3638f54a6b3e"} Dec 05 12:06:00 crc kubenswrapper[4763]: I1205 12:06:00.837523 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n8fzm" event={"ID":"01d1c35a-adc3-4945-92b5-5921600cb826","Type":"ContainerStarted","Data":"997e7fe8b593fab46551cedf39204b301a465a13a4dd1b12b54ce97458da046e"} Dec 05 12:06:00 crc kubenswrapper[4763]: E1205 12:06:00.838991 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n8fzm" podUID="01d1c35a-adc3-4945-92b5-5921600cb826" Dec 05 12:06:00 crc kubenswrapper[4763]: I1205 12:06:00.840070 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-f92rg" event={"ID":"d7dd9586-7cc5-42f0-87a8-3a8c54557b21","Type":"ContainerStarted","Data":"5ed9073f33a68e0f2dfc37cd9af0643dec2779f683b6ec6b6cccb21f3a1846f6"} Dec 05 12:06:00 crc kubenswrapper[4763]: I1205 12:06:00.841610 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gbf44" event={"ID":"2bed16d5-ec79-4ad7-8984-b965fa568dc6","Type":"ContainerStarted","Data":"6701f1b20b40b3ba820ef7d58949d3b4c0a7c79d6e3a5e5c10377b2016020935"} Dec 05 12:06:00 crc kubenswrapper[4763]: E1205 12:06:00.845587 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gbf44" podUID="2bed16d5-ec79-4ad7-8984-b965fa568dc6" Dec 05 12:06:00 crc kubenswrapper[4763]: I1205 12:06:00.845744 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-mnthh" event={"ID":"378cb9d9-8010-4dcf-9297-5e4f0679086e","Type":"ContainerStarted","Data":"3f67088a6bd7ea797583d8d08296943a1ec7c187f9cf9bd540bfb91215fa544b"} Dec 05 12:06:00 crc kubenswrapper[4763]: I1205 12:06:00.851278 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-6zk6f" event={"ID":"fcc46489-05d5-4219-9e45-6ca25f25900f","Type":"ContainerStarted","Data":"5a209221be9a43b6264b4e08c048037f9984682c8dfea7ce069dd359ff5467b2"} Dec 05 12:06:00 crc kubenswrapper[4763]: I1205 12:06:00.855577 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-2bgt8" event={"ID":"ed1b8d49-d742-4493-bb7e-856b4108fb88","Type":"ContainerStarted","Data":"7f0b69786977c4d2f6226fe5c9cd974153737ba5002665e499860883d6c6c104"} Dec 05 12:06:01 crc kubenswrapper[4763]: E1205 12:06:01.866429 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gbf44" podUID="2bed16d5-ec79-4ad7-8984-b965fa568dc6" Dec 05 12:06:01 crc kubenswrapper[4763]: E1205 12:06:01.866820 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.70:5001/openstack-k8s-operators/watcher-operator:d23b8876e1bcf18983498fca8ec9314bc8124a8c\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-66974974bb-mjwrw" podUID="36e19ef2-df0d-43ca-8477-f1cec2182b45" Dec 05 12:06:01 crc kubenswrapper[4763]: E1205 12:06:01.867249 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n8fzm" podUID="01d1c35a-adc3-4945-92b5-5921600cb826" Dec 05 12:06:01 crc kubenswrapper[4763]: E1205 12:06:01.867426 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-gzx9n" podUID="a05e3d8d-f58a-44f0-b3c9-e212cdcec438" Dec 05 12:06:01 crc kubenswrapper[4763]: E1205 12:06:01.867735 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-ht96c" podUID="0a36f8ad-7e41-4005-a42e-47b9a30af62f" Dec 05 12:06:01 crc kubenswrapper[4763]: E1205 12:06:01.868708 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gx4vn" podUID="6520d187-9c6c-4b0e-b0c9-27e23db84f4c" Dec 05 12:06:02 crc kubenswrapper[4763]: I1205 12:06:02.129915 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6-cert\") pod \"infra-operator-controller-manager-57548d458d-p5jgv\" (UID: \"f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-p5jgv" Dec 05 12:06:02 crc kubenswrapper[4763]: E1205 12:06:02.130130 4763 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 12:06:02 crc kubenswrapper[4763]: E1205 12:06:02.130213 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6-cert podName:f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6 nodeName:}" failed. No retries permitted until 2025-12-05 12:06:06.130190292 +0000 UTC m=+1050.622905015 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6-cert") pod "infra-operator-controller-manager-57548d458d-p5jgv" (UID: "f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6") : secret "infra-operator-webhook-server-cert" not found Dec 05 12:06:02 crc kubenswrapper[4763]: I1205 12:06:02.434254 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fdf0ecb-215d-4a02-8053-169fcbfefa50-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs\" (UID: \"3fdf0ecb-215d-4a02-8053-169fcbfefa50\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs" Dec 05 12:06:02 crc kubenswrapper[4763]: E1205 12:06:02.434405 4763 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 12:06:02 crc kubenswrapper[4763]: E1205 12:06:02.434455 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fdf0ecb-215d-4a02-8053-169fcbfefa50-cert podName:3fdf0ecb-215d-4a02-8053-169fcbfefa50 nodeName:}" failed. No retries permitted until 2025-12-05 12:06:06.434438187 +0000 UTC m=+1050.927152900 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3fdf0ecb-215d-4a02-8053-169fcbfefa50-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs" (UID: "3fdf0ecb-215d-4a02-8053-169fcbfefa50") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 12:06:02 crc kubenswrapper[4763]: I1205 12:06:02.741681 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-metrics-certs\") pod \"openstack-operator-controller-manager-7d6cc4d8dc-wf9nm\" (UID: \"43f191ee-e0a3-4d9e-a63a-c9b7a626806f\") " pod="openstack-operators/openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" Dec 05 12:06:02 crc kubenswrapper[4763]: I1205 12:06:02.741743 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-webhook-certs\") pod \"openstack-operator-controller-manager-7d6cc4d8dc-wf9nm\" (UID: \"43f191ee-e0a3-4d9e-a63a-c9b7a626806f\") " pod="openstack-operators/openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" Dec 05 12:06:02 crc kubenswrapper[4763]: E1205 12:06:02.741857 4763 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 12:06:02 crc kubenswrapper[4763]: E1205 12:06:02.741902 4763 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 12:06:02 crc kubenswrapper[4763]: E1205 12:06:02.741928 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-metrics-certs podName:43f191ee-e0a3-4d9e-a63a-c9b7a626806f nodeName:}" failed. No retries permitted until 2025-12-05 12:06:06.741911524 +0000 UTC m=+1051.234626247 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-metrics-certs") pod "openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" (UID: "43f191ee-e0a3-4d9e-a63a-c9b7a626806f") : secret "metrics-server-cert" not found Dec 05 12:06:02 crc kubenswrapper[4763]: E1205 12:06:02.741961 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-webhook-certs podName:43f191ee-e0a3-4d9e-a63a-c9b7a626806f nodeName:}" failed. No retries permitted until 2025-12-05 12:06:06.741943046 +0000 UTC m=+1051.234657849 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-webhook-certs") pod "openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" (UID: "43f191ee-e0a3-4d9e-a63a-c9b7a626806f") : secret "webhook-server-cert" not found Dec 05 12:06:06 crc kubenswrapper[4763]: I1205 12:06:06.228090 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6-cert\") pod \"infra-operator-controller-manager-57548d458d-p5jgv\" (UID: \"f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-p5jgv" Dec 05 12:06:06 crc kubenswrapper[4763]: E1205 12:06:06.228247 4763 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 12:06:06 crc kubenswrapper[4763]: E1205 12:06:06.228639 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6-cert podName:f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6 nodeName:}" failed. No retries permitted until 2025-12-05 12:06:14.228618031 +0000 UTC m=+1058.721332754 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6-cert") pod "infra-operator-controller-manager-57548d458d-p5jgv" (UID: "f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6") : secret "infra-operator-webhook-server-cert" not found Dec 05 12:06:06 crc kubenswrapper[4763]: I1205 12:06:06.532335 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fdf0ecb-215d-4a02-8053-169fcbfefa50-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs\" (UID: \"3fdf0ecb-215d-4a02-8053-169fcbfefa50\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs" Dec 05 12:06:06 crc kubenswrapper[4763]: E1205 12:06:06.532567 4763 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 12:06:06 crc kubenswrapper[4763]: E1205 12:06:06.532665 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fdf0ecb-215d-4a02-8053-169fcbfefa50-cert podName:3fdf0ecb-215d-4a02-8053-169fcbfefa50 nodeName:}" failed. No retries permitted until 2025-12-05 12:06:14.532643722 +0000 UTC m=+1059.025358445 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3fdf0ecb-215d-4a02-8053-169fcbfefa50-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs" (UID: "3fdf0ecb-215d-4a02-8053-169fcbfefa50") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 12:06:06 crc kubenswrapper[4763]: I1205 12:06:06.836070 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-metrics-certs\") pod \"openstack-operator-controller-manager-7d6cc4d8dc-wf9nm\" (UID: \"43f191ee-e0a3-4d9e-a63a-c9b7a626806f\") " pod="openstack-operators/openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" Dec 05 12:06:06 crc kubenswrapper[4763]: I1205 12:06:06.836128 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-webhook-certs\") pod \"openstack-operator-controller-manager-7d6cc4d8dc-wf9nm\" (UID: \"43f191ee-e0a3-4d9e-a63a-c9b7a626806f\") " pod="openstack-operators/openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" Dec 05 12:06:06 crc kubenswrapper[4763]: E1205 12:06:06.836324 4763 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 12:06:06 crc kubenswrapper[4763]: E1205 12:06:06.836369 4763 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 12:06:06 crc kubenswrapper[4763]: E1205 12:06:06.836431 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-metrics-certs podName:43f191ee-e0a3-4d9e-a63a-c9b7a626806f nodeName:}" failed. No retries permitted until 2025-12-05 12:06:14.836410444 +0000 UTC m=+1059.329125157 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-metrics-certs") pod "openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" (UID: "43f191ee-e0a3-4d9e-a63a-c9b7a626806f") : secret "metrics-server-cert" not found Dec 05 12:06:06 crc kubenswrapper[4763]: E1205 12:06:06.836467 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-webhook-certs podName:43f191ee-e0a3-4d9e-a63a-c9b7a626806f nodeName:}" failed. No retries permitted until 2025-12-05 12:06:14.836441996 +0000 UTC m=+1059.329156719 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-webhook-certs") pod "openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" (UID: "43f191ee-e0a3-4d9e-a63a-c9b7a626806f") : secret "webhook-server-cert" not found Dec 05 12:06:14 crc kubenswrapper[4763]: I1205 12:06:14.323603 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6-cert\") pod \"infra-operator-controller-manager-57548d458d-p5jgv\" (UID: \"f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-p5jgv" Dec 05 12:06:14 crc kubenswrapper[4763]: E1205 12:06:14.324069 4763 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 12:06:14 crc kubenswrapper[4763]: E1205 12:06:14.324447 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6-cert podName:f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6 nodeName:}" failed. No retries permitted until 2025-12-05 12:06:30.324424003 +0000 UTC m=+1074.817138726 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6-cert") pod "infra-operator-controller-manager-57548d458d-p5jgv" (UID: "f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6") : secret "infra-operator-webhook-server-cert" not found Dec 05 12:06:14 crc kubenswrapper[4763]: E1205 12:06:14.432334 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557" Dec 05 12:06:14 crc kubenswrapper[4763]: E1205 12:06:14.432570 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vqfmw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-vqmk4_openstack-operators(4faed118-8b9d-4adb-8f86-6a6be8061bce): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 12:06:14 crc kubenswrapper[4763]: I1205 12:06:14.628106 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fdf0ecb-215d-4a02-8053-169fcbfefa50-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs\" (UID: \"3fdf0ecb-215d-4a02-8053-169fcbfefa50\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs" Dec 05 12:06:14 crc kubenswrapper[4763]: E1205 12:06:14.628309 4763 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 12:06:14 crc kubenswrapper[4763]: E1205 12:06:14.628353 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fdf0ecb-215d-4a02-8053-169fcbfefa50-cert podName:3fdf0ecb-215d-4a02-8053-169fcbfefa50 nodeName:}" failed. No retries permitted until 2025-12-05 12:06:30.628341275 +0000 UTC m=+1075.121055998 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3fdf0ecb-215d-4a02-8053-169fcbfefa50-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs" (UID: "3fdf0ecb-215d-4a02-8053-169fcbfefa50") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 12:06:14 crc kubenswrapper[4763]: I1205 12:06:14.931729 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-metrics-certs\") pod \"openstack-operator-controller-manager-7d6cc4d8dc-wf9nm\" (UID: \"43f191ee-e0a3-4d9e-a63a-c9b7a626806f\") " pod="openstack-operators/openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" Dec 05 12:06:14 crc kubenswrapper[4763]: I1205 12:06:14.931802 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-webhook-certs\") pod \"openstack-operator-controller-manager-7d6cc4d8dc-wf9nm\" (UID: \"43f191ee-e0a3-4d9e-a63a-c9b7a626806f\") " pod="openstack-operators/openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" Dec 05 12:06:14 crc kubenswrapper[4763]: E1205 12:06:14.931904 4763 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 12:06:14 crc kubenswrapper[4763]: E1205 12:06:14.931976 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-metrics-certs podName:43f191ee-e0a3-4d9e-a63a-c9b7a626806f nodeName:}" failed. No retries permitted until 2025-12-05 12:06:30.931956668 +0000 UTC m=+1075.424671391 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-metrics-certs") pod "openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" (UID: "43f191ee-e0a3-4d9e-a63a-c9b7a626806f") : secret "metrics-server-cert" not found Dec 05 12:06:14 crc kubenswrapper[4763]: E1205 12:06:14.932009 4763 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 12:06:14 crc kubenswrapper[4763]: E1205 12:06:14.932044 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-webhook-certs podName:43f191ee-e0a3-4d9e-a63a-c9b7a626806f nodeName:}" failed. No retries permitted until 2025-12-05 12:06:30.932033593 +0000 UTC m=+1075.424748336 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-webhook-certs") pod "openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" (UID: "43f191ee-e0a3-4d9e-a63a-c9b7a626806f") : secret "webhook-server-cert" not found Dec 05 12:06:16 crc kubenswrapper[4763]: E1205 12:06:16.814392 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d" Dec 05 12:06:16 crc kubenswrapper[4763]: E1205 12:06:16.815115 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cdqgf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-phnl7_openstack-operators(2024cb36-8175-4993-bd5b-a57a8fb8416c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 12:06:17 crc kubenswrapper[4763]: E1205 12:06:17.463427 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7" Dec 05 12:06:17 crc 
kubenswrapper[4763]: E1205 12:06:17.463627 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g2nz4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-56bbcc9d85-b9w6n_openstack-operators(cd42325e-d26d-4cb6-b8dd-f75dc86e7568): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 12:06:18 crc kubenswrapper[4763]: E1205 12:06:18.507046 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59" Dec 05 12:06:18 crc kubenswrapper[4763]: E1205 12:06:18.507288 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-scncw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-mch9f_openstack-operators(63e1e64f-8414-4da8-8a32-5f0a0041c5ff): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 12:06:19 crc kubenswrapper[4763]: E1205 12:06:19.451384 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:f6059a0fbf031d34dcf086d14ce8c0546caeaee23c5780e90b5037c5feee9fea" Dec 05 12:06:19 crc kubenswrapper[4763]: E1205 12:06:19.451952 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:f6059a0fbf031d34dcf086d14ce8c0546caeaee23c5780e90b5037c5feee9fea,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9gtjx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-7d9dfd778-f92rg_openstack-operators(d7dd9586-7cc5-42f0-87a8-3a8c54557b21): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 12:06:30 crc kubenswrapper[4763]: I1205 12:06:30.394951 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6-cert\") pod \"infra-operator-controller-manager-57548d458d-p5jgv\" (UID: \"f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-p5jgv" Dec 05 12:06:30 crc kubenswrapper[4763]: I1205 12:06:30.405863 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6-cert\") pod \"infra-operator-controller-manager-57548d458d-p5jgv\" (UID: \"f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-p5jgv" Dec 05 12:06:30 crc kubenswrapper[4763]: I1205 12:06:30.450399 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-p5jgv" Dec 05 12:06:30 crc kubenswrapper[4763]: E1205 12:06:30.474087 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 05 12:06:30 crc kubenswrapper[4763]: E1205 12:06:30.474270 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-78jv5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-mnthh_openstack-operators(378cb9d9-8010-4dcf-9297-5e4f0679086e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 12:06:30 crc kubenswrapper[4763]: E1205 12:06:30.513306 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = reading blob sha256:213dc4f279774ee434b78422f3f9f882882201ed140a0c64ced8dbdb61c0b0af: Get \"http://38.102.83.70:5001/v2/openstack-k8s-operators/watcher-operator/blobs/sha256:213dc4f279774ee434b78422f3f9f882882201ed140a0c64ced8dbdb61c0b0af\": context canceled" image="38.102.83.70:5001/openstack-k8s-operators/watcher-operator:d23b8876e1bcf18983498fca8ec9314bc8124a8c" Dec 05 12:06:30 crc 
kubenswrapper[4763]: E1205 12:06:30.513658 4763 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = reading blob sha256:213dc4f279774ee434b78422f3f9f882882201ed140a0c64ced8dbdb61c0b0af: Get \"http://38.102.83.70:5001/v2/openstack-k8s-operators/watcher-operator/blobs/sha256:213dc4f279774ee434b78422f3f9f882882201ed140a0c64ced8dbdb61c0b0af\": context canceled" image="38.102.83.70:5001/openstack-k8s-operators/watcher-operator:d23b8876e1bcf18983498fca8ec9314bc8124a8c" Dec 05 12:06:30 crc kubenswrapper[4763]: E1205 12:06:30.513849 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.70:5001/openstack-k8s-operators/watcher-operator:d23b8876e1bcf18983498fca8ec9314bc8124a8c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-m8jr9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-66974974bb-mjwrw_openstack-operators(36e19ef2-df0d-43ca-8477-f1cec2182b45): ErrImagePull: rpc error: code = Canceled desc = reading blob sha256:213dc4f279774ee434b78422f3f9f882882201ed140a0c64ced8dbdb61c0b0af: Get \"http://38.102.83.70:5001/v2/openstack-k8s-operators/watcher-operator/blobs/sha256:213dc4f279774ee434b78422f3f9f882882201ed140a0c64ced8dbdb61c0b0af\": context canceled" logger="UnhandledError" Dec 05 12:06:30 crc kubenswrapper[4763]: I1205 12:06:30.700032 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fdf0ecb-215d-4a02-8053-169fcbfefa50-cert\") pod 
\"openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs\" (UID: \"3fdf0ecb-215d-4a02-8053-169fcbfefa50\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs" Dec 05 12:06:30 crc kubenswrapper[4763]: I1205 12:06:30.706359 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fdf0ecb-215d-4a02-8053-169fcbfefa50-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs\" (UID: \"3fdf0ecb-215d-4a02-8053-169fcbfefa50\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs" Dec 05 12:06:30 crc kubenswrapper[4763]: I1205 12:06:30.825368 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs" Dec 05 12:06:31 crc kubenswrapper[4763]: I1205 12:06:31.005065 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-metrics-certs\") pod \"openstack-operator-controller-manager-7d6cc4d8dc-wf9nm\" (UID: \"43f191ee-e0a3-4d9e-a63a-c9b7a626806f\") " pod="openstack-operators/openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" Dec 05 12:06:31 crc kubenswrapper[4763]: I1205 12:06:31.005116 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-webhook-certs\") pod \"openstack-operator-controller-manager-7d6cc4d8dc-wf9nm\" (UID: \"43f191ee-e0a3-4d9e-a63a-c9b7a626806f\") " pod="openstack-operators/openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" Dec 05 12:06:31 crc kubenswrapper[4763]: I1205 12:06:31.012496 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-webhook-certs\") pod \"openstack-operator-controller-manager-7d6cc4d8dc-wf9nm\" (UID: \"43f191ee-e0a3-4d9e-a63a-c9b7a626806f\") " pod="openstack-operators/openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" Dec 05 12:06:31 crc kubenswrapper[4763]: I1205 12:06:31.015421 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/43f191ee-e0a3-4d9e-a63a-c9b7a626806f-metrics-certs\") pod \"openstack-operator-controller-manager-7d6cc4d8dc-wf9nm\" (UID: \"43f191ee-e0a3-4d9e-a63a-c9b7a626806f\") " pod="openstack-operators/openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" Dec 05 12:06:31 crc kubenswrapper[4763]: I1205 12:06:31.096510 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" Dec 05 12:06:31 crc kubenswrapper[4763]: E1205 12:06:31.866714 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5" Dec 05 12:06:31 crc kubenswrapper[4763]: E1205 12:06:31.867036 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qc88h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-gbf44_openstack-operators(2bed16d5-ec79-4ad7-8984-b965fa568dc6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 12:06:32 crc kubenswrapper[4763]: E1205 12:06:32.395461 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7" Dec 05 12:06:32 crc kubenswrapper[4763]: E1205 12:06:32.396059 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fsg8d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-gx4vn_openstack-operators(6520d187-9c6c-4b0e-b0c9-27e23db84f4c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 12:06:32 crc kubenswrapper[4763]: E1205 12:06:32.940427 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94" Dec 05 12:06:32 crc kubenswrapper[4763]: E1205 12:06:32.940587 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 
-3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jwn78,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-ht96c_openstack-operators(0a36f8ad-7e41-4005-a42e-47b9a30af62f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 12:06:33 crc kubenswrapper[4763]: E1205 12:06:33.703800 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f" Dec 05 12:06:33 crc kubenswrapper[4763]: E1205 12:06:33.703995 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-whb44,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-gzx9n_openstack-operators(a05e3d8d-f58a-44f0-b3c9-e212cdcec438): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 12:06:33 crc kubenswrapper[4763]: E1205 12:06:33.715609 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 05 12:06:33 crc kubenswrapper[4763]: E1205 12:06:33.715861 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vqfmw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-vqmk4_openstack-operators(4faed118-8b9d-4adb-8f86-6a6be8061bce): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 05 12:06:33 crc kubenswrapper[4763]: E1205 12:06:33.717145 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc 
error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"]" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vqmk4" podUID="4faed118-8b9d-4adb-8f86-6a6be8061bce" Dec 05 12:06:34 crc kubenswrapper[4763]: E1205 12:06:34.455918 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 05 12:06:34 crc kubenswrapper[4763]: E1205 12:06:34.456027 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-scncw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-mch9f_openstack-operators(63e1e64f-8414-4da8-8a32-5f0a0041c5ff): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 05 12:06:34 crc kubenswrapper[4763]: E1205 12:06:34.457315 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mch9f" podUID="63e1e64f-8414-4da8-8a32-5f0a0041c5ff" Dec 05 12:06:34 crc kubenswrapper[4763]: E1205 12:06:34.476318 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 05 12:06:34 crc kubenswrapper[4763]: E1205 12:06:34.476467 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q9vv8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-n8fzm_openstack-operators(01d1c35a-adc3-4945-92b5-5921600cb826): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 12:06:34 crc kubenswrapper[4763]: E1205 12:06:34.478485 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n8fzm" podUID="01d1c35a-adc3-4945-92b5-5921600cb826" Dec 05 12:06:34 crc kubenswrapper[4763]: I1205 12:06:34.809560 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-p5jgv"] Dec 05 12:06:34 crc kubenswrapper[4763]: W1205 12:06:34.884797 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1fb13e0_bcd9_4cfe_be8a_a33f0e452fb6.slice/crio-b3ad9476ff58da675c99c4a200c4e6246c12b3486d238350bb0a60b22b79bfa5 WatchSource:0}: Error finding container b3ad9476ff58da675c99c4a200c4e6246c12b3486d238350bb0a60b22b79bfa5: Status 404 returned error can't find the container with id b3ad9476ff58da675c99c4a200c4e6246c12b3486d238350bb0a60b22b79bfa5 Dec 05 12:06:35 crc kubenswrapper[4763]: I1205 12:06:35.141620 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7d6cc4d8dc-wf9nm"] Dec 05 12:06:35 crc kubenswrapper[4763]: I1205 12:06:35.149028 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
Dec 05 12:06:35 crc kubenswrapper[4763]: I1205 12:06:35.149028 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs"]
Dec 05 12:06:35 crc kubenswrapper[4763]: I1205 12:06:35.208330 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w8w7f" event={"ID":"e97d9ee8-0c07-486a-84f1-dabddb037a8b","Type":"ContainerStarted","Data":"944ed6751a7407415eef4c8552c09686c46facac408dc3de1b5432570d2aa4b6"}
Dec 05 12:06:35 crc kubenswrapper[4763]: I1205 12:06:35.211446 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-xkl6w" event={"ID":"b64b19c9-3601-4790-addf-c9a32f6c29fe","Type":"ContainerStarted","Data":"61b6e4579d574a04a750f4013995dee85208fb5fb2e58c61ab6111a07789e053"}
Dec 05 12:06:35 crc kubenswrapper[4763]: I1205 12:06:35.217014 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-6zk6f" event={"ID":"fcc46489-05d5-4219-9e45-6ca25f25900f","Type":"ContainerStarted","Data":"181b15cdb1c28aa6834a9789f1787073a5f0114e4b71f3333e1bad4f657c5486"}
Dec 05 12:06:35 crc kubenswrapper[4763]: I1205 12:06:35.220160 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-77g97" event={"ID":"94ae68ae-93ae-43a6-89fa-5b2301808793","Type":"ContainerStarted","Data":"68d7b01a15b8762424d4bf773a60174a7299ee86b12d7324bb13cdfe0370e2aa"}
Dec 05 12:06:35 crc kubenswrapper[4763]: I1205 12:06:35.222203 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mq9f4" event={"ID":"42f31714-9ede-4c48-b611-028a79374fad","Type":"ContainerStarted","Data":"6210f5fc5cbdd56fb9228fdef10d5d38ed1718d0a15f8dd90f0a4513fe81dd40"}
Dec 05 12:06:35 crc kubenswrapper[4763]: I1205 12:06:35.224544 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-p5jgv" event={"ID":"f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6","Type":"ContainerStarted","Data":"b3ad9476ff58da675c99c4a200c4e6246c12b3486d238350bb0a60b22b79bfa5"}
Dec 05 12:06:35 crc kubenswrapper[4763]: E1205 12:06:35.288469 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0"
Dec 05 12:06:35 crc kubenswrapper[4763]: E1205 12:06:35.288640 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cdqgf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-phnl7_openstack-operators(2024cb36-8175-4993-bd5b-a57a8fb8416c): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError"
Dec 05 12:06:35 crc kubenswrapper[4763]: E1205 12:06:35.294290 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-phnl7" podUID="2024cb36-8175-4993-bd5b-a57a8fb8416c"
Dec 05 12:06:35 crc kubenswrapper[4763]: W1205 12:06:35.547913 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43f191ee_e0a3_4d9e_a63a_c9b7a626806f.slice/crio-e9e602f16446cfb6cd95dffd4ed0ac2bf6c9f688924d3e51b19ba1580adfa811 WatchSource:0}: Error finding container e9e602f16446cfb6cd95dffd4ed0ac2bf6c9f688924d3e51b19ba1580adfa811: Status 404 returned error can't find the container with id e9e602f16446cfb6cd95dffd4ed0ac2bf6c9f688924d3e51b19ba1580adfa811
Dec 05 12:06:36 crc kubenswrapper[4763]: I1205 12:06:36.231727 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-2bgt8" event={"ID":"ed1b8d49-d742-4493-bb7e-856b4108fb88","Type":"ContainerStarted","Data":"c5fefd0afb601f2830bd3e40b1b716a777d75efaf7d4e3dd1b1d40371aa18e7b"}
Dec 05 12:06:36 crc kubenswrapper[4763]: I1205 12:06:36.233913 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" event={"ID":"43f191ee-e0a3-4d9e-a63a-c9b7a626806f","Type":"ContainerStarted","Data":"e9e602f16446cfb6cd95dffd4ed0ac2bf6c9f688924d3e51b19ba1580adfa811"}
Dec 05 12:06:36 crc kubenswrapper[4763]: I1205 12:06:36.243636 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs" event={"ID":"3fdf0ecb-215d-4a02-8053-169fcbfefa50","Type":"ContainerStarted","Data":"901718803b52474375201855b755ce6d50eebb823c1a6f8bb102509e9a2a7a79"}
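Both pull failures above follow the usual kubelet sequence: the CRI pull is canceled mid-copy, the kubelet records ErrImagePull on the container's waiting state, and subsequent sync attempts are throttled as ImagePullBackOff (visible a few entries below). A short client-go sketch that lists containers stuck in either state; the namespace matches this log, while the kubeconfig location is an assumption for illustration:

```go
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumes a kubeconfig at the default location (~/.kube/config).
	config, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	clientset, err := kubernetes.NewForConfig(config)
	if err != nil {
		panic(err)
	}

	// "openstack-operators" is the namespace seen throughout this log.
	pods, err := clientset.CoreV1().Pods("openstack-operators").List(context.TODO(), metav1.ListOptions{})
	if err != nil {
		panic(err)
	}
	for _, pod := range pods.Items {
		for _, cs := range pod.Status.ContainerStatuses {
			// ErrImagePull / ImagePullBackOff surface as container waiting reasons.
			if w := cs.State.Waiting; w != nil && (w.Reason == "ErrImagePull" || w.Reason == "ImagePullBackOff") {
				fmt.Printf("%s/%s: %s\n", pod.Name, cs.Name, w.Reason)
			}
		}
	}
}
```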
Dec 05 12:06:36 crc kubenswrapper[4763]: I1205 12:06:36.246678 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-sq9wf" event={"ID":"eb3c8b38-a863-42d0-b7d8-03231971e4ce","Type":"ContainerStarted","Data":"a5dea5abf4e033e42e35f00f0e299b791f4039e0d0e1c47725df4e388bf8939a"}
Dec 05 12:06:36 crc kubenswrapper[4763]: I1205 12:06:36.249360 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-8gw8h" event={"ID":"f9a5212c-2ddb-4e82-818e-5102fb3c5ee2","Type":"ContainerStarted","Data":"775c31755f8c89b10365747b5468659d80c3659dbc736451071303f35581ebb7"}
Dec 05 12:06:37 crc kubenswrapper[4763]: I1205 12:06:37.256110 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" event={"ID":"43f191ee-e0a3-4d9e-a63a-c9b7a626806f","Type":"ContainerStarted","Data":"c6f005c855535749603654b175c6554532ee18c182ab006e10c170e24eccbcf7"}
Dec 05 12:06:37 crc kubenswrapper[4763]: I1205 12:06:37.256435 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7d6cc4d8dc-wf9nm"
Dec 05 12:06:37 crc kubenswrapper[4763]: I1205 12:06:37.282654 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" podStartSLOduration=39.282633526 podStartE2EDuration="39.282633526s" podCreationTimestamp="2025-12-05 12:05:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:06:37.278325899 +0000 UTC m=+1081.771040632" watchObservedRunningTime="2025-12-05 12:06:37.282633526 +0000 UTC m=+1081.775348249"
Dec 05 12:06:37 crc kubenswrapper[4763]: E1205 12:06:37.924188 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vqmk4" podUID="4faed118-8b9d-4adb-8f86-6a6be8061bce"
Dec 05 12:06:38 crc kubenswrapper[4763]: I1205 12:06:38.263526 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vqmk4" event={"ID":"4faed118-8b9d-4adb-8f86-6a6be8061bce","Type":"ContainerStarted","Data":"08c520eb29a7c5d4861dc1a05f75f4c377bd54c9066daec99298eceaab30573b"}
Dec 05 12:06:38 crc kubenswrapper[4763]: I1205 12:06:38.264046 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vqmk4"
Dec 05 12:06:38 crc kubenswrapper[4763]: E1205 12:06:38.265092 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vqmk4" podUID="4faed118-8b9d-4adb-8f86-6a6be8061bce"
Dec 05 12:06:39 crc kubenswrapper[4763]: E1205 12:06:39.270779 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vqmk4" podUID="4faed118-8b9d-4adb-8f86-6a6be8061bce"
Dec 05 12:06:41 crc kubenswrapper[4763]: E1205 12:06:41.054035 4763
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-b9w6n" podUID="cd42325e-d26d-4cb6-b8dd-f75dc86e7568" Dec 05 12:06:41 crc kubenswrapper[4763]: I1205 12:06:41.109982 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" Dec 05 12:06:41 crc kubenswrapper[4763]: E1205 12:06:41.127853 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-78f8948974-gzx9n" podUID="a05e3d8d-f58a-44f0-b3c9-e212cdcec438" Dec 05 12:06:41 crc kubenswrapper[4763]: E1205 12:06:41.153812 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gbf44" podUID="2bed16d5-ec79-4ad7-8984-b965fa568dc6" Dec 05 12:06:41 crc kubenswrapper[4763]: E1205 12:06:41.154240 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-mnthh" podUID="378cb9d9-8010-4dcf-9297-5e4f0679086e" Dec 05 12:06:41 crc kubenswrapper[4763]: E1205 12:06:41.184174 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = reading blob sha256:213dc4f279774ee434b78422f3f9f882882201ed140a0c64ced8dbdb61c0b0af: Get \\\"http://38.102.83.70:5001/v2/openstack-k8s-operators/watcher-operator/blobs/sha256:213dc4f279774ee434b78422f3f9f882882201ed140a0c64ced8dbdb61c0b0af\\\": context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-66974974bb-mjwrw" podUID="36e19ef2-df0d-43ca-8477-f1cec2182b45" Dec 05 12:06:41 crc kubenswrapper[4763]: E1205 12:06:41.281567 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-5854674fcc-ht96c" podUID="0a36f8ad-7e41-4005-a42e-47b9a30af62f" Dec 05 12:06:41 crc kubenswrapper[4763]: I1205 12:06:41.300953 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-b9w6n" event={"ID":"cd42325e-d26d-4cb6-b8dd-f75dc86e7568","Type":"ContainerStarted","Data":"5dbbc9d7387a660feec608a65ec1530eb514447fd4296acea3ffc81908800f56"} Dec 05 12:06:41 crc kubenswrapper[4763]: I1205 12:06:41.317058 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-p5jgv" event={"ID":"f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6","Type":"ContainerStarted","Data":"10d12758b67691153a165168101912177b840ff1c52a59129920158193e466c3"} Dec 05 12:06:41 crc kubenswrapper[4763]: I1205 12:06:41.337288 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-xkl6w" event={"ID":"b64b19c9-3601-4790-addf-c9a32f6c29fe","Type":"ContainerStarted","Data":"0d2f3c75a8fe350ba6af1191449d210e90daa6b87e51adbfa661a9f9da38054b"} Dec 05 12:06:41 crc kubenswrapper[4763]: I1205 12:06:41.338439 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-xkl6w" Dec 05 12:06:41 crc kubenswrapper[4763]: E1205 12:06:41.353917 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-f92rg" podUID="d7dd9586-7cc5-42f0-87a8-3a8c54557b21" Dec 05 12:06:41 crc kubenswrapper[4763]: I1205 12:06:41.357070 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-xkl6w" Dec 05 12:06:41 crc kubenswrapper[4763]: I1205 12:06:41.357628 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-77g97" event={"ID":"94ae68ae-93ae-43a6-89fa-5b2301808793","Type":"ContainerStarted","Data":"80cecafac22432abccaa624e452da513a71358e13ea1a0b450d9627d74cd66fe"} Dec 05 12:06:41 crc kubenswrapper[4763]: I1205 12:06:41.357704 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-77g97" Dec 05 12:06:41 crc kubenswrapper[4763]: E1205 12:06:41.369324 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gx4vn" podUID="6520d187-9c6c-4b0e-b0c9-27e23db84f4c" Dec 05 12:06:41 crc kubenswrapper[4763]: I1205 12:06:41.373378 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-phnl7" event={"ID":"2024cb36-8175-4993-bd5b-a57a8fb8416c","Type":"ContainerStarted","Data":"3cee34c2b8638e691b7013561161714cbe4ad6caab03495ecf262d8d9d718f46"} Dec 05 12:06:41 crc kubenswrapper[4763]: I1205 12:06:41.373668 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-77g97" Dec 05 12:06:41 crc kubenswrapper[4763]: I1205 12:06:41.374907 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-xkl6w" podStartSLOduration=3.645390488 podStartE2EDuration="43.374897442s" podCreationTimestamp="2025-12-05 12:05:58 +0000 UTC" firstStartedPulling="2025-12-05 12:06:00.695227458 +0000 UTC m=+1045.187942181" lastFinishedPulling="2025-12-05 12:06:40.424734412 +0000 UTC m=+1084.917449135" observedRunningTime="2025-12-05 12:06:41.374058425 +0000 UTC m=+1085.866773158" watchObservedRunningTime="2025-12-05 12:06:41.374897442 +0000 UTC m=+1085.867612165" Dec 05 12:06:41 crc kubenswrapper[4763]: I1205 12:06:41.388989 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-gzx9n" 
event={"ID":"a05e3d8d-f58a-44f0-b3c9-e212cdcec438","Type":"ContainerStarted","Data":"5f977eea98f4474270e01876cabaf3afdd29e1a6a20551bc622c62707283aa96"} Dec 05 12:06:41 crc kubenswrapper[4763]: E1205 12:06:41.395910 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\"" pod="openstack-operators/placement-operator-controller-manager-78f8948974-gzx9n" podUID="a05e3d8d-f58a-44f0-b3c9-e212cdcec438" Dec 05 12:06:41 crc kubenswrapper[4763]: I1205 12:06:41.410087 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gbf44" event={"ID":"2bed16d5-ec79-4ad7-8984-b965fa568dc6","Type":"ContainerStarted","Data":"ecf4dcae9710991087ac8e230d2787ce63423eec3bbf32cd89c6987dcb1afcd8"} Dec 05 12:06:41 crc kubenswrapper[4763]: E1205 12:06:41.415216 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gbf44" podUID="2bed16d5-ec79-4ad7-8984-b965fa568dc6" Dec 05 12:06:41 crc kubenswrapper[4763]: I1205 12:06:41.420385 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-mnthh" event={"ID":"378cb9d9-8010-4dcf-9297-5e4f0679086e","Type":"ContainerStarted","Data":"daa1580e6dfdbf9b531ea89870d10d94b5ecdeef7996b72670f4f63db17595e6"} Dec 05 12:06:41 crc kubenswrapper[4763]: I1205 12:06:41.424328 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-77g97" podStartSLOduration=2.860611103 podStartE2EDuration="43.424312111s" podCreationTimestamp="2025-12-05 12:05:58 +0000 UTC" firstStartedPulling="2025-12-05 12:05:59.968515706 +0000 UTC m=+1044.461230429" lastFinishedPulling="2025-12-05 12:06:40.532216704 +0000 UTC m=+1085.024931437" observedRunningTime="2025-12-05 12:06:41.400800544 +0000 UTC m=+1085.893515297" watchObservedRunningTime="2025-12-05 12:06:41.424312111 +0000 UTC m=+1085.917026834" Dec 05 12:06:41 crc kubenswrapper[4763]: I1205 12:06:41.424986 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs" event={"ID":"3fdf0ecb-215d-4a02-8053-169fcbfefa50","Type":"ContainerStarted","Data":"0db476f55cfb504c879b8bccc7872e5996b238e392c83e8958349b996a566429"} Dec 05 12:06:41 crc kubenswrapper[4763]: I1205 12:06:41.442111 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-ht96c" event={"ID":"0a36f8ad-7e41-4005-a42e-47b9a30af62f","Type":"ContainerStarted","Data":"136308d6d0553e1cbd2c0e9072160602eba172d0298d4c6b9ca3838a082967c5"} Dec 05 12:06:41 crc kubenswrapper[4763]: E1205 12:06:41.446990 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\"" 
pod="openstack-operators/test-operator-controller-manager-5854674fcc-ht96c" podUID="0a36f8ad-7e41-4005-a42e-47b9a30af62f" Dec 05 12:06:41 crc kubenswrapper[4763]: I1205 12:06:41.461096 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-66974974bb-mjwrw" event={"ID":"36e19ef2-df0d-43ca-8477-f1cec2182b45","Type":"ContainerStarted","Data":"b6fba88f8f882ac758c9d4f179890dedeaaf7d706a85555ae33f609d932912a8"} Dec 05 12:06:41 crc kubenswrapper[4763]: E1205 12:06:41.468016 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.70:5001/openstack-k8s-operators/watcher-operator:d23b8876e1bcf18983498fca8ec9314bc8124a8c\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-66974974bb-mjwrw" podUID="36e19ef2-df0d-43ca-8477-f1cec2182b45" Dec 05 12:06:41 crc kubenswrapper[4763]: I1205 12:06:41.484401 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mch9f" event={"ID":"63e1e64f-8414-4da8-8a32-5f0a0041c5ff","Type":"ContainerStarted","Data":"6ca380c02dc32b8e7e8eb479b2619a89f6c8124c3e4252dda4d00b2984e88b03"} Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.506392 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-sq9wf" event={"ID":"eb3c8b38-a863-42d0-b7d8-03231971e4ce","Type":"ContainerStarted","Data":"943f39e3e6220cddd2272ba59b03fc2ae10edace3bc457fb5960a8c6efff0ae3"} Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.507318 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-sq9wf" Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.508968 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-sq9wf" Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.524314 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-b9w6n" event={"ID":"cd42325e-d26d-4cb6-b8dd-f75dc86e7568","Type":"ContainerStarted","Data":"345b0141668ee0b09aff992555a8847086da3538f797cda2d6ad9ad469c885e1"} Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.524472 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-b9w6n" Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.541470 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-sq9wf" podStartSLOduration=4.596567648 podStartE2EDuration="44.541449765s" podCreationTimestamp="2025-12-05 12:05:58 +0000 UTC" firstStartedPulling="2025-12-05 12:06:00.666251185 +0000 UTC m=+1045.158965908" lastFinishedPulling="2025-12-05 12:06:40.611133302 +0000 UTC m=+1085.103848025" observedRunningTime="2025-12-05 12:06:42.538588928 +0000 UTC m=+1087.031303671" watchObservedRunningTime="2025-12-05 12:06:42.541449765 +0000 UTC m=+1087.034164488" Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.545268 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w8w7f" 
event={"ID":"e97d9ee8-0c07-486a-84f1-dabddb037a8b","Type":"ContainerStarted","Data":"c4d218c90150d78331149ebb9115113e833bef6779d5f80d349677c65a9f08ed"} Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.546497 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w8w7f" Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.548561 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w8w7f" Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.561073 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-b9w6n" podStartSLOduration=3.083409726 podStartE2EDuration="44.561049323s" podCreationTimestamp="2025-12-05 12:05:58 +0000 UTC" firstStartedPulling="2025-12-05 12:06:00.589181494 +0000 UTC m=+1045.081896217" lastFinishedPulling="2025-12-05 12:06:42.066821081 +0000 UTC m=+1086.559535814" observedRunningTime="2025-12-05 12:06:42.560541988 +0000 UTC m=+1087.053256731" watchObservedRunningTime="2025-12-05 12:06:42.561049323 +0000 UTC m=+1087.053764076" Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.561315 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-6zk6f" event={"ID":"fcc46489-05d5-4219-9e45-6ca25f25900f","Type":"ContainerStarted","Data":"a68abf589f547ef063b9e944bc60be1a45698ce88a2321434fe5f84f01583ed1"} Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.561509 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-6zk6f" Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.563563 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-6zk6f" Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.564629 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gx4vn" event={"ID":"6520d187-9c6c-4b0e-b0c9-27e23db84f4c","Type":"ContainerStarted","Data":"a5a72cbd7f51a8fb7606dbc299d7cca26885cbbe471ae8ed88f4e631a4d8983a"} Dec 05 12:06:42 crc kubenswrapper[4763]: E1205 12:06:42.572479 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gx4vn" podUID="6520d187-9c6c-4b0e-b0c9-27e23db84f4c" Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.574224 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-2bgt8" event={"ID":"ed1b8d49-d742-4493-bb7e-856b4108fb88","Type":"ContainerStarted","Data":"ac735b8fdbe01649beaa43938c7de287b0ff6e9297795ac3a178df6747e2ba79"} Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.574974 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-2bgt8" Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.576351 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/nova-operator-controller-manager-697bc559fc-mnthh" event={"ID":"378cb9d9-8010-4dcf-9297-5e4f0679086e","Type":"ContainerStarted","Data":"ea51b75489024c2b91b357f60872135b30519c070cba4a6f5891b345a0ccfdd0"} Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.576520 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-mnthh" Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.580674 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-2bgt8" Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.581904 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-f92rg" event={"ID":"d7dd9586-7cc5-42f0-87a8-3a8c54557b21","Type":"ContainerStarted","Data":"93cb68a954e5e50915df7a53dc16d42939ff4e87b2cc23da964e535905aae81e"} Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.594015 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mq9f4" event={"ID":"42f31714-9ede-4c48-b611-028a79374fad","Type":"ContainerStarted","Data":"0d9824674e6bfd57a0bf2babfe7d07ba9153f85ed3f8f24b60b9ae72e2b8cc4c"} Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.595012 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mq9f4" Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.601451 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mq9f4" Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.645223 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-phnl7" event={"ID":"2024cb36-8175-4993-bd5b-a57a8fb8416c","Type":"ContainerStarted","Data":"e547d0059b6e63c54be94dd7480894c6e45083220f002819675fb26bd64877fd"} Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.645891 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-phnl7" Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.659546 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mch9f" event={"ID":"63e1e64f-8414-4da8-8a32-5f0a0041c5ff","Type":"ContainerStarted","Data":"ec9fba5c5b27b27a9aee690f52fc869d6c567d0ce851b5b5ba718050e630c5da"} Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.660091 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mch9f" Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.665995 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-8gw8h" event={"ID":"f9a5212c-2ddb-4e82-818e-5102fb3c5ee2","Type":"ContainerStarted","Data":"cce97fedb2730241ad04168153fdad98b1cf2592d9c771d698ffb260dfd48f49"} Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.666622 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-8gw8h" Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.673958 4763 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-8gw8h" Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.683794 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-p5jgv" event={"ID":"f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6","Type":"ContainerStarted","Data":"541101a077cbe8701c8686e2c98e01109102b12cd930376ce46a2ca7cb3f132f"} Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.684438 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-p5jgv" Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.685704 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-6zk6f" podStartSLOduration=4.581238853 podStartE2EDuration="44.685682364s" podCreationTimestamp="2025-12-05 12:05:58 +0000 UTC" firstStartedPulling="2025-12-05 12:06:00.535969815 +0000 UTC m=+1045.028684538" lastFinishedPulling="2025-12-05 12:06:40.640413336 +0000 UTC m=+1085.133128049" observedRunningTime="2025-12-05 12:06:42.683997199 +0000 UTC m=+1087.176711942" watchObservedRunningTime="2025-12-05 12:06:42.685682364 +0000 UTC m=+1087.178397097" Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.688775 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs" event={"ID":"3fdf0ecb-215d-4a02-8053-169fcbfefa50","Type":"ContainerStarted","Data":"f8475f774d0323c2dc31753c91042e86c5661be4435f4adb98d6fb9fef87b8d2"} Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.754438 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-mnthh" podStartSLOduration=3.344054973 podStartE2EDuration="44.754421712s" podCreationTimestamp="2025-12-05 12:05:58 +0000 UTC" firstStartedPulling="2025-12-05 12:06:00.658542475 +0000 UTC m=+1045.151257198" lastFinishedPulling="2025-12-05 12:06:42.068909214 +0000 UTC m=+1086.561623937" observedRunningTime="2025-12-05 12:06:42.751859936 +0000 UTC m=+1087.244574659" watchObservedRunningTime="2025-12-05 12:06:42.754421712 +0000 UTC m=+1087.247136445" Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.756662 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w8w7f" podStartSLOduration=3.670273028 podStartE2EDuration="44.756653586s" podCreationTimestamp="2025-12-05 12:05:58 +0000 UTC" firstStartedPulling="2025-12-05 12:05:59.532958819 +0000 UTC m=+1044.025673532" lastFinishedPulling="2025-12-05 12:06:40.619339347 +0000 UTC m=+1085.112054090" observedRunningTime="2025-12-05 12:06:42.721347577 +0000 UTC m=+1087.214062301" watchObservedRunningTime="2025-12-05 12:06:42.756653586 +0000 UTC m=+1087.249368299" Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.843674 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-p5jgv" podStartSLOduration=39.290931975 podStartE2EDuration="44.843652659s" podCreationTimestamp="2025-12-05 12:05:58 +0000 UTC" firstStartedPulling="2025-12-05 12:06:34.88847429 +0000 UTC m=+1079.381189013" lastFinishedPulling="2025-12-05 12:06:40.441194974 +0000 UTC m=+1084.933909697" observedRunningTime="2025-12-05 12:06:42.839261647 
+0000 UTC m=+1087.331976380" watchObservedRunningTime="2025-12-05 12:06:42.843652659 +0000 UTC m=+1087.336367382"
Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.844810 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-2bgt8" podStartSLOduration=4.684080096 podStartE2EDuration="44.844802618s" podCreationTimestamp="2025-12-05 12:05:58 +0000 UTC" firstStartedPulling="2025-12-05 12:06:00.480836883 +0000 UTC m=+1044.973551606" lastFinishedPulling="2025-12-05 12:06:40.641559395 +0000 UTC m=+1085.134274128" observedRunningTime="2025-12-05 12:06:42.82301685 +0000 UTC m=+1087.315731583" watchObservedRunningTime="2025-12-05 12:06:42.844802618 +0000 UTC m=+1087.337517342"
Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.876293 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mch9f" podStartSLOduration=5.022861817 podStartE2EDuration="44.876269003s" podCreationTimestamp="2025-12-05 12:05:58 +0000 UTC" firstStartedPulling="2025-12-05 12:06:00.588782767 +0000 UTC m=+1045.081497490" lastFinishedPulling="2025-12-05 12:06:40.442189953 +0000 UTC m=+1084.934904676" observedRunningTime="2025-12-05 12:06:42.87623184 +0000 UTC m=+1087.368946553" watchObservedRunningTime="2025-12-05 12:06:42.876269003 +0000 UTC m=+1087.368983736"
Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.968080 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mq9f4" podStartSLOduration=3.880658329 podStartE2EDuration="44.968060336s" podCreationTimestamp="2025-12-05 12:05:58 +0000 UTC" firstStartedPulling="2025-12-05 12:05:59.545572397 +0000 UTC m=+1044.038287120" lastFinishedPulling="2025-12-05 12:06:40.632974404 +0000 UTC m=+1085.125689127" observedRunningTime="2025-12-05 12:06:42.910461684 +0000 UTC m=+1087.403176407" watchObservedRunningTime="2025-12-05 12:06:42.968060336 +0000 UTC m=+1087.460775049"
Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.968220 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs" podStartSLOduration=40.087910601 podStartE2EDuration="44.968216817s" podCreationTimestamp="2025-12-05 12:05:58 +0000 UTC" firstStartedPulling="2025-12-05 12:06:35.573934485 +0000 UTC m=+1080.066649208" lastFinishedPulling="2025-12-05 12:06:40.454240701 +0000 UTC m=+1084.946955424" observedRunningTime="2025-12-05 12:06:42.965239182 +0000 UTC m=+1087.457953905" watchObservedRunningTime="2025-12-05 12:06:42.968216817 +0000 UTC m=+1087.460931530"
Dec 05 12:06:42 crc kubenswrapper[4763]: I1205 12:06:42.989607 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-8gw8h" podStartSLOduration=4.908222262 podStartE2EDuration="44.989585936s" podCreationTimestamp="2025-12-05 12:05:58 +0000 UTC" firstStartedPulling="2025-12-05 12:06:00.556790517 +0000 UTC m=+1045.049505240" lastFinishedPulling="2025-12-05 12:06:40.638154201 +0000 UTC m=+1085.130868914" observedRunningTime="2025-12-05 12:06:42.982796469 +0000 UTC m=+1087.475511202" watchObservedRunningTime="2025-12-05 12:06:42.989585936 +0000 UTC m=+1087.482300659"
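The numbers in these pod_startup_latency_tracker entries are internally consistent: podStartSLOduration equals podStartE2EDuration minus the image-pull window (lastFinishedPulling - firstStartedPulling), i.e. the SLO figure discounts time spent pulling images. A quick Go check against the ovn-operator entry above, with values copied verbatim from the log:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Layout matching Go's default time.Time string form used in these entries.
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

	// ovn-operator-controller-manager-b6456fdb6-mch9f, from the entry above.
	firstStartedPulling, _ := time.Parse(layout, "2025-12-05 12:06:00.588782767 +0000 UTC")
	lastFinishedPulling, _ := time.Parse(layout, "2025-12-05 12:06:40.442189953 +0000 UTC")
	podStartE2EDuration := 44.876269003 // seconds, as logged

	pullWindow := lastFinishedPulling.Sub(firstStartedPulling).Seconds()
	fmt.Printf("pull window: %.9fs\n", pullWindow) // 39.853407186s
	// 44.876269003 - 39.853407186 = 5.022861817, matching podStartSLOduration=5.022861817
	fmt.Printf("E2E - pull:  %.9fs\n", podStartE2EDuration-pullWindow)
}
```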
Dec 05 12:06:43 crc kubenswrapper[4763]: I1205 12:06:43.697839 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-f92rg" event={"ID":"d7dd9586-7cc5-42f0-87a8-3a8c54557b21","Type":"ContainerStarted","Data":"3b4da6364a7868bb7fba96e5baa5dc239c7cd2b9409763d3e189894f934e289a"}
Dec 05 12:06:43 crc kubenswrapper[4763]: I1205 12:06:43.698594 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs"
Dec 05 12:06:43 crc kubenswrapper[4763]: I1205 12:06:43.719866 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-phnl7" podStartSLOduration=5.9964625 podStartE2EDuration="45.719850833s" podCreationTimestamp="2025-12-05 12:05:58 +0000 UTC" firstStartedPulling="2025-12-05 12:06:00.678871823 +0000 UTC m=+1045.171586546" lastFinishedPulling="2025-12-05 12:06:40.402260156 +0000 UTC m=+1084.894974879" observedRunningTime="2025-12-05 12:06:43.005905399 +0000 UTC m=+1087.498620122" watchObservedRunningTime="2025-12-05 12:06:43.719850833 +0000 UTC m=+1088.212565556"
Dec 05 12:06:43 crc kubenswrapper[4763]: I1205 12:06:43.722307 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-f92rg" podStartSLOduration=3.301538899 podStartE2EDuration="45.722297231s" podCreationTimestamp="2025-12-05 12:05:58 +0000 UTC" firstStartedPulling="2025-12-05 12:06:00.607613122 +0000 UTC m=+1045.100327845" lastFinishedPulling="2025-12-05 12:06:43.028371454 +0000 UTC m=+1087.521086177" observedRunningTime="2025-12-05 12:06:43.715533306 +0000 UTC m=+1088.208248029" watchObservedRunningTime="2025-12-05 12:06:43.722297231 +0000 UTC m=+1088.215011964"
Dec 05 12:06:44 crc kubenswrapper[4763]: I1205 12:06:44.707067 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-f92rg"
Dec 05 12:06:45 crc kubenswrapper[4763]: E1205 12:06:45.791387 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n8fzm" podUID="01d1c35a-adc3-4945-92b5-5921600cb826"
Dec 05 12:06:48 crc kubenswrapper[4763]: I1205 12:06:48.782651 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-f92rg"
Dec 05 12:06:48 crc kubenswrapper[4763]: I1205 12:06:48.836570 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-b9w6n"
Dec 05 12:06:48 crc kubenswrapper[4763]: I1205 12:06:48.925476 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vqmk4"
Dec 05 12:06:48 crc kubenswrapper[4763]: I1205 12:06:48.934101 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-mnthh"
Dec 05 12:06:49 crc kubenswrapper[4763]: I1205 12:06:49.001332 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mch9f"
Dec 05 12:06:49
crc kubenswrapper[4763]: I1205 12:06:49.100089 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-phnl7" Dec 05 12:06:49 crc kubenswrapper[4763]: I1205 12:06:49.741572 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vqmk4" event={"ID":"4faed118-8b9d-4adb-8f86-6a6be8061bce","Type":"ContainerStarted","Data":"f46add4fad0f6217e1809da74f98521c352b5183c64fd66a5456a32443504e4d"} Dec 05 12:06:49 crc kubenswrapper[4763]: I1205 12:06:49.758978 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vqmk4" podStartSLOduration=16.926096178999998 podStartE2EDuration="51.758962523s" podCreationTimestamp="2025-12-05 12:05:58 +0000 UTC" firstStartedPulling="2025-12-05 12:06:00.678558872 +0000 UTC m=+1045.171273595" lastFinishedPulling="2025-12-05 12:06:35.511425216 +0000 UTC m=+1080.004139939" observedRunningTime="2025-12-05 12:06:49.758928192 +0000 UTC m=+1094.251642915" watchObservedRunningTime="2025-12-05 12:06:49.758962523 +0000 UTC m=+1094.251677246" Dec 05 12:06:50 crc kubenswrapper[4763]: I1205 12:06:50.457221 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-p5jgv" Dec 05 12:06:50 crc kubenswrapper[4763]: I1205 12:06:50.995469 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs" Dec 05 12:06:52 crc kubenswrapper[4763]: E1205 12:06:52.786279 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\"" pod="openstack-operators/placement-operator-controller-manager-78f8948974-gzx9n" podUID="a05e3d8d-f58a-44f0-b3c9-e212cdcec438" Dec 05 12:06:56 crc kubenswrapper[4763]: I1205 12:06:56.790188 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-ht96c" event={"ID":"0a36f8ad-7e41-4005-a42e-47b9a30af62f","Type":"ContainerStarted","Data":"71b5cc332c0189b027f367ff32572342f42c476831d490d14cb4840bbb8e7ebf"} Dec 05 12:06:56 crc kubenswrapper[4763]: I1205 12:06:56.790798 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-ht96c" Dec 05 12:06:56 crc kubenswrapper[4763]: I1205 12:06:56.811943 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-ht96c" podStartSLOduration=3.094388623 podStartE2EDuration="58.811924188s" podCreationTimestamp="2025-12-05 12:05:58 +0000 UTC" firstStartedPulling="2025-12-05 12:06:00.717733906 +0000 UTC m=+1045.210448629" lastFinishedPulling="2025-12-05 12:06:56.435269471 +0000 UTC m=+1100.927984194" observedRunningTime="2025-12-05 12:06:56.811290361 +0000 UTC m=+1101.304005084" watchObservedRunningTime="2025-12-05 12:06:56.811924188 +0000 UTC m=+1101.304638931" Dec 05 12:06:57 crc kubenswrapper[4763]: I1205 12:06:57.800521 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gx4vn" 
event={"ID":"6520d187-9c6c-4b0e-b0c9-27e23db84f4c","Type":"ContainerStarted","Data":"1802a84942a10ac91be7cbb5f298d2dbbcde60c223e0f794f2e80929cd89cdd7"} Dec 05 12:06:57 crc kubenswrapper[4763]: I1205 12:06:57.802690 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gx4vn" Dec 05 12:06:57 crc kubenswrapper[4763]: I1205 12:06:57.807229 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gbf44" event={"ID":"2bed16d5-ec79-4ad7-8984-b965fa568dc6","Type":"ContainerStarted","Data":"7e9cf74dffce42ef96ff9ad176d9efebb6e9471bd2f0960f5491d62a0e2f4bd0"} Dec 05 12:06:57 crc kubenswrapper[4763]: I1205 12:06:57.807583 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gbf44" Dec 05 12:06:57 crc kubenswrapper[4763]: I1205 12:06:57.822390 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gx4vn" podStartSLOduration=3.927880453 podStartE2EDuration="59.822367363s" podCreationTimestamp="2025-12-05 12:05:58 +0000 UTC" firstStartedPulling="2025-12-05 12:06:00.720113099 +0000 UTC m=+1045.212827822" lastFinishedPulling="2025-12-05 12:06:56.614600009 +0000 UTC m=+1101.107314732" observedRunningTime="2025-12-05 12:06:57.821058028 +0000 UTC m=+1102.313772771" watchObservedRunningTime="2025-12-05 12:06:57.822367363 +0000 UTC m=+1102.315082096" Dec 05 12:06:57 crc kubenswrapper[4763]: I1205 12:06:57.842399 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gbf44" podStartSLOduration=3.950953899 podStartE2EDuration="59.842379539s" podCreationTimestamp="2025-12-05 12:05:58 +0000 UTC" firstStartedPulling="2025-12-05 12:06:00.722182382 +0000 UTC m=+1045.214897105" lastFinishedPulling="2025-12-05 12:06:56.613608022 +0000 UTC m=+1101.106322745" observedRunningTime="2025-12-05 12:06:57.839315436 +0000 UTC m=+1102.332030159" watchObservedRunningTime="2025-12-05 12:06:57.842379539 +0000 UTC m=+1102.335094262" Dec 05 12:06:58 crc kubenswrapper[4763]: I1205 12:06:58.815227 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-66974974bb-mjwrw" event={"ID":"36e19ef2-df0d-43ca-8477-f1cec2182b45","Type":"ContainerStarted","Data":"4ceef4dc1cbfd7eade8fe0d8e350e174c3790ee4c6d95e0ab3c6c8194bb26c2f"} Dec 05 12:06:58 crc kubenswrapper[4763]: I1205 12:06:58.832877 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-66974974bb-mjwrw" podStartSLOduration=3.445514613 podStartE2EDuration="1m0.83285516s" podCreationTimestamp="2025-12-05 12:05:58 +0000 UTC" firstStartedPulling="2025-12-05 12:06:00.721993579 +0000 UTC m=+1045.214708312" lastFinishedPulling="2025-12-05 12:06:58.109334096 +0000 UTC m=+1102.602048859" observedRunningTime="2025-12-05 12:06:58.830721498 +0000 UTC m=+1103.323436241" watchObservedRunningTime="2025-12-05 12:06:58.83285516 +0000 UTC m=+1103.325569893" Dec 05 12:06:59 crc kubenswrapper[4763]: I1205 12:06:59.272944 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-66974974bb-mjwrw" Dec 05 12:07:00 crc kubenswrapper[4763]: I1205 12:07:00.829535 4763 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n8fzm" event={"ID":"01d1c35a-adc3-4945-92b5-5921600cb826","Type":"ContainerStarted","Data":"8662e98199b56d0379a21b9748321dbe1c38a97b9ecb7672f89a0f60f3e93fa8"} Dec 05 12:07:00 crc kubenswrapper[4763]: I1205 12:07:00.847904 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n8fzm" podStartSLOduration=3.357554322 podStartE2EDuration="1m2.847883687s" podCreationTimestamp="2025-12-05 12:05:58 +0000 UTC" firstStartedPulling="2025-12-05 12:06:00.719667299 +0000 UTC m=+1045.212382012" lastFinishedPulling="2025-12-05 12:07:00.209996654 +0000 UTC m=+1104.702711377" observedRunningTime="2025-12-05 12:07:00.845246679 +0000 UTC m=+1105.337961402" watchObservedRunningTime="2025-12-05 12:07:00.847883687 +0000 UTC m=+1105.340598410" Dec 05 12:07:08 crc kubenswrapper[4763]: I1205 12:07:08.611523 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gbf44" Dec 05 12:07:08 crc kubenswrapper[4763]: I1205 12:07:08.712335 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gx4vn" Dec 05 12:07:09 crc kubenswrapper[4763]: I1205 12:07:09.257252 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-ht96c" Dec 05 12:07:09 crc kubenswrapper[4763]: I1205 12:07:09.275056 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-66974974bb-mjwrw" Dec 05 12:07:09 crc kubenswrapper[4763]: I1205 12:07:09.896161 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-gzx9n" event={"ID":"a05e3d8d-f58a-44f0-b3c9-e212cdcec438","Type":"ContainerStarted","Data":"dc118c4c204c62cbc9f36f79a362e7300a42e636d2078abaad851b45297492bf"} Dec 05 12:07:09 crc kubenswrapper[4763]: I1205 12:07:09.896735 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-gzx9n" Dec 05 12:07:09 crc kubenswrapper[4763]: I1205 12:07:09.917499 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-gzx9n" podStartSLOduration=5.918879029 podStartE2EDuration="1m11.917477813s" podCreationTimestamp="2025-12-05 12:05:58 +0000 UTC" firstStartedPulling="2025-12-05 12:06:00.712540589 +0000 UTC m=+1045.205255312" lastFinishedPulling="2025-12-05 12:07:06.711139373 +0000 UTC m=+1111.203854096" observedRunningTime="2025-12-05 12:07:09.910874663 +0000 UTC m=+1114.403589396" watchObservedRunningTime="2025-12-05 12:07:09.917477813 +0000 UTC m=+1114.410192546" Dec 05 12:07:19 crc kubenswrapper[4763]: I1205 12:07:19.054730 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-gzx9n" Dec 05 12:07:37 crc kubenswrapper[4763]: I1205 12:07:37.544330 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Dec 05 12:07:37 crc kubenswrapper[4763]: I1205 12:07:37.544838 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 12:07:44 crc kubenswrapper[4763]: I1205 12:07:44.610789 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-j8sm6"] Dec 05 12:07:44 crc kubenswrapper[4763]: I1205 12:07:44.612689 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-j8sm6" Dec 05 12:07:44 crc kubenswrapper[4763]: I1205 12:07:44.617295 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 05 12:07:44 crc kubenswrapper[4763]: I1205 12:07:44.617868 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 05 12:07:44 crc kubenswrapper[4763]: I1205 12:07:44.617982 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 05 12:07:44 crc kubenswrapper[4763]: I1205 12:07:44.618103 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-kzn5b" Dec 05 12:07:44 crc kubenswrapper[4763]: I1205 12:07:44.631097 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-j8sm6"] Dec 05 12:07:44 crc kubenswrapper[4763]: I1205 12:07:44.689903 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-ghlqk"] Dec 05 12:07:44 crc kubenswrapper[4763]: I1205 12:07:44.691078 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-ghlqk"
Dec 05 12:07:44 crc kubenswrapper[4763]: I1205 12:07:44.694353 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Dec 05 12:07:44 crc kubenswrapper[4763]: I1205 12:07:44.711723 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-ghlqk"]
Dec 05 12:07:44 crc kubenswrapper[4763]: I1205 12:07:44.715625 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wcxb\" (UniqueName: \"kubernetes.io/projected/1b5d6232-6824-4892-b75c-e46bdb919016-kube-api-access-7wcxb\") pod \"dnsmasq-dns-675f4bcbfc-j8sm6\" (UID: \"1b5d6232-6824-4892-b75c-e46bdb919016\") " pod="openstack/dnsmasq-dns-675f4bcbfc-j8sm6"
Dec 05 12:07:44 crc kubenswrapper[4763]: I1205 12:07:44.715750 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b5d6232-6824-4892-b75c-e46bdb919016-config\") pod \"dnsmasq-dns-675f4bcbfc-j8sm6\" (UID: \"1b5d6232-6824-4892-b75c-e46bdb919016\") " pod="openstack/dnsmasq-dns-675f4bcbfc-j8sm6"
Dec 05 12:07:44 crc kubenswrapper[4763]: I1205 12:07:44.817335 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b5d6232-6824-4892-b75c-e46bdb919016-config\") pod \"dnsmasq-dns-675f4bcbfc-j8sm6\" (UID: \"1b5d6232-6824-4892-b75c-e46bdb919016\") " pod="openstack/dnsmasq-dns-675f4bcbfc-j8sm6"
Dec 05 12:07:44 crc kubenswrapper[4763]: I1205 12:07:44.817484 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2040bdfc-1aef-4f74-a8bb-1caf1a98193c-config\") pod \"dnsmasq-dns-78dd6ddcc-ghlqk\" (UID: \"2040bdfc-1aef-4f74-a8bb-1caf1a98193c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ghlqk"
Dec 05 12:07:44 crc kubenswrapper[4763]: I1205 12:07:44.817522 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrn8r\" (UniqueName: \"kubernetes.io/projected/2040bdfc-1aef-4f74-a8bb-1caf1a98193c-kube-api-access-rrn8r\") pod \"dnsmasq-dns-78dd6ddcc-ghlqk\" (UID: \"2040bdfc-1aef-4f74-a8bb-1caf1a98193c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ghlqk"
Dec 05 12:07:44 crc kubenswrapper[4763]: I1205 12:07:44.817609 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2040bdfc-1aef-4f74-a8bb-1caf1a98193c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-ghlqk\" (UID: \"2040bdfc-1aef-4f74-a8bb-1caf1a98193c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ghlqk"
Dec 05 12:07:44 crc kubenswrapper[4763]: I1205 12:07:44.817675 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wcxb\" (UniqueName: \"kubernetes.io/projected/1b5d6232-6824-4892-b75c-e46bdb919016-kube-api-access-7wcxb\") pod \"dnsmasq-dns-675f4bcbfc-j8sm6\" (UID: \"1b5d6232-6824-4892-b75c-e46bdb919016\") " pod="openstack/dnsmasq-dns-675f4bcbfc-j8sm6"
Dec 05 12:07:44 crc kubenswrapper[4763]: I1205 12:07:44.818437 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b5d6232-6824-4892-b75c-e46bdb919016-config\") pod \"dnsmasq-dns-675f4bcbfc-j8sm6\" (UID: \"1b5d6232-6824-4892-b75c-e46bdb919016\") " pod="openstack/dnsmasq-dns-675f4bcbfc-j8sm6"
Dec 05 12:07:44 crc kubenswrapper[4763]: I1205 12:07:44.845784 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wcxb\" (UniqueName: \"kubernetes.io/projected/1b5d6232-6824-4892-b75c-e46bdb919016-kube-api-access-7wcxb\") pod \"dnsmasq-dns-675f4bcbfc-j8sm6\" (UID: \"1b5d6232-6824-4892-b75c-e46bdb919016\") " pod="openstack/dnsmasq-dns-675f4bcbfc-j8sm6"
Dec 05 12:07:44 crc kubenswrapper[4763]: I1205 12:07:44.919570 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2040bdfc-1aef-4f74-a8bb-1caf1a98193c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-ghlqk\" (UID: \"2040bdfc-1aef-4f74-a8bb-1caf1a98193c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ghlqk"
Dec 05 12:07:44 crc kubenswrapper[4763]: I1205 12:07:44.919696 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2040bdfc-1aef-4f74-a8bb-1caf1a98193c-config\") pod \"dnsmasq-dns-78dd6ddcc-ghlqk\" (UID: \"2040bdfc-1aef-4f74-a8bb-1caf1a98193c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ghlqk"
Dec 05 12:07:44 crc kubenswrapper[4763]: I1205 12:07:44.919720 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrn8r\" (UniqueName: \"kubernetes.io/projected/2040bdfc-1aef-4f74-a8bb-1caf1a98193c-kube-api-access-rrn8r\") pod \"dnsmasq-dns-78dd6ddcc-ghlqk\" (UID: \"2040bdfc-1aef-4f74-a8bb-1caf1a98193c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ghlqk"
Dec 05 12:07:44 crc kubenswrapper[4763]: I1205 12:07:44.920635 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2040bdfc-1aef-4f74-a8bb-1caf1a98193c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-ghlqk\" (UID: \"2040bdfc-1aef-4f74-a8bb-1caf1a98193c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ghlqk"
Dec 05 12:07:44 crc kubenswrapper[4763]: I1205 12:07:44.921141 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2040bdfc-1aef-4f74-a8bb-1caf1a98193c-config\") pod \"dnsmasq-dns-78dd6ddcc-ghlqk\" (UID: \"2040bdfc-1aef-4f74-a8bb-1caf1a98193c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ghlqk"
Dec 05 12:07:44 crc kubenswrapper[4763]: I1205 12:07:44.939488 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrn8r\" (UniqueName: \"kubernetes.io/projected/2040bdfc-1aef-4f74-a8bb-1caf1a98193c-kube-api-access-rrn8r\") pod \"dnsmasq-dns-78dd6ddcc-ghlqk\" (UID: \"2040bdfc-1aef-4f74-a8bb-1caf1a98193c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ghlqk"
Dec 05 12:07:44 crc kubenswrapper[4763]: I1205 12:07:44.942824 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-j8sm6"
Dec 05 12:07:45 crc kubenswrapper[4763]: I1205 12:07:45.015212 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-ghlqk"
Dec 05 12:07:45 crc kubenswrapper[4763]: I1205 12:07:45.351245 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-j8sm6"]
Dec 05 12:07:45 crc kubenswrapper[4763]: I1205 12:07:45.370713 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-ghlqk"]
Dec 05 12:07:45 crc kubenswrapper[4763]: W1205 12:07:45.379252 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2040bdfc_1aef_4f74_a8bb_1caf1a98193c.slice/crio-4165b13c68e002f0c0df1bbc64604af09b9ba52f28474f594c5813c09c16bea1 WatchSource:0}: Error finding container 4165b13c68e002f0c0df1bbc64604af09b9ba52f28474f594c5813c09c16bea1: Status 404 returned error can't find the container with id 4165b13c68e002f0c0df1bbc64604af09b9ba52f28474f594c5813c09c16bea1
Dec 05 12:07:46 crc kubenswrapper[4763]: I1205 12:07:46.174468 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-j8sm6" event={"ID":"1b5d6232-6824-4892-b75c-e46bdb919016","Type":"ContainerStarted","Data":"13456d40750253b01199d3567e75ad33fe653c46ba453d53b65318db67c25c3d"}
Dec 05 12:07:46 crc kubenswrapper[4763]: I1205 12:07:46.176519 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-ghlqk" event={"ID":"2040bdfc-1aef-4f74-a8bb-1caf1a98193c","Type":"ContainerStarted","Data":"4165b13c68e002f0c0df1bbc64604af09b9ba52f28474f594c5813c09c16bea1"}
Dec 05 12:07:47 crc kubenswrapper[4763]: I1205 12:07:47.820012 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-j8sm6"]
Dec 05 12:07:47 crc kubenswrapper[4763]: I1205 12:07:47.825049 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-ck8ft"]
Dec 05 12:07:47 crc kubenswrapper[4763]: I1205 12:07:47.828940 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-ck8ft"
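The block above shows kubelet's volume reconciler walking every volume of the two dnsmasq pods through the same three-phase sequence: operationExecutor.VerifyControllerAttachedVolume started, then operationExecutor.MountVolume started, then MountVolume.SetUp succeeded. A minimal sketch of how those phases can be paired per pod and volume when auditing a capture like this one; the file name kubelet.log is an assumption, standing in for wherever this journal text was saved:

import re
from collections import defaultdict

# The three reconciler phases, in the order kubelet logs them above.
LINE_RE = re.compile(
    r'"(?P<phase>operationExecutor\.VerifyControllerAttachedVolume started'
    r'|operationExecutor\.MountVolume started'
    r'|MountVolume\.SetUp succeeded)'
    r' for volume \\"(?P<volume>[^\\]+)\\".*pod="(?P<pod>[^"]+)"'
)

def mount_timeline(path="kubelet.log"):
    """Map (pod, volume) -> list of phases seen, in log order."""
    timeline = defaultdict(list)
    with open(path) as fh:
        for line in fh:
            m = LINE_RE.search(line)
            if m:
                timeline[(m.group("pod"), m.group("volume"))].append(m.group("phase"))
    return timeline

if __name__ == "__main__":
    for (pod, volume), phases in sorted(mount_timeline().items()):
        done = "ok" if phases[-1] == "MountVolume.SetUp succeeded" else "pending"
        print(f"{pod} {volume}: {len(phases)} phase(s), {done}")

On the entries above this reports, for example, three phases and "ok" for the config volume of dnsmasq-dns-675f4bcbfc-j8sm6.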
Dec 05 12:07:47 crc kubenswrapper[4763]: I1205 12:07:47.855781 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-ck8ft"]
Dec 05 12:07:47 crc kubenswrapper[4763]: I1205 12:07:47.987829 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsl2r\" (UniqueName: \"kubernetes.io/projected/8638355d-3010-48ff-97d5-fc24b34e4743-kube-api-access-wsl2r\") pod \"dnsmasq-dns-666b6646f7-ck8ft\" (UID: \"8638355d-3010-48ff-97d5-fc24b34e4743\") " pod="openstack/dnsmasq-dns-666b6646f7-ck8ft"
Dec 05 12:07:47 crc kubenswrapper[4763]: I1205 12:07:47.987911 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8638355d-3010-48ff-97d5-fc24b34e4743-dns-svc\") pod \"dnsmasq-dns-666b6646f7-ck8ft\" (UID: \"8638355d-3010-48ff-97d5-fc24b34e4743\") " pod="openstack/dnsmasq-dns-666b6646f7-ck8ft"
Dec 05 12:07:47 crc kubenswrapper[4763]: I1205 12:07:47.988018 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8638355d-3010-48ff-97d5-fc24b34e4743-config\") pod \"dnsmasq-dns-666b6646f7-ck8ft\" (UID: \"8638355d-3010-48ff-97d5-fc24b34e4743\") " pod="openstack/dnsmasq-dns-666b6646f7-ck8ft"
Dec 05 12:07:48 crc kubenswrapper[4763]: I1205 12:07:48.090352 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsl2r\" (UniqueName: \"kubernetes.io/projected/8638355d-3010-48ff-97d5-fc24b34e4743-kube-api-access-wsl2r\") pod \"dnsmasq-dns-666b6646f7-ck8ft\" (UID: \"8638355d-3010-48ff-97d5-fc24b34e4743\") " pod="openstack/dnsmasq-dns-666b6646f7-ck8ft"
Dec 05 12:07:48 crc kubenswrapper[4763]: I1205 12:07:48.090436 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8638355d-3010-48ff-97d5-fc24b34e4743-dns-svc\") pod \"dnsmasq-dns-666b6646f7-ck8ft\" (UID: \"8638355d-3010-48ff-97d5-fc24b34e4743\") " pod="openstack/dnsmasq-dns-666b6646f7-ck8ft"
Dec 05 12:07:48 crc kubenswrapper[4763]: I1205 12:07:48.090462 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8638355d-3010-48ff-97d5-fc24b34e4743-config\") pod \"dnsmasq-dns-666b6646f7-ck8ft\" (UID: \"8638355d-3010-48ff-97d5-fc24b34e4743\") " pod="openstack/dnsmasq-dns-666b6646f7-ck8ft"
Dec 05 12:07:48 crc kubenswrapper[4763]: I1205 12:07:48.091807 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8638355d-3010-48ff-97d5-fc24b34e4743-config\") pod \"dnsmasq-dns-666b6646f7-ck8ft\" (UID: \"8638355d-3010-48ff-97d5-fc24b34e4743\") " pod="openstack/dnsmasq-dns-666b6646f7-ck8ft"
Dec 05 12:07:48 crc kubenswrapper[4763]: I1205 12:07:48.092058 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8638355d-3010-48ff-97d5-fc24b34e4743-dns-svc\") pod \"dnsmasq-dns-666b6646f7-ck8ft\" (UID: \"8638355d-3010-48ff-97d5-fc24b34e4743\") " pod="openstack/dnsmasq-dns-666b6646f7-ck8ft"
Dec 05 12:07:48 crc kubenswrapper[4763]: I1205 12:07:48.125134 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsl2r\" (UniqueName: \"kubernetes.io/projected/8638355d-3010-48ff-97d5-fc24b34e4743-kube-api-access-wsl2r\") pod \"dnsmasq-dns-666b6646f7-ck8ft\" (UID: \"8638355d-3010-48ff-97d5-fc24b34e4743\") " pod="openstack/dnsmasq-dns-666b6646f7-ck8ft"
Dec 05 12:07:48 crc kubenswrapper[4763]: I1205 12:07:48.146878 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-ghlqk"]
Dec 05 12:07:48 crc kubenswrapper[4763]: I1205 12:07:48.165692 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-ck8ft"
Dec 05 12:07:48 crc kubenswrapper[4763]: I1205 12:07:48.184569 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gp47r"]
Dec 05 12:07:48 crc kubenswrapper[4763]: I1205 12:07:48.186567 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gp47r"
Dec 05 12:07:48 crc kubenswrapper[4763]: I1205 12:07:48.192029 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gp47r"]
Dec 05 12:07:48 crc kubenswrapper[4763]: I1205 12:07:48.295642 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2de9b0c2-f628-4ae1-9b10-f62e7360eb7a-config\") pod \"dnsmasq-dns-57d769cc4f-gp47r\" (UID: \"2de9b0c2-f628-4ae1-9b10-f62e7360eb7a\") " pod="openstack/dnsmasq-dns-57d769cc4f-gp47r"
Dec 05 12:07:48 crc kubenswrapper[4763]: I1205 12:07:48.295699 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8l6h\" (UniqueName: \"kubernetes.io/projected/2de9b0c2-f628-4ae1-9b10-f62e7360eb7a-kube-api-access-k8l6h\") pod \"dnsmasq-dns-57d769cc4f-gp47r\" (UID: \"2de9b0c2-f628-4ae1-9b10-f62e7360eb7a\") " pod="openstack/dnsmasq-dns-57d769cc4f-gp47r"
Dec 05 12:07:48 crc kubenswrapper[4763]: I1205 12:07:48.295835 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2de9b0c2-f628-4ae1-9b10-f62e7360eb7a-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-gp47r\" (UID: \"2de9b0c2-f628-4ae1-9b10-f62e7360eb7a\") " pod="openstack/dnsmasq-dns-57d769cc4f-gp47r"
Dec 05 12:07:48 crc kubenswrapper[4763]: I1205 12:07:48.397600 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2de9b0c2-f628-4ae1-9b10-f62e7360eb7a-config\") pod \"dnsmasq-dns-57d769cc4f-gp47r\" (UID: \"2de9b0c2-f628-4ae1-9b10-f62e7360eb7a\") " pod="openstack/dnsmasq-dns-57d769cc4f-gp47r"
Dec 05 12:07:48 crc kubenswrapper[4763]: I1205 12:07:48.397658 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8l6h\" (UniqueName: \"kubernetes.io/projected/2de9b0c2-f628-4ae1-9b10-f62e7360eb7a-kube-api-access-k8l6h\") pod \"dnsmasq-dns-57d769cc4f-gp47r\" (UID: \"2de9b0c2-f628-4ae1-9b10-f62e7360eb7a\") " pod="openstack/dnsmasq-dns-57d769cc4f-gp47r"
Dec 05 12:07:48 crc kubenswrapper[4763]: I1205 12:07:48.397681 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2de9b0c2-f628-4ae1-9b10-f62e7360eb7a-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-gp47r\" (UID: \"2de9b0c2-f628-4ae1-9b10-f62e7360eb7a\") " pod="openstack/dnsmasq-dns-57d769cc4f-gp47r"
Dec 05 12:07:48 crc kubenswrapper[4763]: I1205 12:07:48.398538 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2de9b0c2-f628-4ae1-9b10-f62e7360eb7a-config\") pod \"dnsmasq-dns-57d769cc4f-gp47r\" (UID: \"2de9b0c2-f628-4ae1-9b10-f62e7360eb7a\") " pod="openstack/dnsmasq-dns-57d769cc4f-gp47r"
Dec 05 12:07:48 crc kubenswrapper[4763]: I1205 12:07:48.398660 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2de9b0c2-f628-4ae1-9b10-f62e7360eb7a-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-gp47r\" (UID: \"2de9b0c2-f628-4ae1-9b10-f62e7360eb7a\") " pod="openstack/dnsmasq-dns-57d769cc4f-gp47r"
Dec 05 12:07:48 crc kubenswrapper[4763]: I1205 12:07:48.421220 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8l6h\" (UniqueName: \"kubernetes.io/projected/2de9b0c2-f628-4ae1-9b10-f62e7360eb7a-kube-api-access-k8l6h\") pod \"dnsmasq-dns-57d769cc4f-gp47r\" (UID: \"2de9b0c2-f628-4ae1-9b10-f62e7360eb7a\") " pod="openstack/dnsmasq-dns-57d769cc4f-gp47r"
Dec 05 12:07:48 crc kubenswrapper[4763]: I1205 12:07:48.508367 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gp47r"
Dec 05 12:07:48 crc kubenswrapper[4763]: I1205 12:07:48.919622 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-ck8ft"]
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.017690 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.020086 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.022059 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.022268 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.022492 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.023149 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-7gw5s"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.024592 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.026069 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.034239 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.043398 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.114213 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/342a4872-4478-4b3a-a984-7fd457348435-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " pod="openstack/rabbitmq-server-0"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.114267 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/342a4872-4478-4b3a-a984-7fd457348435-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " pod="openstack/rabbitmq-server-0"
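Interleaved with the mount traffic, the SyncLoop ADD/UPDATE/DELETE entries record the dnsmasq Deployment being rolled over: dnsmasq-dns-675f4bcbfc-j8sm6 and dnsmasq-dns-78dd6ddcc-ghlqk are DELETEd within seconds of dnsmasq-dns-666b6646f7-ck8ft and dnsmasq-dns-57d769cc4f-gp47r being ADDed. A sketch for extracting that churn from the same capture, again under the hypothetical kubelet.log file name; every pods=[...] list in this section holds a single pod, which is all the pattern handles:

import re

# e.g.  I1205 12:07:47.825049 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-ck8ft"]
SYNC_RE = re.compile(
    r'I\d{4} (?P<ts>\d{2}:\d{2}:\d{2}\.\d+).*"SyncLoop (?P<kind>ADD|UPDATE|DELETE)"'
    r' source="api" pods=\["(?P<pod>[^"]+)"\]'
)

def syncloop_events(path="kubelet.log"):
    """Yield (timestamp, kind, pod) for each SyncLoop ADD/UPDATE/DELETE line."""
    with open(path) as fh:
        for line in fh:
            m = SYNC_RE.search(line)
            if m:
                yield m.group("ts"), m.group("kind"), m.group("pod")

if __name__ == "__main__":
    # Print only the dnsmasq-dns churn visible in this capture.
    for ts, kind, pod in syncloop_events():
        if "dnsmasq-dns" in pod:
            print(ts, kind, pod)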
"operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/342a4872-4478-4b3a-a984-7fd457348435-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " pod="openstack/rabbitmq-server-0" Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.114285 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/342a4872-4478-4b3a-a984-7fd457348435-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " pod="openstack/rabbitmq-server-0" Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.114306 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " pod="openstack/rabbitmq-server-0" Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.114325 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/342a4872-4478-4b3a-a984-7fd457348435-server-conf\") pod \"rabbitmq-server-0\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " pod="openstack/rabbitmq-server-0" Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.114343 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/342a4872-4478-4b3a-a984-7fd457348435-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " pod="openstack/rabbitmq-server-0" Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.114381 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/342a4872-4478-4b3a-a984-7fd457348435-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " pod="openstack/rabbitmq-server-0" Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.114408 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kqq9\" (UniqueName: \"kubernetes.io/projected/342a4872-4478-4b3a-a984-7fd457348435-kube-api-access-5kqq9\") pod \"rabbitmq-server-0\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " pod="openstack/rabbitmq-server-0" Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.114438 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/342a4872-4478-4b3a-a984-7fd457348435-pod-info\") pod \"rabbitmq-server-0\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " pod="openstack/rabbitmq-server-0" Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.114454 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/342a4872-4478-4b3a-a984-7fd457348435-config-data\") pod \"rabbitmq-server-0\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " pod="openstack/rabbitmq-server-0" Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.114488 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/342a4872-4478-4b3a-a984-7fd457348435-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " pod="openstack/rabbitmq-server-0" Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.215397 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-ck8ft" event={"ID":"8638355d-3010-48ff-97d5-fc24b34e4743","Type":"ContainerStarted","Data":"3f4a1231125075e2a61937dab49961f6f2f86d3c1fc835f8efa57effe9b3849d"} Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.215953 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/342a4872-4478-4b3a-a984-7fd457348435-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " pod="openstack/rabbitmq-server-0" Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.216043 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/342a4872-4478-4b3a-a984-7fd457348435-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " pod="openstack/rabbitmq-server-0" Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.216071 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/342a4872-4478-4b3a-a984-7fd457348435-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " pod="openstack/rabbitmq-server-0" Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.216155 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " pod="openstack/rabbitmq-server-0" Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.216180 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/342a4872-4478-4b3a-a984-7fd457348435-server-conf\") pod \"rabbitmq-server-0\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " pod="openstack/rabbitmq-server-0" Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.216225 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/342a4872-4478-4b3a-a984-7fd457348435-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " pod="openstack/rabbitmq-server-0" Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.216271 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/342a4872-4478-4b3a-a984-7fd457348435-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " pod="openstack/rabbitmq-server-0" Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.216319 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kqq9\" (UniqueName: \"kubernetes.io/projected/342a4872-4478-4b3a-a984-7fd457348435-kube-api-access-5kqq9\") pod \"rabbitmq-server-0\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " pod="openstack/rabbitmq-server-0" Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.216396 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/342a4872-4478-4b3a-a984-7fd457348435-pod-info\") pod \"rabbitmq-server-0\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " pod="openstack/rabbitmq-server-0" Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.216420 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/342a4872-4478-4b3a-a984-7fd457348435-config-data\") pod \"rabbitmq-server-0\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " pod="openstack/rabbitmq-server-0" Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.216847 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/342a4872-4478-4b3a-a984-7fd457348435-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " pod="openstack/rabbitmq-server-0" Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.224358 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-server-0" Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.235620 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/342a4872-4478-4b3a-a984-7fd457348435-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " pod="openstack/rabbitmq-server-0" Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.237712 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/342a4872-4478-4b3a-a984-7fd457348435-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " pod="openstack/rabbitmq-server-0" Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.239875 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gp47r"] Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.241860 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/342a4872-4478-4b3a-a984-7fd457348435-server-conf\") pod \"rabbitmq-server-0\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " pod="openstack/rabbitmq-server-0" Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.241975 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/342a4872-4478-4b3a-a984-7fd457348435-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " pod="openstack/rabbitmq-server-0" Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.242595 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/342a4872-4478-4b3a-a984-7fd457348435-config-data\") pod \"rabbitmq-server-0\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " pod="openstack/rabbitmq-server-0" Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.252690 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/342a4872-4478-4b3a-a984-7fd457348435-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " pod="openstack/rabbitmq-server-0" Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.252745 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/342a4872-4478-4b3a-a984-7fd457348435-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " pod="openstack/rabbitmq-server-0" Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.252969 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kqq9\" (UniqueName: \"kubernetes.io/projected/342a4872-4478-4b3a-a984-7fd457348435-kube-api-access-5kqq9\") pod \"rabbitmq-server-0\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " pod="openstack/rabbitmq-server-0" Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.253190 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/342a4872-4478-4b3a-a984-7fd457348435-pod-info\") pod \"rabbitmq-server-0\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " pod="openstack/rabbitmq-server-0" Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.253260 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/342a4872-4478-4b3a-a984-7fd457348435-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " pod="openstack/rabbitmq-server-0" Dec 05 12:07:49 crc kubenswrapper[4763]: W1205 12:07:49.254020 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2de9b0c2_f628_4ae1_9b10_f62e7360eb7a.slice/crio-e54766f591ff9b580c40f46cd996b7a605672f9baae3dfb86120be19fa13ab34 WatchSource:0}: Error finding container e54766f591ff9b580c40f46cd996b7a605672f9baae3dfb86120be19fa13ab34: Status 404 returned error can't find the container with id e54766f591ff9b580c40f46cd996b7a605672f9baae3dfb86120be19fa13ab34 Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.262246 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " pod="openstack/rabbitmq-server-0" Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.348120 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.357176 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.359928 4763 util.go:30] "No sandbox for pod can be found. 
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.365084 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.365188 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.365225 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.365319 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.366094 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.366413 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-db6cw"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.366668 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.369792 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.524064 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/85c70640-8bf7-419d-a96f-69ac3278710c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.524129 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/85c70640-8bf7-419d-a96f-69ac3278710c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.524157 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/85c70640-8bf7-419d-a96f-69ac3278710c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.524197 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/85c70640-8bf7-419d-a96f-69ac3278710c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.524223 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/85c70640-8bf7-419d-a96f-69ac3278710c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.524650 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/85c70640-8bf7-419d-a96f-69ac3278710c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.524909 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/85c70640-8bf7-419d-a96f-69ac3278710c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.525022 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85c70640-8bf7-419d-a96f-69ac3278710c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.525146 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s9dm\" (UniqueName: \"kubernetes.io/projected/85c70640-8bf7-419d-a96f-69ac3278710c-kube-api-access-7s9dm\") pod \"rabbitmq-cell1-server-0\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.525237 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.525333 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/85c70640-8bf7-419d-a96f-69ac3278710c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.651712 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/85c70640-8bf7-419d-a96f-69ac3278710c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.651786 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/85c70640-8bf7-419d-a96f-69ac3278710c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.651848 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/85c70640-8bf7-419d-a96f-69ac3278710c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.651877 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/85c70640-8bf7-419d-a96f-69ac3278710c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.651905 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85c70640-8bf7-419d-a96f-69ac3278710c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.651945 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s9dm\" (UniqueName: \"kubernetes.io/projected/85c70640-8bf7-419d-a96f-69ac3278710c-kube-api-access-7s9dm\") pod \"rabbitmq-cell1-server-0\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.651976 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.651999 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/85c70640-8bf7-419d-a96f-69ac3278710c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.652024 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/85c70640-8bf7-419d-a96f-69ac3278710c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.652057 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/85c70640-8bf7-419d-a96f-69ac3278710c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.652079 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/85c70640-8bf7-419d-a96f-69ac3278710c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.652881 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/85c70640-8bf7-419d-a96f-69ac3278710c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.652992 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.653692 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/85c70640-8bf7-419d-a96f-69ac3278710c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.653976 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/85c70640-8bf7-419d-a96f-69ac3278710c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.654330 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/85c70640-8bf7-419d-a96f-69ac3278710c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.654464 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85c70640-8bf7-419d-a96f-69ac3278710c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.656328 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/85c70640-8bf7-419d-a96f-69ac3278710c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.657386 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/85c70640-8bf7-419d-a96f-69ac3278710c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.658102 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/85c70640-8bf7-419d-a96f-69ac3278710c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.658740 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/85c70640-8bf7-419d-a96f-69ac3278710c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.670994 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s9dm\" (UniqueName: \"kubernetes.io/projected/85c70640-8bf7-419d-a96f-69ac3278710c-kube-api-access-7s9dm\") pod \"rabbitmq-cell1-server-0\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.679547 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.710830 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 05 12:07:49 crc kubenswrapper[4763]: I1205 12:07:49.954576 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 05 12:07:49 crc kubenswrapper[4763]: W1205 12:07:49.993358 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod342a4872_4478_4b3a_a984_7fd457348435.slice/crio-1b8bce4dccb5009d80d93bdc31b23ae5c5b29d1c10c18710b83fbe9d927a3cc0 WatchSource:0}: Error finding container 1b8bce4dccb5009d80d93bdc31b23ae5c5b29d1c10c18710b83fbe9d927a3cc0: Status 404 returned error can't find the container with id 1b8bce4dccb5009d80d93bdc31b23ae5c5b29d1c10c18710b83fbe9d927a3cc0
Dec 05 12:07:50 crc kubenswrapper[4763]: I1205 12:07:50.230944 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gp47r" event={"ID":"2de9b0c2-f628-4ae1-9b10-f62e7360eb7a","Type":"ContainerStarted","Data":"e54766f591ff9b580c40f46cd996b7a605672f9baae3dfb86120be19fa13ab34"}
Dec 05 12:07:50 crc kubenswrapper[4763]: I1205 12:07:50.232559 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"342a4872-4478-4b3a-a984-7fd457348435","Type":"ContainerStarted","Data":"1b8bce4dccb5009d80d93bdc31b23ae5c5b29d1c10c18710b83fbe9d927a3cc0"}
Dec 05 12:07:50 crc kubenswrapper[4763]: I1205 12:07:50.263947 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 05 12:07:50 crc kubenswrapper[4763]: W1205 12:07:50.272842 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85c70640_8bf7_419d_a96f_69ac3278710c.slice/crio-237de87d14b63e636c69c36d1a31e91b68864802a658ea9059660bcbae9092d4 WatchSource:0}: Error finding container 237de87d14b63e636c69c36d1a31e91b68864802a658ea9059660bcbae9092d4: Status 404 returned error can't find the container with id 237de87d14b63e636c69c36d1a31e91b68864802a658ea9059660bcbae9092d4
Dec 05 12:07:50 crc kubenswrapper[4763]: I1205 12:07:50.462457 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Dec 05 12:07:50 crc kubenswrapper[4763]: I1205 12:07:50.464543 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
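Each container id named in a "Failed to process watch event ... Status 404" warning in this capture reappears shortly afterwards in a "SyncLoop (PLEG): event for pod" line as ContainerStarted (4165b13c... and e54766f5... above, 1b8bce4d... here, 237de87d... just below), so these warnings read as the cgroup watcher racing CRI-O's container registration rather than as real failures. A sketch that flags only the ids which never resolve, under the same hypothetical kubelet.log assumption:

import re

# A 404 watch warning names a 64-hex container id; a later PLEG
# ContainerStarted event carries the same id in its Data field.
WATCH_RE = re.compile(r"Error finding container (?P<cid>[0-9a-f]{64})")
PLEG_RE = re.compile(r'"Type":"ContainerStarted","Data":"(?P<cid>[0-9a-f]{64})"')

def unresolved_watch_warnings(path="kubelet.log"):
    """Container ids with a 404 watch warning but no ContainerStarted event."""
    warned, started = set(), set()
    with open(path) as fh:
        for line in fh:
            if (m := WATCH_RE.search(line)):
                warned.add(m.group("cid"))
            if (m := PLEG_RE.search(line)):
                started.add(m.group("cid"))
    return warned - started

On this section the function returns an empty set, i.e. every warning was transient.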
Dec 05 12:07:50 crc kubenswrapper[4763]: I1205 12:07:50.468118 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Dec 05 12:07:50 crc kubenswrapper[4763]: I1205 12:07:50.468405 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-dqdbv"
Dec 05 12:07:50 crc kubenswrapper[4763]: I1205 12:07:50.468464 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Dec 05 12:07:50 crc kubenswrapper[4763]: I1205 12:07:50.469110 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Dec 05 12:07:50 crc kubenswrapper[4763]: I1205 12:07:50.470484 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Dec 05 12:07:50 crc kubenswrapper[4763]: I1205 12:07:50.478631 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Dec 05 12:07:50 crc kubenswrapper[4763]: I1205 12:07:50.614884 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d5f22311-2f88-40cb-a35d-e0609433db1a-kolla-config\") pod \"openstack-galera-0\" (UID: \"d5f22311-2f88-40cb-a35d-e0609433db1a\") " pod="openstack/openstack-galera-0"
Dec 05 12:07:50 crc kubenswrapper[4763]: I1205 12:07:50.614961 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h5zs\" (UniqueName: \"kubernetes.io/projected/d5f22311-2f88-40cb-a35d-e0609433db1a-kube-api-access-5h5zs\") pod \"openstack-galera-0\" (UID: \"d5f22311-2f88-40cb-a35d-e0609433db1a\") " pod="openstack/openstack-galera-0"
Dec 05 12:07:50 crc kubenswrapper[4763]: I1205 12:07:50.614999 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"d5f22311-2f88-40cb-a35d-e0609433db1a\") " pod="openstack/openstack-galera-0"
Dec 05 12:07:50 crc kubenswrapper[4763]: I1205 12:07:50.615036 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5f22311-2f88-40cb-a35d-e0609433db1a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d5f22311-2f88-40cb-a35d-e0609433db1a\") " pod="openstack/openstack-galera-0"
Dec 05 12:07:50 crc kubenswrapper[4763]: I1205 12:07:50.615103 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d5f22311-2f88-40cb-a35d-e0609433db1a-config-data-default\") pod \"openstack-galera-0\" (UID: \"d5f22311-2f88-40cb-a35d-e0609433db1a\") " pod="openstack/openstack-galera-0"
Dec 05 12:07:50 crc kubenswrapper[4763]: I1205 12:07:50.615178 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5f22311-2f88-40cb-a35d-e0609433db1a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d5f22311-2f88-40cb-a35d-e0609433db1a\") " pod="openstack/openstack-galera-0"
Dec 05 12:07:50 crc kubenswrapper[4763]: I1205 12:07:50.616247 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d5f22311-2f88-40cb-a35d-e0609433db1a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d5f22311-2f88-40cb-a35d-e0609433db1a\") " pod="openstack/openstack-galera-0"
Dec 05 12:07:50 crc kubenswrapper[4763]: I1205 12:07:50.616337 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5f22311-2f88-40cb-a35d-e0609433db1a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d5f22311-2f88-40cb-a35d-e0609433db1a\") " pod="openstack/openstack-galera-0"
Dec 05 12:07:50 crc kubenswrapper[4763]: I1205 12:07:50.720821 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h5zs\" (UniqueName: \"kubernetes.io/projected/d5f22311-2f88-40cb-a35d-e0609433db1a-kube-api-access-5h5zs\") pod \"openstack-galera-0\" (UID: \"d5f22311-2f88-40cb-a35d-e0609433db1a\") " pod="openstack/openstack-galera-0"
Dec 05 12:07:50 crc kubenswrapper[4763]: I1205 12:07:50.720882 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"d5f22311-2f88-40cb-a35d-e0609433db1a\") " pod="openstack/openstack-galera-0"
Dec 05 12:07:50 crc kubenswrapper[4763]: I1205 12:07:50.720921 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5f22311-2f88-40cb-a35d-e0609433db1a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d5f22311-2f88-40cb-a35d-e0609433db1a\") " pod="openstack/openstack-galera-0"
Dec 05 12:07:50 crc kubenswrapper[4763]: I1205 12:07:50.720955 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d5f22311-2f88-40cb-a35d-e0609433db1a-config-data-default\") pod \"openstack-galera-0\" (UID: \"d5f22311-2f88-40cb-a35d-e0609433db1a\") " pod="openstack/openstack-galera-0"
Dec 05 12:07:50 crc kubenswrapper[4763]: I1205 12:07:50.721014 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5f22311-2f88-40cb-a35d-e0609433db1a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d5f22311-2f88-40cb-a35d-e0609433db1a\") " pod="openstack/openstack-galera-0"
Dec 05 12:07:50 crc kubenswrapper[4763]: I1205 12:07:50.721041 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d5f22311-2f88-40cb-a35d-e0609433db1a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d5f22311-2f88-40cb-a35d-e0609433db1a\") " pod="openstack/openstack-galera-0"
Dec 05 12:07:50 crc kubenswrapper[4763]: I1205 12:07:50.721096 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5f22311-2f88-40cb-a35d-e0609433db1a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d5f22311-2f88-40cb-a35d-e0609433db1a\") " pod="openstack/openstack-galera-0"
Dec 05 12:07:50 crc kubenswrapper[4763]: I1205 12:07:50.721128 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d5f22311-2f88-40cb-a35d-e0609433db1a-kolla-config\") pod \"openstack-galera-0\" (UID: \"d5f22311-2f88-40cb-a35d-e0609433db1a\") " pod="openstack/openstack-galera-0"
Dec 05 12:07:50 crc kubenswrapper[4763]: I1205 12:07:50.721294 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"d5f22311-2f88-40cb-a35d-e0609433db1a\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-galera-0"
Dec 05 12:07:50 crc kubenswrapper[4763]: I1205 12:07:50.722861 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5f22311-2f88-40cb-a35d-e0609433db1a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d5f22311-2f88-40cb-a35d-e0609433db1a\") " pod="openstack/openstack-galera-0"
Dec 05 12:07:50 crc kubenswrapper[4763]: I1205 12:07:50.724450 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d5f22311-2f88-40cb-a35d-e0609433db1a-config-data-default\") pod \"openstack-galera-0\" (UID: \"d5f22311-2f88-40cb-a35d-e0609433db1a\") " pod="openstack/openstack-galera-0"
Dec 05 12:07:50 crc kubenswrapper[4763]: I1205 12:07:50.750407 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d5f22311-2f88-40cb-a35d-e0609433db1a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d5f22311-2f88-40cb-a35d-e0609433db1a\") " pod="openstack/openstack-galera-0"
Dec 05 12:07:50 crc kubenswrapper[4763]: I1205 12:07:50.751062 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d5f22311-2f88-40cb-a35d-e0609433db1a-kolla-config\") pod \"openstack-galera-0\" (UID: \"d5f22311-2f88-40cb-a35d-e0609433db1a\") " pod="openstack/openstack-galera-0"
Dec 05 12:07:50 crc kubenswrapper[4763]: I1205 12:07:50.761434 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5f22311-2f88-40cb-a35d-e0609433db1a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d5f22311-2f88-40cb-a35d-e0609433db1a\") " pod="openstack/openstack-galera-0"
Dec 05 12:07:50 crc kubenswrapper[4763]: I1205 12:07:50.841255 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5f22311-2f88-40cb-a35d-e0609433db1a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d5f22311-2f88-40cb-a35d-e0609433db1a\") " pod="openstack/openstack-galera-0"
Dec 05 12:07:50 crc kubenswrapper[4763]: I1205 12:07:50.878076 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h5zs\" (UniqueName: \"kubernetes.io/projected/d5f22311-2f88-40cb-a35d-e0609433db1a-kube-api-access-5h5zs\") pod \"openstack-galera-0\" (UID: \"d5f22311-2f88-40cb-a35d-e0609433db1a\") " pod="openstack/openstack-galera-0"
Dec 05 12:07:50 crc kubenswrapper[4763]: I1205 12:07:50.892302 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"d5f22311-2f88-40cb-a35d-e0609433db1a\") " pod="openstack/openstack-galera-0"
Dec 05 12:07:51 crc kubenswrapper[4763]: I1205 12:07:51.091373 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Dec 05 12:07:51 crc kubenswrapper[4763]: I1205 12:07:51.255975 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"85c70640-8bf7-419d-a96f-69ac3278710c","Type":"ContainerStarted","Data":"237de87d14b63e636c69c36d1a31e91b68864802a658ea9059660bcbae9092d4"}
Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.030739 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.032538 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.043089 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.043304 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.044061 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.044290 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-vc89h"
Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.056340 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.162927 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a066fad3-20a3-41d6-852d-7196f8445e2a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a066fad3-20a3-41d6-852d-7196f8445e2a\") " pod="openstack/openstack-cell1-galera-0"
Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.163008 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a066fad3-20a3-41d6-852d-7196f8445e2a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a066fad3-20a3-41d6-852d-7196f8445e2a\") " pod="openstack/openstack-cell1-galera-0"
Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.163036 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a066fad3-20a3-41d6-852d-7196f8445e2a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a066fad3-20a3-41d6-852d-7196f8445e2a\") " pod="openstack/openstack-cell1-galera-0"
Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.163058 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a066fad3-20a3-41d6-852d-7196f8445e2a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a066fad3-20a3-41d6-852d-7196f8445e2a\") " pod="openstack/openstack-cell1-galera-0"
Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.163291 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a066fad3-20a3-41d6-852d-7196f8445e2a\") " pod="openstack/openstack-cell1-galera-0"
Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.163336 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a066fad3-20a3-41d6-852d-7196f8445e2a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a066fad3-20a3-41d6-852d-7196f8445e2a\") " pod="openstack/openstack-cell1-galera-0"
Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.163373 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a066fad3-20a3-41d6-852d-7196f8445e2a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a066fad3-20a3-41d6-852d-7196f8445e2a\") " pod="openstack/openstack-cell1-galera-0"
Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.163400 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4wjw\" (UniqueName: \"kubernetes.io/projected/a066fad3-20a3-41d6-852d-7196f8445e2a-kube-api-access-q4wjw\") pod \"openstack-cell1-galera-0\" (UID: \"a066fad3-20a3-41d6-852d-7196f8445e2a\") " pod="openstack/openstack-cell1-galera-0"
Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.235747 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.236704 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.240324 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.240576 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-fx8bd"
Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.240803 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc"
Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.252934 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.271233 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a066fad3-20a3-41d6-852d-7196f8445e2a\") " pod="openstack/openstack-cell1-galera-0"
Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.271308 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a066fad3-20a3-41d6-852d-7196f8445e2a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a066fad3-20a3-41d6-852d-7196f8445e2a\") " pod="openstack/openstack-cell1-galera-0"
Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.271366 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a066fad3-20a3-41d6-852d-7196f8445e2a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a066fad3-20a3-41d6-852d-7196f8445e2a\") " pod="openstack/openstack-cell1-galera-0"
Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.271418 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4wjw\" (UniqueName: \"kubernetes.io/projected/a066fad3-20a3-41d6-852d-7196f8445e2a-kube-api-access-q4wjw\") pod \"openstack-cell1-galera-0\" (UID: \"a066fad3-20a3-41d6-852d-7196f8445e2a\") " pod="openstack/openstack-cell1-galera-0"
Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.271525 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a066fad3-20a3-41d6-852d-7196f8445e2a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a066fad3-20a3-41d6-852d-7196f8445e2a\") " pod="openstack/openstack-cell1-galera-0"
Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.271613 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a066fad3-20a3-41d6-852d-7196f8445e2a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a066fad3-20a3-41d6-852d-7196f8445e2a\") " pod="openstack/openstack-cell1-galera-0"
Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.271697 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a066fad3-20a3-41d6-852d-7196f8445e2a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a066fad3-20a3-41d6-852d-7196f8445e2a\") " pod="openstack/openstack-cell1-galera-0"
Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.271752 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a066fad3-20a3-41d6-852d-7196f8445e2a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a066fad3-20a3-41d6-852d-7196f8445e2a\") " pod="openstack/openstack-cell1-galera-0"
Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.274623 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a066fad3-20a3-41d6-852d-7196f8445e2a\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-cell1-galera-0"
Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.275998 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a066fad3-20a3-41d6-852d-7196f8445e2a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a066fad3-20a3-41d6-852d-7196f8445e2a\") " pod="openstack/openstack-cell1-galera-0"
Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.276083 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a066fad3-20a3-41d6-852d-7196f8445e2a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a066fad3-20a3-41d6-852d-7196f8445e2a\") " pod="openstack/openstack-cell1-galera-0"
Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.276600 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a066fad3-20a3-41d6-852d-7196f8445e2a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a066fad3-20a3-41d6-852d-7196f8445e2a\") " pod="openstack/openstack-cell1-galera-0"
Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.280609 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a066fad3-20a3-41d6-852d-7196f8445e2a-operator-scripts\") pod
\"openstack-cell1-galera-0\" (UID: \"a066fad3-20a3-41d6-852d-7196f8445e2a\") " pod="openstack/openstack-cell1-galera-0" Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.283594 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a066fad3-20a3-41d6-852d-7196f8445e2a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a066fad3-20a3-41d6-852d-7196f8445e2a\") " pod="openstack/openstack-cell1-galera-0" Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.300170 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4wjw\" (UniqueName: \"kubernetes.io/projected/a066fad3-20a3-41d6-852d-7196f8445e2a-kube-api-access-q4wjw\") pod \"openstack-cell1-galera-0\" (UID: \"a066fad3-20a3-41d6-852d-7196f8445e2a\") " pod="openstack/openstack-cell1-galera-0" Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.311312 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a066fad3-20a3-41d6-852d-7196f8445e2a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a066fad3-20a3-41d6-852d-7196f8445e2a\") " pod="openstack/openstack-cell1-galera-0" Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.350514 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a066fad3-20a3-41d6-852d-7196f8445e2a\") " pod="openstack/openstack-cell1-galera-0" Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.366706 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.374531 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cc4cab9d-2172-424c-88ca-962ec052d0c3-kolla-config\") pod \"memcached-0\" (UID: \"cc4cab9d-2172-424c-88ca-962ec052d0c3\") " pod="openstack/memcached-0" Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.374682 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-447kt\" (UniqueName: \"kubernetes.io/projected/cc4cab9d-2172-424c-88ca-962ec052d0c3-kube-api-access-447kt\") pod \"memcached-0\" (UID: \"cc4cab9d-2172-424c-88ca-962ec052d0c3\") " pod="openstack/memcached-0" Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.374717 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cc4cab9d-2172-424c-88ca-962ec052d0c3-config-data\") pod \"memcached-0\" (UID: \"cc4cab9d-2172-424c-88ca-962ec052d0c3\") " pod="openstack/memcached-0" Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.374837 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc4cab9d-2172-424c-88ca-962ec052d0c3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"cc4cab9d-2172-424c-88ca-962ec052d0c3\") " pod="openstack/memcached-0" Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.374915 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cc4cab9d-2172-424c-88ca-962ec052d0c3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"cc4cab9d-2172-424c-88ca-962ec052d0c3\") " pod="openstack/memcached-0" Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.476359 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc4cab9d-2172-424c-88ca-962ec052d0c3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"cc4cab9d-2172-424c-88ca-962ec052d0c3\") " pod="openstack/memcached-0" Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.476435 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cc4cab9d-2172-424c-88ca-962ec052d0c3-kolla-config\") pod \"memcached-0\" (UID: \"cc4cab9d-2172-424c-88ca-962ec052d0c3\") " pod="openstack/memcached-0" Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.476457 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-447kt\" (UniqueName: \"kubernetes.io/projected/cc4cab9d-2172-424c-88ca-962ec052d0c3-kube-api-access-447kt\") pod \"memcached-0\" (UID: \"cc4cab9d-2172-424c-88ca-962ec052d0c3\") " pod="openstack/memcached-0" Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.476484 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cc4cab9d-2172-424c-88ca-962ec052d0c3-config-data\") pod \"memcached-0\" (UID: \"cc4cab9d-2172-424c-88ca-962ec052d0c3\") " pod="openstack/memcached-0" Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.476604 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc4cab9d-2172-424c-88ca-962ec052d0c3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"cc4cab9d-2172-424c-88ca-962ec052d0c3\") " pod="openstack/memcached-0" Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.477956 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cc4cab9d-2172-424c-88ca-962ec052d0c3-config-data\") pod \"memcached-0\" (UID: \"cc4cab9d-2172-424c-88ca-962ec052d0c3\") " pod="openstack/memcached-0" Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.479154 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cc4cab9d-2172-424c-88ca-962ec052d0c3-kolla-config\") pod \"memcached-0\" (UID: \"cc4cab9d-2172-424c-88ca-962ec052d0c3\") " pod="openstack/memcached-0" Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.479695 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc4cab9d-2172-424c-88ca-962ec052d0c3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"cc4cab9d-2172-424c-88ca-962ec052d0c3\") " pod="openstack/memcached-0" Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.483734 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc4cab9d-2172-424c-88ca-962ec052d0c3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"cc4cab9d-2172-424c-88ca-962ec052d0c3\") " pod="openstack/memcached-0" Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.501325 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-447kt\" (UniqueName: 
\"kubernetes.io/projected/cc4cab9d-2172-424c-88ca-962ec052d0c3-kube-api-access-447kt\") pod \"memcached-0\" (UID: \"cc4cab9d-2172-424c-88ca-962ec052d0c3\") " pod="openstack/memcached-0" Dec 05 12:07:52 crc kubenswrapper[4763]: I1205 12:07:52.571574 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 05 12:07:54 crc kubenswrapper[4763]: I1205 12:07:54.462429 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 12:07:54 crc kubenswrapper[4763]: I1205 12:07:54.464298 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 12:07:54 crc kubenswrapper[4763]: I1205 12:07:54.475366 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-fl5s6" Dec 05 12:07:54 crc kubenswrapper[4763]: I1205 12:07:54.498214 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 12:07:54 crc kubenswrapper[4763]: I1205 12:07:54.626644 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkl6c\" (UniqueName: \"kubernetes.io/projected/6c917b4b-cb2c-4d9e-91b8-5fa2f1f61975-kube-api-access-zkl6c\") pod \"kube-state-metrics-0\" (UID: \"6c917b4b-cb2c-4d9e-91b8-5fa2f1f61975\") " pod="openstack/kube-state-metrics-0" Dec 05 12:07:54 crc kubenswrapper[4763]: I1205 12:07:54.728605 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkl6c\" (UniqueName: \"kubernetes.io/projected/6c917b4b-cb2c-4d9e-91b8-5fa2f1f61975-kube-api-access-zkl6c\") pod \"kube-state-metrics-0\" (UID: \"6c917b4b-cb2c-4d9e-91b8-5fa2f1f61975\") " pod="openstack/kube-state-metrics-0" Dec 05 12:07:54 crc kubenswrapper[4763]: I1205 12:07:54.780497 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkl6c\" (UniqueName: \"kubernetes.io/projected/6c917b4b-cb2c-4d9e-91b8-5fa2f1f61975-kube-api-access-zkl6c\") pod \"kube-state-metrics-0\" (UID: \"6c917b4b-cb2c-4d9e-91b8-5fa2f1f61975\") " pod="openstack/kube-state-metrics-0" Dec 05 12:07:54 crc kubenswrapper[4763]: I1205 12:07:54.820975 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 12:07:55 crc kubenswrapper[4763]: I1205 12:07:55.874122 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 05 12:07:55 crc kubenswrapper[4763]: I1205 12:07:55.876375 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 05 12:07:55 crc kubenswrapper[4763]: I1205 12:07:55.883777 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 05 12:07:55 crc kubenswrapper[4763]: I1205 12:07:55.883844 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 05 12:07:55 crc kubenswrapper[4763]: I1205 12:07:55.884081 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 05 12:07:55 crc kubenswrapper[4763]: I1205 12:07:55.886719 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-htrxx" Dec 05 12:07:55 crc kubenswrapper[4763]: I1205 12:07:55.887005 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 05 12:07:55 crc kubenswrapper[4763]: I1205 12:07:55.888815 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 05 12:07:55 crc kubenswrapper[4763]: I1205 12:07:55.897294 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 05 12:07:55 crc kubenswrapper[4763]: I1205 12:07:55.964217 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlbjb\" (UniqueName: \"kubernetes.io/projected/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-kube-api-access-hlbjb\") pod \"prometheus-metric-storage-0\" (UID: \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:07:55 crc kubenswrapper[4763]: I1205 12:07:55.964283 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:07:55 crc kubenswrapper[4763]: I1205 12:07:55.964321 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:07:55 crc kubenswrapper[4763]: I1205 12:07:55.964353 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:07:55 crc kubenswrapper[4763]: I1205 12:07:55.964414 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:07:55 crc kubenswrapper[4763]: I1205 12:07:55.964458 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-config\") pod \"prometheus-metric-storage-0\" (UID: \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:07:55 crc kubenswrapper[4763]: I1205 12:07:55.964493 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:07:55 crc kubenswrapper[4763]: I1205 12:07:55.964528 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ff5f488f-885a-43ee-9a04-43ba44d78789\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ff5f488f-885a-43ee-9a04-43ba44d78789\") pod \"prometheus-metric-storage-0\" (UID: \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.065869 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ff5f488f-885a-43ee-9a04-43ba44d78789\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ff5f488f-885a-43ee-9a04-43ba44d78789\") pod \"prometheus-metric-storage-0\" (UID: \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.065948 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlbjb\" (UniqueName: \"kubernetes.io/projected/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-kube-api-access-hlbjb\") pod \"prometheus-metric-storage-0\" (UID: \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.065978 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.066015 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.066048 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.066108 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\") " 
pod="openstack/prometheus-metric-storage-0" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.066156 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-config\") pod \"prometheus-metric-storage-0\" (UID: \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.066194 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.067558 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.075625 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.076492 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.082191 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.084121 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.084627 4763 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.084653 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ff5f488f-885a-43ee-9a04-43ba44d78789\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ff5f488f-885a-43ee-9a04-43ba44d78789\") pod \"prometheus-metric-storage-0\" (UID: \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f78e37f2b579c8ad937827f0ee3e8c91bfbfeaf465b3046f4f7d0e3c34229d24/globalmount\"" pod="openstack/prometheus-metric-storage-0" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.086774 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlbjb\" (UniqueName: \"kubernetes.io/projected/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-kube-api-access-hlbjb\") pod \"prometheus-metric-storage-0\" (UID: \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.095166 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-config\") pod \"prometheus-metric-storage-0\" (UID: \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.190628 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ff5f488f-885a-43ee-9a04-43ba44d78789\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ff5f488f-885a-43ee-9a04-43ba44d78789\") pod \"prometheus-metric-storage-0\" (UID: \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.204022 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.399140 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-6gw4w"] Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.400544 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6gw4w" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.401351 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6gw4w"] Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.404011 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.404147 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-pqf7l" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.404817 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.470579 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-dzkm7"] Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.483432 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9acbf99-ec01-4de6-9d45-418664511586-combined-ca-bundle\") pod \"ovn-controller-6gw4w\" (UID: \"c9acbf99-ec01-4de6-9d45-418664511586\") " pod="openstack/ovn-controller-6gw4w" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.483646 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z8tq\" (UniqueName: \"kubernetes.io/projected/c9acbf99-ec01-4de6-9d45-418664511586-kube-api-access-7z8tq\") pod \"ovn-controller-6gw4w\" (UID: \"c9acbf99-ec01-4de6-9d45-418664511586\") " pod="openstack/ovn-controller-6gw4w" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.483673 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c9acbf99-ec01-4de6-9d45-418664511586-var-run\") pod \"ovn-controller-6gw4w\" (UID: \"c9acbf99-ec01-4de6-9d45-418664511586\") " pod="openstack/ovn-controller-6gw4w" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.483710 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c9acbf99-ec01-4de6-9d45-418664511586-var-log-ovn\") pod \"ovn-controller-6gw4w\" (UID: \"c9acbf99-ec01-4de6-9d45-418664511586\") " pod="openstack/ovn-controller-6gw4w" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.483783 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c9acbf99-ec01-4de6-9d45-418664511586-scripts\") pod \"ovn-controller-6gw4w\" (UID: \"c9acbf99-ec01-4de6-9d45-418664511586\") " pod="openstack/ovn-controller-6gw4w" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.483847 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c9acbf99-ec01-4de6-9d45-418664511586-var-run-ovn\") pod \"ovn-controller-6gw4w\" (UID: \"c9acbf99-ec01-4de6-9d45-418664511586\") " pod="openstack/ovn-controller-6gw4w" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.483887 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9acbf99-ec01-4de6-9d45-418664511586-ovn-controller-tls-certs\") pod \"ovn-controller-6gw4w\" 
(UID: \"c9acbf99-ec01-4de6-9d45-418664511586\") " pod="openstack/ovn-controller-6gw4w" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.487088 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-dzkm7" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.493677 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-dzkm7"] Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.585294 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/80ec3b73-a380-499b-b4d1-054a6b2ab4a6-etc-ovs\") pod \"ovn-controller-ovs-dzkm7\" (UID: \"80ec3b73-a380-499b-b4d1-054a6b2ab4a6\") " pod="openstack/ovn-controller-ovs-dzkm7" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.585339 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/80ec3b73-a380-499b-b4d1-054a6b2ab4a6-var-lib\") pod \"ovn-controller-ovs-dzkm7\" (UID: \"80ec3b73-a380-499b-b4d1-054a6b2ab4a6\") " pod="openstack/ovn-controller-ovs-dzkm7" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.585372 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z8tq\" (UniqueName: \"kubernetes.io/projected/c9acbf99-ec01-4de6-9d45-418664511586-kube-api-access-7z8tq\") pod \"ovn-controller-6gw4w\" (UID: \"c9acbf99-ec01-4de6-9d45-418664511586\") " pod="openstack/ovn-controller-6gw4w" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.585396 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c9acbf99-ec01-4de6-9d45-418664511586-var-log-ovn\") pod \"ovn-controller-6gw4w\" (UID: \"c9acbf99-ec01-4de6-9d45-418664511586\") " pod="openstack/ovn-controller-6gw4w" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.585460 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9acbf99-ec01-4de6-9d45-418664511586-combined-ca-bundle\") pod \"ovn-controller-6gw4w\" (UID: \"c9acbf99-ec01-4de6-9d45-418664511586\") " pod="openstack/ovn-controller-6gw4w" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.585488 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/80ec3b73-a380-499b-b4d1-054a6b2ab4a6-var-log\") pod \"ovn-controller-ovs-dzkm7\" (UID: \"80ec3b73-a380-499b-b4d1-054a6b2ab4a6\") " pod="openstack/ovn-controller-ovs-dzkm7" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.585507 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/80ec3b73-a380-499b-b4d1-054a6b2ab4a6-var-run\") pod \"ovn-controller-ovs-dzkm7\" (UID: \"80ec3b73-a380-499b-b4d1-054a6b2ab4a6\") " pod="openstack/ovn-controller-ovs-dzkm7" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.585528 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/80ec3b73-a380-499b-b4d1-054a6b2ab4a6-scripts\") pod \"ovn-controller-ovs-dzkm7\" (UID: \"80ec3b73-a380-499b-b4d1-054a6b2ab4a6\") " pod="openstack/ovn-controller-ovs-dzkm7" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.585571 
4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c9acbf99-ec01-4de6-9d45-418664511586-var-run\") pod \"ovn-controller-6gw4w\" (UID: \"c9acbf99-ec01-4de6-9d45-418664511586\") " pod="openstack/ovn-controller-6gw4w" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.585607 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c9acbf99-ec01-4de6-9d45-418664511586-scripts\") pod \"ovn-controller-6gw4w\" (UID: \"c9acbf99-ec01-4de6-9d45-418664511586\") " pod="openstack/ovn-controller-6gw4w" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.585628 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c9acbf99-ec01-4de6-9d45-418664511586-var-run-ovn\") pod \"ovn-controller-6gw4w\" (UID: \"c9acbf99-ec01-4de6-9d45-418664511586\") " pod="openstack/ovn-controller-6gw4w" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.585654 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9acbf99-ec01-4de6-9d45-418664511586-ovn-controller-tls-certs\") pod \"ovn-controller-6gw4w\" (UID: \"c9acbf99-ec01-4de6-9d45-418664511586\") " pod="openstack/ovn-controller-6gw4w" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.585688 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts8nm\" (UniqueName: \"kubernetes.io/projected/80ec3b73-a380-499b-b4d1-054a6b2ab4a6-kube-api-access-ts8nm\") pod \"ovn-controller-ovs-dzkm7\" (UID: \"80ec3b73-a380-499b-b4d1-054a6b2ab4a6\") " pod="openstack/ovn-controller-ovs-dzkm7" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.586605 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c9acbf99-ec01-4de6-9d45-418664511586-var-log-ovn\") pod \"ovn-controller-6gw4w\" (UID: \"c9acbf99-ec01-4de6-9d45-418664511586\") " pod="openstack/ovn-controller-6gw4w" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.586675 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c9acbf99-ec01-4de6-9d45-418664511586-var-run-ovn\") pod \"ovn-controller-6gw4w\" (UID: \"c9acbf99-ec01-4de6-9d45-418664511586\") " pod="openstack/ovn-controller-6gw4w" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.586865 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c9acbf99-ec01-4de6-9d45-418664511586-var-run\") pod \"ovn-controller-6gw4w\" (UID: \"c9acbf99-ec01-4de6-9d45-418664511586\") " pod="openstack/ovn-controller-6gw4w" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.588142 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c9acbf99-ec01-4de6-9d45-418664511586-scripts\") pod \"ovn-controller-6gw4w\" (UID: \"c9acbf99-ec01-4de6-9d45-418664511586\") " pod="openstack/ovn-controller-6gw4w" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.592645 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9acbf99-ec01-4de6-9d45-418664511586-ovn-controller-tls-certs\") pod \"ovn-controller-6gw4w\" (UID: 
\"c9acbf99-ec01-4de6-9d45-418664511586\") " pod="openstack/ovn-controller-6gw4w" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.609208 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9acbf99-ec01-4de6-9d45-418664511586-combined-ca-bundle\") pod \"ovn-controller-6gw4w\" (UID: \"c9acbf99-ec01-4de6-9d45-418664511586\") " pod="openstack/ovn-controller-6gw4w" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.615867 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z8tq\" (UniqueName: \"kubernetes.io/projected/c9acbf99-ec01-4de6-9d45-418664511586-kube-api-access-7z8tq\") pod \"ovn-controller-6gw4w\" (UID: \"c9acbf99-ec01-4de6-9d45-418664511586\") " pod="openstack/ovn-controller-6gw4w" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.690625 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts8nm\" (UniqueName: \"kubernetes.io/projected/80ec3b73-a380-499b-b4d1-054a6b2ab4a6-kube-api-access-ts8nm\") pod \"ovn-controller-ovs-dzkm7\" (UID: \"80ec3b73-a380-499b-b4d1-054a6b2ab4a6\") " pod="openstack/ovn-controller-ovs-dzkm7" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.690709 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/80ec3b73-a380-499b-b4d1-054a6b2ab4a6-etc-ovs\") pod \"ovn-controller-ovs-dzkm7\" (UID: \"80ec3b73-a380-499b-b4d1-054a6b2ab4a6\") " pod="openstack/ovn-controller-ovs-dzkm7" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.690731 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/80ec3b73-a380-499b-b4d1-054a6b2ab4a6-var-lib\") pod \"ovn-controller-ovs-dzkm7\" (UID: \"80ec3b73-a380-499b-b4d1-054a6b2ab4a6\") " pod="openstack/ovn-controller-ovs-dzkm7" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.690834 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/80ec3b73-a380-499b-b4d1-054a6b2ab4a6-var-log\") pod \"ovn-controller-ovs-dzkm7\" (UID: \"80ec3b73-a380-499b-b4d1-054a6b2ab4a6\") " pod="openstack/ovn-controller-ovs-dzkm7" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.690856 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/80ec3b73-a380-499b-b4d1-054a6b2ab4a6-var-run\") pod \"ovn-controller-ovs-dzkm7\" (UID: \"80ec3b73-a380-499b-b4d1-054a6b2ab4a6\") " pod="openstack/ovn-controller-ovs-dzkm7" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.690875 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/80ec3b73-a380-499b-b4d1-054a6b2ab4a6-scripts\") pod \"ovn-controller-ovs-dzkm7\" (UID: \"80ec3b73-a380-499b-b4d1-054a6b2ab4a6\") " pod="openstack/ovn-controller-ovs-dzkm7" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.690980 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/80ec3b73-a380-499b-b4d1-054a6b2ab4a6-etc-ovs\") pod \"ovn-controller-ovs-dzkm7\" (UID: \"80ec3b73-a380-499b-b4d1-054a6b2ab4a6\") " pod="openstack/ovn-controller-ovs-dzkm7" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.691053 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-run\" (UniqueName: \"kubernetes.io/host-path/80ec3b73-a380-499b-b4d1-054a6b2ab4a6-var-run\") pod \"ovn-controller-ovs-dzkm7\" (UID: \"80ec3b73-a380-499b-b4d1-054a6b2ab4a6\") " pod="openstack/ovn-controller-ovs-dzkm7" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.691226 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/80ec3b73-a380-499b-b4d1-054a6b2ab4a6-var-log\") pod \"ovn-controller-ovs-dzkm7\" (UID: \"80ec3b73-a380-499b-b4d1-054a6b2ab4a6\") " pod="openstack/ovn-controller-ovs-dzkm7" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.691244 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/80ec3b73-a380-499b-b4d1-054a6b2ab4a6-var-lib\") pod \"ovn-controller-ovs-dzkm7\" (UID: \"80ec3b73-a380-499b-b4d1-054a6b2ab4a6\") " pod="openstack/ovn-controller-ovs-dzkm7" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.693172 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/80ec3b73-a380-499b-b4d1-054a6b2ab4a6-scripts\") pod \"ovn-controller-ovs-dzkm7\" (UID: \"80ec3b73-a380-499b-b4d1-054a6b2ab4a6\") " pod="openstack/ovn-controller-ovs-dzkm7" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.711823 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts8nm\" (UniqueName: \"kubernetes.io/projected/80ec3b73-a380-499b-b4d1-054a6b2ab4a6-kube-api-access-ts8nm\") pod \"ovn-controller-ovs-dzkm7\" (UID: \"80ec3b73-a380-499b-b4d1-054a6b2ab4a6\") " pod="openstack/ovn-controller-ovs-dzkm7" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.748752 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6gw4w" Dec 05 12:07:56 crc kubenswrapper[4763]: I1205 12:07:56.834994 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-dzkm7" Dec 05 12:07:57 crc kubenswrapper[4763]: I1205 12:07:57.894737 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 05 12:07:57 crc kubenswrapper[4763]: I1205 12:07:57.896842 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 05 12:07:57 crc kubenswrapper[4763]: I1205 12:07:57.899226 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 05 12:07:57 crc kubenswrapper[4763]: I1205 12:07:57.900681 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 05 12:07:57 crc kubenswrapper[4763]: I1205 12:07:57.901135 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 05 12:07:57 crc kubenswrapper[4763]: I1205 12:07:57.902132 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 05 12:07:57 crc kubenswrapper[4763]: I1205 12:07:57.902380 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-9hv6t" Dec 05 12:07:57 crc kubenswrapper[4763]: I1205 12:07:57.903036 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 05 12:07:58 crc kubenswrapper[4763]: I1205 12:07:58.016752 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9680c542-fe6f-42cb-b48d-e17b80916e50-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9680c542-fe6f-42cb-b48d-e17b80916e50\") " pod="openstack/ovsdbserver-nb-0" Dec 05 12:07:58 crc kubenswrapper[4763]: I1205 12:07:58.016825 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9680c542-fe6f-42cb-b48d-e17b80916e50-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9680c542-fe6f-42cb-b48d-e17b80916e50\") " pod="openstack/ovsdbserver-nb-0" Dec 05 12:07:58 crc kubenswrapper[4763]: I1205 12:07:58.016854 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9680c542-fe6f-42cb-b48d-e17b80916e50-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9680c542-fe6f-42cb-b48d-e17b80916e50\") " pod="openstack/ovsdbserver-nb-0" Dec 05 12:07:58 crc kubenswrapper[4763]: I1205 12:07:58.016878 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9680c542-fe6f-42cb-b48d-e17b80916e50-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9680c542-fe6f-42cb-b48d-e17b80916e50\") " pod="openstack/ovsdbserver-nb-0" Dec 05 12:07:58 crc kubenswrapper[4763]: I1205 12:07:58.016897 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9680c542-fe6f-42cb-b48d-e17b80916e50\") " pod="openstack/ovsdbserver-nb-0" Dec 05 12:07:58 crc kubenswrapper[4763]: I1205 12:07:58.017047 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9680c542-fe6f-42cb-b48d-e17b80916e50-config\") pod \"ovsdbserver-nb-0\" (UID: \"9680c542-fe6f-42cb-b48d-e17b80916e50\") " pod="openstack/ovsdbserver-nb-0" Dec 05 12:07:58 crc kubenswrapper[4763]: I1205 12:07:58.017096 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fpk4x\" (UniqueName: \"kubernetes.io/projected/9680c542-fe6f-42cb-b48d-e17b80916e50-kube-api-access-fpk4x\") pod \"ovsdbserver-nb-0\" (UID: \"9680c542-fe6f-42cb-b48d-e17b80916e50\") " pod="openstack/ovsdbserver-nb-0" Dec 05 12:07:58 crc kubenswrapper[4763]: I1205 12:07:58.017261 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9680c542-fe6f-42cb-b48d-e17b80916e50-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9680c542-fe6f-42cb-b48d-e17b80916e50\") " pod="openstack/ovsdbserver-nb-0" Dec 05 12:07:58 crc kubenswrapper[4763]: I1205 12:07:58.118641 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9680c542-fe6f-42cb-b48d-e17b80916e50-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9680c542-fe6f-42cb-b48d-e17b80916e50\") " pod="openstack/ovsdbserver-nb-0" Dec 05 12:07:58 crc kubenswrapper[4763]: I1205 12:07:58.118708 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9680c542-fe6f-42cb-b48d-e17b80916e50-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9680c542-fe6f-42cb-b48d-e17b80916e50\") " pod="openstack/ovsdbserver-nb-0" Dec 05 12:07:58 crc kubenswrapper[4763]: I1205 12:07:58.118738 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9680c542-fe6f-42cb-b48d-e17b80916e50-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9680c542-fe6f-42cb-b48d-e17b80916e50\") " pod="openstack/ovsdbserver-nb-0" Dec 05 12:07:58 crc kubenswrapper[4763]: I1205 12:07:58.118779 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9680c542-fe6f-42cb-b48d-e17b80916e50-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9680c542-fe6f-42cb-b48d-e17b80916e50\") " pod="openstack/ovsdbserver-nb-0" Dec 05 12:07:58 crc kubenswrapper[4763]: I1205 12:07:58.118806 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9680c542-fe6f-42cb-b48d-e17b80916e50\") " pod="openstack/ovsdbserver-nb-0" Dec 05 12:07:58 crc kubenswrapper[4763]: I1205 12:07:58.118954 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9680c542-fe6f-42cb-b48d-e17b80916e50-config\") pod \"ovsdbserver-nb-0\" (UID: \"9680c542-fe6f-42cb-b48d-e17b80916e50\") " pod="openstack/ovsdbserver-nb-0" Dec 05 12:07:58 crc kubenswrapper[4763]: I1205 12:07:58.118980 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpk4x\" (UniqueName: \"kubernetes.io/projected/9680c542-fe6f-42cb-b48d-e17b80916e50-kube-api-access-fpk4x\") pod \"ovsdbserver-nb-0\" (UID: \"9680c542-fe6f-42cb-b48d-e17b80916e50\") " pod="openstack/ovsdbserver-nb-0" Dec 05 12:07:58 crc kubenswrapper[4763]: I1205 12:07:58.119018 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9680c542-fe6f-42cb-b48d-e17b80916e50-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9680c542-fe6f-42cb-b48d-e17b80916e50\") " pod="openstack/ovsdbserver-nb-0" Dec 05 12:07:58 crc 
kubenswrapper[4763]: I1205 12:07:58.120276 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9680c542-fe6f-42cb-b48d-e17b80916e50-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9680c542-fe6f-42cb-b48d-e17b80916e50\") " pod="openstack/ovsdbserver-nb-0" Dec 05 12:07:58 crc kubenswrapper[4763]: I1205 12:07:58.120833 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9680c542-fe6f-42cb-b48d-e17b80916e50-config\") pod \"ovsdbserver-nb-0\" (UID: \"9680c542-fe6f-42cb-b48d-e17b80916e50\") " pod="openstack/ovsdbserver-nb-0" Dec 05 12:07:58 crc kubenswrapper[4763]: I1205 12:07:58.120929 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9680c542-fe6f-42cb-b48d-e17b80916e50-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9680c542-fe6f-42cb-b48d-e17b80916e50\") " pod="openstack/ovsdbserver-nb-0" Dec 05 12:07:58 crc kubenswrapper[4763]: I1205 12:07:58.120943 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9680c542-fe6f-42cb-b48d-e17b80916e50\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-nb-0" Dec 05 12:07:58 crc kubenswrapper[4763]: I1205 12:07:58.127685 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9680c542-fe6f-42cb-b48d-e17b80916e50-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9680c542-fe6f-42cb-b48d-e17b80916e50\") " pod="openstack/ovsdbserver-nb-0" Dec 05 12:07:58 crc kubenswrapper[4763]: I1205 12:07:58.128637 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9680c542-fe6f-42cb-b48d-e17b80916e50-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9680c542-fe6f-42cb-b48d-e17b80916e50\") " pod="openstack/ovsdbserver-nb-0" Dec 05 12:07:58 crc kubenswrapper[4763]: I1205 12:07:58.145562 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9680c542-fe6f-42cb-b48d-e17b80916e50-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9680c542-fe6f-42cb-b48d-e17b80916e50\") " pod="openstack/ovsdbserver-nb-0" Dec 05 12:07:58 crc kubenswrapper[4763]: I1205 12:07:58.151185 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9680c542-fe6f-42cb-b48d-e17b80916e50\") " pod="openstack/ovsdbserver-nb-0" Dec 05 12:07:58 crc kubenswrapper[4763]: I1205 12:07:58.158026 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpk4x\" (UniqueName: \"kubernetes.io/projected/9680c542-fe6f-42cb-b48d-e17b80916e50-kube-api-access-fpk4x\") pod \"ovsdbserver-nb-0\" (UID: \"9680c542-fe6f-42cb-b48d-e17b80916e50\") " pod="openstack/ovsdbserver-nb-0" Dec 05 12:07:58 crc kubenswrapper[4763]: I1205 12:07:58.227146 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 05 12:08:01 crc kubenswrapper[4763]: I1205 12:08:01.839863 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 05 12:08:01 crc kubenswrapper[4763]: I1205 12:08:01.844358 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 05 12:08:01 crc kubenswrapper[4763]: I1205 12:08:01.848099 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 05 12:08:01 crc kubenswrapper[4763]: I1205 12:08:01.848256 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 05 12:08:01 crc kubenswrapper[4763]: I1205 12:08:01.848375 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 05 12:08:01 crc kubenswrapper[4763]: I1205 12:08:01.848507 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-448mr" Dec 05 12:08:01 crc kubenswrapper[4763]: I1205 12:08:01.868373 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 05 12:08:01 crc kubenswrapper[4763]: I1205 12:08:01.903629 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1\") " pod="openstack/ovsdbserver-sb-0" Dec 05 12:08:01 crc kubenswrapper[4763]: I1205 12:08:01.904043 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1\") " pod="openstack/ovsdbserver-sb-0" Dec 05 12:08:01 crc kubenswrapper[4763]: I1205 12:08:01.904136 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1-config\") pod \"ovsdbserver-sb-0\" (UID: \"a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1\") " pod="openstack/ovsdbserver-sb-0" Dec 05 12:08:01 crc kubenswrapper[4763]: I1205 12:08:01.904220 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76m4s\" (UniqueName: \"kubernetes.io/projected/a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1-kube-api-access-76m4s\") pod \"ovsdbserver-sb-0\" (UID: \"a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1\") " pod="openstack/ovsdbserver-sb-0" Dec 05 12:08:01 crc kubenswrapper[4763]: I1205 12:08:01.904318 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1\") " pod="openstack/ovsdbserver-sb-0" Dec 05 12:08:01 crc kubenswrapper[4763]: I1205 12:08:01.904403 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1\") " pod="openstack/ovsdbserver-sb-0" Dec 05 12:08:01 
crc kubenswrapper[4763]: I1205 12:08:01.904488 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1\") " pod="openstack/ovsdbserver-sb-0" Dec 05 12:08:01 crc kubenswrapper[4763]: I1205 12:08:01.904564 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1\") " pod="openstack/ovsdbserver-sb-0" Dec 05 12:08:02 crc kubenswrapper[4763]: I1205 12:08:02.006110 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1\") " pod="openstack/ovsdbserver-sb-0" Dec 05 12:08:02 crc kubenswrapper[4763]: I1205 12:08:02.006201 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1\") " pod="openstack/ovsdbserver-sb-0" Dec 05 12:08:02 crc kubenswrapper[4763]: I1205 12:08:02.006241 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1\") " pod="openstack/ovsdbserver-sb-0" Dec 05 12:08:02 crc kubenswrapper[4763]: I1205 12:08:02.006259 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1\") " pod="openstack/ovsdbserver-sb-0" Dec 05 12:08:02 crc kubenswrapper[4763]: I1205 12:08:02.006302 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1\") " pod="openstack/ovsdbserver-sb-0" Dec 05 12:08:02 crc kubenswrapper[4763]: I1205 12:08:02.006361 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1\") " pod="openstack/ovsdbserver-sb-0" Dec 05 12:08:02 crc kubenswrapper[4763]: I1205 12:08:02.006377 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1-config\") pod \"ovsdbserver-sb-0\" (UID: \"a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1\") " pod="openstack/ovsdbserver-sb-0" Dec 05 12:08:02 crc kubenswrapper[4763]: I1205 12:08:02.006397 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76m4s\" (UniqueName: \"kubernetes.io/projected/a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1-kube-api-access-76m4s\") pod 
\"ovsdbserver-sb-0\" (UID: \"a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1\") " pod="openstack/ovsdbserver-sb-0" Dec 05 12:08:02 crc kubenswrapper[4763]: I1205 12:08:02.006744 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-sb-0" Dec 05 12:08:02 crc kubenswrapper[4763]: I1205 12:08:02.007709 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1\") " pod="openstack/ovsdbserver-sb-0" Dec 05 12:08:02 crc kubenswrapper[4763]: I1205 12:08:02.007893 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1\") " pod="openstack/ovsdbserver-sb-0" Dec 05 12:08:02 crc kubenswrapper[4763]: I1205 12:08:02.008687 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1-config\") pod \"ovsdbserver-sb-0\" (UID: \"a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1\") " pod="openstack/ovsdbserver-sb-0" Dec 05 12:08:02 crc kubenswrapper[4763]: I1205 12:08:02.016249 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1\") " pod="openstack/ovsdbserver-sb-0" Dec 05 12:08:02 crc kubenswrapper[4763]: I1205 12:08:02.016682 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1\") " pod="openstack/ovsdbserver-sb-0" Dec 05 12:08:02 crc kubenswrapper[4763]: I1205 12:08:02.017534 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1\") " pod="openstack/ovsdbserver-sb-0" Dec 05 12:08:02 crc kubenswrapper[4763]: I1205 12:08:02.026282 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76m4s\" (UniqueName: \"kubernetes.io/projected/a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1-kube-api-access-76m4s\") pod \"ovsdbserver-sb-0\" (UID: \"a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1\") " pod="openstack/ovsdbserver-sb-0" Dec 05 12:08:02 crc kubenswrapper[4763]: I1205 12:08:02.043446 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1\") " pod="openstack/ovsdbserver-sb-0" Dec 05 12:08:02 crc kubenswrapper[4763]: I1205 12:08:02.167247 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 05 12:08:07 crc kubenswrapper[4763]: I1205 12:08:07.548969 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 12:08:07 crc kubenswrapper[4763]: I1205 12:08:07.549794 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 12:08:10 crc kubenswrapper[4763]: I1205 12:08:10.698912 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 05 12:08:14 crc kubenswrapper[4763]: W1205 12:08:14.775239 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d77e8a1_26ef_4525_b427_0a29a9b7a0fc.slice/crio-3e1a697547d2f63fbb07348855dbb19c70d0c506b9124734a4e4b79ee1de20bd WatchSource:0}: Error finding container 3e1a697547d2f63fbb07348855dbb19c70d0c506b9124734a4e4b79ee1de20bd: Status 404 returned error can't find the container with id 3e1a697547d2f63fbb07348855dbb19c70d0c506b9124734a4e4b79ee1de20bd Dec 05 12:08:15 crc kubenswrapper[4763]: I1205 12:08:15.193694 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 05 12:08:15 crc kubenswrapper[4763]: W1205 12:08:15.614409 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5f22311_2f88_40cb_a35d_e0609433db1a.slice/crio-b8281bb92320883f19e69fa8e652ad46dd9705a95c170678f6e6e309c989a742 WatchSource:0}: Error finding container b8281bb92320883f19e69fa8e652ad46dd9705a95c170678f6e6e309c989a742: Status 404 returned error can't find the container with id b8281bb92320883f19e69fa8e652ad46dd9705a95c170678f6e6e309c989a742 Dec 05 12:08:15 crc kubenswrapper[4763]: E1205 12:08:15.664585 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 05 12:08:15 crc kubenswrapper[4763]: E1205 12:08:15.664732 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wsl2r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-ck8ft_openstack(8638355d-3010-48ff-97d5-fc24b34e4743): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 12:08:15 crc kubenswrapper[4763]: E1205 12:08:15.666705 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-ck8ft" podUID="8638355d-3010-48ff-97d5-fc24b34e4743" Dec 05 12:08:15 crc kubenswrapper[4763]: I1205 12:08:15.712018 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d5f22311-2f88-40cb-a35d-e0609433db1a","Type":"ContainerStarted","Data":"b8281bb92320883f19e69fa8e652ad46dd9705a95c170678f6e6e309c989a742"} Dec 05 12:08:15 crc kubenswrapper[4763]: I1205 12:08:15.727328 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc","Type":"ContainerStarted","Data":"3e1a697547d2f63fbb07348855dbb19c70d0c506b9124734a4e4b79ee1de20bd"} Dec 05 12:08:15 crc kubenswrapper[4763]: E1205 12:08:15.730369 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-ck8ft" podUID="8638355d-3010-48ff-97d5-fc24b34e4743" Dec 05 12:08:15 crc kubenswrapper[4763]: E1205 12:08:15.782363 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 05 12:08:15 crc 
kubenswrapper[4763]: E1205 12:08:15.782704 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k8l6h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-gp47r_openstack(2de9b0c2-f628-4ae1-9b10-f62e7360eb7a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 12:08:15 crc kubenswrapper[4763]: E1205 12:08:15.784141 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-gp47r" podUID="2de9b0c2-f628-4ae1-9b10-f62e7360eb7a" Dec 05 12:08:15 crc kubenswrapper[4763]: E1205 12:08:15.846360 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 05 12:08:15 crc kubenswrapper[4763]: E1205 12:08:15.846493 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rrn8r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-ghlqk_openstack(2040bdfc-1aef-4f74-a8bb-1caf1a98193c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 12:08:15 crc kubenswrapper[4763]: E1205 12:08:15.848311 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-ghlqk" podUID="2040bdfc-1aef-4f74-a8bb-1caf1a98193c" Dec 05 12:08:15 crc kubenswrapper[4763]: E1205 12:08:15.904023 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 05 12:08:15 crc kubenswrapper[4763]: E1205 12:08:15.904188 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7wcxb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-j8sm6_openstack(1b5d6232-6824-4892-b75c-e46bdb919016): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 12:08:15 crc kubenswrapper[4763]: E1205 12:08:15.908357 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-j8sm6" podUID="1b5d6232-6824-4892-b75c-e46bdb919016" Dec 05 12:08:16 crc kubenswrapper[4763]: I1205 12:08:16.086763 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-dzkm7"] Dec 05 12:08:16 crc kubenswrapper[4763]: I1205 12:08:16.476285 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6gw4w"] Dec 05 12:08:16 crc kubenswrapper[4763]: I1205 12:08:16.498396 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 12:08:16 crc kubenswrapper[4763]: I1205 12:08:16.523322 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 05 12:08:16 crc kubenswrapper[4763]: I1205 12:08:16.535877 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 05 12:08:16 crc kubenswrapper[4763]: I1205 12:08:16.598138 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 05 12:08:16 crc kubenswrapper[4763]: I1205 12:08:16.751091 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dzkm7" event={"ID":"80ec3b73-a380-499b-b4d1-054a6b2ab4a6","Type":"ContainerStarted","Data":"337aa0eed19d11d49b4478865f2bb0d34831f554dc2dc679463fab05fe9df8f7"} Dec 05 12:08:16 crc kubenswrapper[4763]: E1205 12:08:16.755845 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-gp47r" podUID="2de9b0c2-f628-4ae1-9b10-f62e7360eb7a" Dec 05 12:08:17 crc kubenswrapper[4763]: W1205 12:08:17.119702 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9acbf99_ec01_4de6_9d45_418664511586.slice/crio-6a298235f15064df643aecf2d5d7ed1ecbef1035be132721e16ca89d44ff94f3 WatchSource:0}: Error finding container 6a298235f15064df643aecf2d5d7ed1ecbef1035be132721e16ca89d44ff94f3: Status 404 returned error can't find the container with id 6a298235f15064df643aecf2d5d7ed1ecbef1035be132721e16ca89d44ff94f3 Dec 05 12:08:17 crc kubenswrapper[4763]: I1205 12:08:17.326308 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-ghlqk" Dec 05 12:08:17 crc kubenswrapper[4763]: I1205 12:08:17.363001 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-j8sm6" Dec 05 12:08:17 crc kubenswrapper[4763]: I1205 12:08:17.372503 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 05 12:08:17 crc kubenswrapper[4763]: I1205 12:08:17.425174 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2040bdfc-1aef-4f74-a8bb-1caf1a98193c-dns-svc\") pod \"2040bdfc-1aef-4f74-a8bb-1caf1a98193c\" (UID: \"2040bdfc-1aef-4f74-a8bb-1caf1a98193c\") " Dec 05 12:08:17 crc kubenswrapper[4763]: I1205 12:08:17.425307 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrn8r\" (UniqueName: \"kubernetes.io/projected/2040bdfc-1aef-4f74-a8bb-1caf1a98193c-kube-api-access-rrn8r\") pod \"2040bdfc-1aef-4f74-a8bb-1caf1a98193c\" (UID: \"2040bdfc-1aef-4f74-a8bb-1caf1a98193c\") " Dec 05 12:08:17 crc kubenswrapper[4763]: I1205 12:08:17.425920 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2040bdfc-1aef-4f74-a8bb-1caf1a98193c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2040bdfc-1aef-4f74-a8bb-1caf1a98193c" (UID: "2040bdfc-1aef-4f74-a8bb-1caf1a98193c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:08:17 crc kubenswrapper[4763]: I1205 12:08:17.427500 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b5d6232-6824-4892-b75c-e46bdb919016-config\") pod \"1b5d6232-6824-4892-b75c-e46bdb919016\" (UID: \"1b5d6232-6824-4892-b75c-e46bdb919016\") " Dec 05 12:08:17 crc kubenswrapper[4763]: I1205 12:08:17.427813 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wcxb\" (UniqueName: \"kubernetes.io/projected/1b5d6232-6824-4892-b75c-e46bdb919016-kube-api-access-7wcxb\") pod \"1b5d6232-6824-4892-b75c-e46bdb919016\" (UID: \"1b5d6232-6824-4892-b75c-e46bdb919016\") " Dec 05 12:08:17 crc kubenswrapper[4763]: I1205 12:08:17.427975 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b5d6232-6824-4892-b75c-e46bdb919016-config" (OuterVolumeSpecName: "config") pod "1b5d6232-6824-4892-b75c-e46bdb919016" (UID: "1b5d6232-6824-4892-b75c-e46bdb919016"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:08:17 crc kubenswrapper[4763]: I1205 12:08:17.428126 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2040bdfc-1aef-4f74-a8bb-1caf1a98193c-config\") pod \"2040bdfc-1aef-4f74-a8bb-1caf1a98193c\" (UID: \"2040bdfc-1aef-4f74-a8bb-1caf1a98193c\") " Dec 05 12:08:17 crc kubenswrapper[4763]: I1205 12:08:17.428602 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2040bdfc-1aef-4f74-a8bb-1caf1a98193c-config" (OuterVolumeSpecName: "config") pod "2040bdfc-1aef-4f74-a8bb-1caf1a98193c" (UID: "2040bdfc-1aef-4f74-a8bb-1caf1a98193c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:08:17 crc kubenswrapper[4763]: I1205 12:08:17.429032 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2040bdfc-1aef-4f74-a8bb-1caf1a98193c-config\") on node \"crc\" DevicePath \"\"" Dec 05 12:08:17 crc kubenswrapper[4763]: I1205 12:08:17.429054 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2040bdfc-1aef-4f74-a8bb-1caf1a98193c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 12:08:17 crc kubenswrapper[4763]: I1205 12:08:17.429064 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b5d6232-6824-4892-b75c-e46bdb919016-config\") on node \"crc\" DevicePath \"\"" Dec 05 12:08:17 crc kubenswrapper[4763]: I1205 12:08:17.433135 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b5d6232-6824-4892-b75c-e46bdb919016-kube-api-access-7wcxb" (OuterVolumeSpecName: "kube-api-access-7wcxb") pod "1b5d6232-6824-4892-b75c-e46bdb919016" (UID: "1b5d6232-6824-4892-b75c-e46bdb919016"). InnerVolumeSpecName "kube-api-access-7wcxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:08:17 crc kubenswrapper[4763]: I1205 12:08:17.443970 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2040bdfc-1aef-4f74-a8bb-1caf1a98193c-kube-api-access-rrn8r" (OuterVolumeSpecName: "kube-api-access-rrn8r") pod "2040bdfc-1aef-4f74-a8bb-1caf1a98193c" (UID: "2040bdfc-1aef-4f74-a8bb-1caf1a98193c"). InnerVolumeSpecName "kube-api-access-rrn8r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:08:17 crc kubenswrapper[4763]: I1205 12:08:17.530237 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrn8r\" (UniqueName: \"kubernetes.io/projected/2040bdfc-1aef-4f74-a8bb-1caf1a98193c-kube-api-access-rrn8r\") on node \"crc\" DevicePath \"\"" Dec 05 12:08:17 crc kubenswrapper[4763]: I1205 12:08:17.530272 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wcxb\" (UniqueName: \"kubernetes.io/projected/1b5d6232-6824-4892-b75c-e46bdb919016-kube-api-access-7wcxb\") on node \"crc\" DevicePath \"\"" Dec 05 12:08:17 crc kubenswrapper[4763]: I1205 12:08:17.762769 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"85c70640-8bf7-419d-a96f-69ac3278710c","Type":"ContainerStarted","Data":"a066dfaf62c7d9eb5aa71115e560824faff700d177e6a7a635728162a869c5e8"} Dec 05 12:08:17 crc kubenswrapper[4763]: I1205 12:08:17.765326 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"342a4872-4478-4b3a-a984-7fd457348435","Type":"ContainerStarted","Data":"e5d4bd0ecdf9f2d64e7b76f1cc29940eb5ff28a3be38447d8b2ea6c36682c0b2"} Dec 05 12:08:17 crc kubenswrapper[4763]: I1205 12:08:17.767407 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a066fad3-20a3-41d6-852d-7196f8445e2a","Type":"ContainerStarted","Data":"239b228ec203c13435bdfaa0cada342067f31b2f24524c8ca269a1e3d2c09fac"} Dec 05 12:08:17 crc kubenswrapper[4763]: I1205 12:08:17.768725 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-ghlqk" Dec 05 12:08:17 crc kubenswrapper[4763]: I1205 12:08:17.768820 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-ghlqk" event={"ID":"2040bdfc-1aef-4f74-a8bb-1caf1a98193c","Type":"ContainerDied","Data":"4165b13c68e002f0c0df1bbc64604af09b9ba52f28474f594c5813c09c16bea1"} Dec 05 12:08:17 crc kubenswrapper[4763]: I1205 12:08:17.770345 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"cc4cab9d-2172-424c-88ca-962ec052d0c3","Type":"ContainerStarted","Data":"c910e12385b6896872d85f5cbc107f90f58dc55f29fdbabb51ff0170443c2102"} Dec 05 12:08:17 crc kubenswrapper[4763]: I1205 12:08:17.772821 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1","Type":"ContainerStarted","Data":"d35d711b40aaee8dfdf307e3d562600a4ddd1ddfdf5f8ebb441518f4fc61402b"} Dec 05 12:08:17 crc kubenswrapper[4763]: I1205 12:08:17.775098 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-j8sm6" Dec 05 12:08:17 crc kubenswrapper[4763]: I1205 12:08:17.775080 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-j8sm6" event={"ID":"1b5d6232-6824-4892-b75c-e46bdb919016","Type":"ContainerDied","Data":"13456d40750253b01199d3567e75ad33fe653c46ba453d53b65318db67c25c3d"} Dec 05 12:08:17 crc kubenswrapper[4763]: I1205 12:08:17.808326 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6gw4w" event={"ID":"c9acbf99-ec01-4de6-9d45-418664511586","Type":"ContainerStarted","Data":"6a298235f15064df643aecf2d5d7ed1ecbef1035be132721e16ca89d44ff94f3"} Dec 05 12:08:17 crc kubenswrapper[4763]: I1205 12:08:17.808373 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6c917b4b-cb2c-4d9e-91b8-5fa2f1f61975","Type":"ContainerStarted","Data":"255860ad9e0ffa789d62b7a278e0921ba0d69c1f9122a9d8a85477a4c013c1c9"} Dec 05 12:08:17 crc kubenswrapper[4763]: I1205 12:08:17.808386 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9680c542-fe6f-42cb-b48d-e17b80916e50","Type":"ContainerStarted","Data":"0bde72e0e75afcf2cdd0a7d61af34810b039dcb6381362a52bf314aeec21e83d"} Dec 05 12:08:17 crc kubenswrapper[4763]: I1205 12:08:17.885158 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-ghlqk"] Dec 05 12:08:17 crc kubenswrapper[4763]: I1205 12:08:17.897607 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-ghlqk"] Dec 05 12:08:17 crc kubenswrapper[4763]: I1205 12:08:17.928214 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-j8sm6"] Dec 05 12:08:17 crc kubenswrapper[4763]: I1205 12:08:17.939048 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-j8sm6"] Dec 05 12:08:19 crc kubenswrapper[4763]: I1205 12:08:19.794610 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b5d6232-6824-4892-b75c-e46bdb919016" path="/var/lib/kubelet/pods/1b5d6232-6824-4892-b75c-e46bdb919016/volumes" Dec 05 12:08:19 crc kubenswrapper[4763]: I1205 12:08:19.795732 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2040bdfc-1aef-4f74-a8bb-1caf1a98193c" path="/var/lib/kubelet/pods/2040bdfc-1aef-4f74-a8bb-1caf1a98193c/volumes" Dec 05 12:08:19 crc kubenswrapper[4763]: I1205 12:08:19.813456 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc","Type":"ContainerStarted","Data":"1c3a4d300fe3ae6cea13fc898b997f20c96630fc0223ec660a4bf7ef219e1008"} Dec 05 12:08:24 crc kubenswrapper[4763]: I1205 12:08:24.872395 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9680c542-fe6f-42cb-b48d-e17b80916e50","Type":"ContainerStarted","Data":"fb8db5a6aa41de39421545f6205fa4c73917954d763ab427d262706336c0de4c"} Dec 05 12:08:24 crc kubenswrapper[4763]: I1205 12:08:24.875546 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"cc4cab9d-2172-424c-88ca-962ec052d0c3","Type":"ContainerStarted","Data":"b1333bee94d60122f339d6bf550d493bdd7a1b6ea5c547a0bd76440d54eaa6b1"} Dec 05 12:08:24 crc kubenswrapper[4763]: I1205 12:08:24.875871 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 05 12:08:24 crc 
kubenswrapper[4763]: I1205 12:08:24.898630 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=26.055106987 podStartE2EDuration="32.8986111s" podCreationTimestamp="2025-12-05 12:07:52 +0000 UTC" firstStartedPulling="2025-12-05 12:08:17.104157449 +0000 UTC m=+1181.596872192" lastFinishedPulling="2025-12-05 12:08:23.947661582 +0000 UTC m=+1188.440376305" observedRunningTime="2025-12-05 12:08:24.894941737 +0000 UTC m=+1189.387656480" watchObservedRunningTime="2025-12-05 12:08:24.8986111 +0000 UTC m=+1189.391325823" Dec 05 12:08:25 crc kubenswrapper[4763]: I1205 12:08:25.957125 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d5f22311-2f88-40cb-a35d-e0609433db1a","Type":"ContainerStarted","Data":"f5cab2b00f6780e35f027a4f338c88c8e1a9297bd5e9ebac8f79218faa0933db"} Dec 05 12:08:25 crc kubenswrapper[4763]: I1205 12:08:25.969986 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1","Type":"ContainerStarted","Data":"d60b6f1101af13e99bdf8da7ea0728d015731b30a126a29bf184302404676923"} Dec 05 12:08:25 crc kubenswrapper[4763]: I1205 12:08:25.985988 4763 generic.go:334] "Generic (PLEG): container finished" podID="80ec3b73-a380-499b-b4d1-054a6b2ab4a6" containerID="7ec57a3668a814499b3ce213dd19bd4d9ce7a548f299f3d1490e42c466577902" exitCode=0 Dec 05 12:08:25 crc kubenswrapper[4763]: I1205 12:08:25.986233 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dzkm7" event={"ID":"80ec3b73-a380-499b-b4d1-054a6b2ab4a6","Type":"ContainerDied","Data":"7ec57a3668a814499b3ce213dd19bd4d9ce7a548f299f3d1490e42c466577902"} Dec 05 12:08:25 crc kubenswrapper[4763]: I1205 12:08:25.990734 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6gw4w" event={"ID":"c9acbf99-ec01-4de6-9d45-418664511586","Type":"ContainerStarted","Data":"8054ae23d4371047c17bed7f61ae19ce6f6b341f38827730a029651f374ca679"} Dec 05 12:08:25 crc kubenswrapper[4763]: I1205 12:08:25.990896 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-6gw4w" Dec 05 12:08:26 crc kubenswrapper[4763]: I1205 12:08:25.999708 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6c917b4b-cb2c-4d9e-91b8-5fa2f1f61975","Type":"ContainerStarted","Data":"59020d82ff3f772c426279e952d15a49ac261b9a9407e2203d18e354d832e051"} Dec 05 12:08:26 crc kubenswrapper[4763]: I1205 12:08:26.000231 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 05 12:08:26 crc kubenswrapper[4763]: I1205 12:08:26.003558 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a066fad3-20a3-41d6-852d-7196f8445e2a","Type":"ContainerStarted","Data":"a9f61d3cfae9fed41217dc00d1d65ea11bf00de3ac92880810d443bc15928482"} Dec 05 12:08:26 crc kubenswrapper[4763]: I1205 12:08:26.114461 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-6gw4w" podStartSLOduration=22.686974879 podStartE2EDuration="30.114438952s" podCreationTimestamp="2025-12-05 12:07:56 +0000 UTC" firstStartedPulling="2025-12-05 12:08:17.122526543 +0000 UTC m=+1181.615241266" lastFinishedPulling="2025-12-05 12:08:24.549990626 +0000 UTC m=+1189.042705339" observedRunningTime="2025-12-05 12:08:26.052334516 +0000 UTC 
m=+1190.545049259" watchObservedRunningTime="2025-12-05 12:08:26.114438952 +0000 UTC m=+1190.607153675" Dec 05 12:08:26 crc kubenswrapper[4763]: I1205 12:08:26.148542 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=24.615218999 podStartE2EDuration="32.148519931s" podCreationTimestamp="2025-12-05 12:07:54 +0000 UTC" firstStartedPulling="2025-12-05 12:08:17.134243034 +0000 UTC m=+1181.626957757" lastFinishedPulling="2025-12-05 12:08:24.667543966 +0000 UTC m=+1189.160258689" observedRunningTime="2025-12-05 12:08:26.134166761 +0000 UTC m=+1190.626881484" watchObservedRunningTime="2025-12-05 12:08:26.148519931 +0000 UTC m=+1190.641234654" Dec 05 12:08:27 crc kubenswrapper[4763]: I1205 12:08:27.016023 4763 generic.go:334] "Generic (PLEG): container finished" podID="9d77e8a1-26ef-4525-b427-0a29a9b7a0fc" containerID="1c3a4d300fe3ae6cea13fc898b997f20c96630fc0223ec660a4bf7ef219e1008" exitCode=0 Dec 05 12:08:27 crc kubenswrapper[4763]: I1205 12:08:27.016113 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc","Type":"ContainerDied","Data":"1c3a4d300fe3ae6cea13fc898b997f20c96630fc0223ec660a4bf7ef219e1008"} Dec 05 12:08:27 crc kubenswrapper[4763]: I1205 12:08:27.022537 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dzkm7" event={"ID":"80ec3b73-a380-499b-b4d1-054a6b2ab4a6","Type":"ContainerStarted","Data":"bf232a77903d08ebf972eadde6b60b55fda226506023219381066e21beba799d"} Dec 05 12:08:27 crc kubenswrapper[4763]: I1205 12:08:27.022579 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dzkm7" event={"ID":"80ec3b73-a380-499b-b4d1-054a6b2ab4a6","Type":"ContainerStarted","Data":"ac8fe32ef020a238876d8187239bbb99dd555d286c6c3d571f3428be38692348"} Dec 05 12:08:27 crc kubenswrapper[4763]: I1205 12:08:27.063107 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-dzkm7" podStartSLOduration=23.20096377 podStartE2EDuration="31.063084713s" podCreationTimestamp="2025-12-05 12:07:56 +0000 UTC" firstStartedPulling="2025-12-05 12:08:16.085849179 +0000 UTC m=+1180.578563902" lastFinishedPulling="2025-12-05 12:08:23.947970122 +0000 UTC m=+1188.440684845" observedRunningTime="2025-12-05 12:08:27.062131811 +0000 UTC m=+1191.554846544" watchObservedRunningTime="2025-12-05 12:08:27.063084713 +0000 UTC m=+1191.555799436" Dec 05 12:08:28 crc kubenswrapper[4763]: I1205 12:08:28.029406 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-dzkm7" Dec 05 12:08:28 crc kubenswrapper[4763]: I1205 12:08:28.029800 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-dzkm7" Dec 05 12:08:32 crc kubenswrapper[4763]: E1205 12:08:32.564644 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5f22311_2f88_40cb_a35d_e0609433db1a.slice/crio-conmon-f5cab2b00f6780e35f027a4f338c88c8e1a9297bd5e9ebac8f79218faa0933db.scope\": RecentStats: unable to find data in memory cache]" Dec 05 12:08:32 crc kubenswrapper[4763]: I1205 12:08:32.573682 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 05 12:08:33 crc kubenswrapper[4763]: I1205 12:08:33.071943 4763 generic.go:334] 
"Generic (PLEG): container finished" podID="a066fad3-20a3-41d6-852d-7196f8445e2a" containerID="a9f61d3cfae9fed41217dc00d1d65ea11bf00de3ac92880810d443bc15928482" exitCode=0 Dec 05 12:08:33 crc kubenswrapper[4763]: I1205 12:08:33.072017 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a066fad3-20a3-41d6-852d-7196f8445e2a","Type":"ContainerDied","Data":"a9f61d3cfae9fed41217dc00d1d65ea11bf00de3ac92880810d443bc15928482"} Dec 05 12:08:33 crc kubenswrapper[4763]: I1205 12:08:33.078146 4763 generic.go:334] "Generic (PLEG): container finished" podID="d5f22311-2f88-40cb-a35d-e0609433db1a" containerID="f5cab2b00f6780e35f027a4f338c88c8e1a9297bd5e9ebac8f79218faa0933db" exitCode=0 Dec 05 12:08:33 crc kubenswrapper[4763]: I1205 12:08:33.078195 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d5f22311-2f88-40cb-a35d-e0609433db1a","Type":"ContainerDied","Data":"f5cab2b00f6780e35f027a4f338c88c8e1a9297bd5e9ebac8f79218faa0933db"} Dec 05 12:08:34 crc kubenswrapper[4763]: I1205 12:08:34.094291 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d5f22311-2f88-40cb-a35d-e0609433db1a","Type":"ContainerStarted","Data":"7cea781db306e9185cc1ceaeaaa6fedd17ac5d4e5b43bbad72822872202277df"} Dec 05 12:08:34 crc kubenswrapper[4763]: I1205 12:08:34.097984 4763 generic.go:334] "Generic (PLEG): container finished" podID="2de9b0c2-f628-4ae1-9b10-f62e7360eb7a" containerID="92c42e9828aa17372ed1732123d4bd158bcdf50178087a57de133be7c3b17b16" exitCode=0 Dec 05 12:08:34 crc kubenswrapper[4763]: I1205 12:08:34.098082 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gp47r" event={"ID":"2de9b0c2-f628-4ae1-9b10-f62e7360eb7a","Type":"ContainerDied","Data":"92c42e9828aa17372ed1732123d4bd158bcdf50178087a57de133be7c3b17b16"} Dec 05 12:08:34 crc kubenswrapper[4763]: I1205 12:08:34.100899 4763 generic.go:334] "Generic (PLEG): container finished" podID="8638355d-3010-48ff-97d5-fc24b34e4743" containerID="8e9cc70e9f4a805b5572e39956935e53881250c03da7549bcd3bc90d6c8c62f7" exitCode=0 Dec 05 12:08:34 crc kubenswrapper[4763]: I1205 12:08:34.100950 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-ck8ft" event={"ID":"8638355d-3010-48ff-97d5-fc24b34e4743","Type":"ContainerDied","Data":"8e9cc70e9f4a805b5572e39956935e53881250c03da7549bcd3bc90d6c8c62f7"} Dec 05 12:08:34 crc kubenswrapper[4763]: I1205 12:08:34.108575 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1","Type":"ContainerStarted","Data":"f93d8e0c8cac553b742a59f006677019ac3426a36b26e682fdc872091bdbbd84"} Dec 05 12:08:34 crc kubenswrapper[4763]: I1205 12:08:34.114217 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a066fad3-20a3-41d6-852d-7196f8445e2a","Type":"ContainerStarted","Data":"ce7198c483295660d829383c2b5f68c081d3ef0bf4f3ecb28a9f6da363a056e2"} Dec 05 12:08:34 crc kubenswrapper[4763]: I1205 12:08:34.117325 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9680c542-fe6f-42cb-b48d-e17b80916e50","Type":"ContainerStarted","Data":"e4e3156a1750fbe27212defd48f6d44766397f09df62dd4b89a1c71441826edc"} Dec 05 12:08:34 crc kubenswrapper[4763]: I1205 12:08:34.119229 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/openstack-galera-0" podStartSLOduration=36.632579204 podStartE2EDuration="45.119208983s" podCreationTimestamp="2025-12-05 12:07:49 +0000 UTC" firstStartedPulling="2025-12-05 12:08:15.619442948 +0000 UTC m=+1180.112157671" lastFinishedPulling="2025-12-05 12:08:24.106072727 +0000 UTC m=+1188.598787450" observedRunningTime="2025-12-05 12:08:34.115571302 +0000 UTC m=+1198.608286035" watchObservedRunningTime="2025-12-05 12:08:34.119208983 +0000 UTC m=+1198.611923706" Dec 05 12:08:34 crc kubenswrapper[4763]: I1205 12:08:34.179865 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=36.842569801 podStartE2EDuration="44.179816599s" podCreationTimestamp="2025-12-05 12:07:50 +0000 UTC" firstStartedPulling="2025-12-05 12:08:17.213508194 +0000 UTC m=+1181.706222917" lastFinishedPulling="2025-12-05 12:08:24.550754992 +0000 UTC m=+1189.043469715" observedRunningTime="2025-12-05 12:08:34.173002992 +0000 UTC m=+1198.665717715" watchObservedRunningTime="2025-12-05 12:08:34.179816599 +0000 UTC m=+1198.672531342" Dec 05 12:08:34 crc kubenswrapper[4763]: I1205 12:08:34.212153 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=22.279095955 podStartE2EDuration="38.21213597s" podCreationTimestamp="2025-12-05 12:07:56 +0000 UTC" firstStartedPulling="2025-12-05 12:08:17.387942575 +0000 UTC m=+1181.880657298" lastFinishedPulling="2025-12-05 12:08:33.32098259 +0000 UTC m=+1197.813697313" observedRunningTime="2025-12-05 12:08:34.211116316 +0000 UTC m=+1198.703831059" watchObservedRunningTime="2025-12-05 12:08:34.21213597 +0000 UTC m=+1198.704850693" Dec 05 12:08:34 crc kubenswrapper[4763]: I1205 12:08:34.218182 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=18.136959733 podStartE2EDuration="34.218166921s" podCreationTimestamp="2025-12-05 12:08:00 +0000 UTC" firstStartedPulling="2025-12-05 12:08:17.212407217 +0000 UTC m=+1181.705121940" lastFinishedPulling="2025-12-05 12:08:33.293614405 +0000 UTC m=+1197.786329128" observedRunningTime="2025-12-05 12:08:34.194333875 +0000 UTC m=+1198.687048628" watchObservedRunningTime="2025-12-05 12:08:34.218166921 +0000 UTC m=+1198.710881664" Dec 05 12:08:34 crc kubenswrapper[4763]: I1205 12:08:34.227900 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 05 12:08:34 crc kubenswrapper[4763]: I1205 12:08:34.274485 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 05 12:08:34 crc kubenswrapper[4763]: I1205 12:08:34.832465 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 05 12:08:34 crc kubenswrapper[4763]: I1205 12:08:34.930801 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-ck8ft"] Dec 05 12:08:34 crc kubenswrapper[4763]: I1205 12:08:34.974602 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-dbfcb"] Dec 05 12:08:34 crc kubenswrapper[4763]: I1205 12:08:34.976128 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-dbfcb" Dec 05 12:08:34 crc kubenswrapper[4763]: I1205 12:08:34.992461 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-dbfcb"] Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.132955 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fb039a0-4bf2-48bc-8725-c2fb821e19e5-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-dbfcb\" (UID: \"7fb039a0-4bf2-48bc-8725-c2fb821e19e5\") " pod="openstack/dnsmasq-dns-7cb5889db5-dbfcb" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.133068 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fb039a0-4bf2-48bc-8725-c2fb821e19e5-config\") pod \"dnsmasq-dns-7cb5889db5-dbfcb\" (UID: \"7fb039a0-4bf2-48bc-8725-c2fb821e19e5\") " pod="openstack/dnsmasq-dns-7cb5889db5-dbfcb" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.133157 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44mck\" (UniqueName: \"kubernetes.io/projected/7fb039a0-4bf2-48bc-8725-c2fb821e19e5-kube-api-access-44mck\") pod \"dnsmasq-dns-7cb5889db5-dbfcb\" (UID: \"7fb039a0-4bf2-48bc-8725-c2fb821e19e5\") " pod="openstack/dnsmasq-dns-7cb5889db5-dbfcb" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.142029 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gp47r" event={"ID":"2de9b0c2-f628-4ae1-9b10-f62e7360eb7a","Type":"ContainerStarted","Data":"facc28983fe6b16c0a777f8ccad31cc1214fa0bd9b3694d6d792cf2d4972cda1"} Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.142368 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-gp47r" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.144502 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-ck8ft" event={"ID":"8638355d-3010-48ff-97d5-fc24b34e4743","Type":"ContainerStarted","Data":"2925124dd15ddff64e52a493d35d599047b9715a95b3c55e6d1e567ab5576b44"} Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.145041 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.145125 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-ck8ft" podUID="8638355d-3010-48ff-97d5-fc24b34e4743" containerName="dnsmasq-dns" containerID="cri-o://2925124dd15ddff64e52a493d35d599047b9715a95b3c55e6d1e567ab5576b44" gracePeriod=10 Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.160663 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-gp47r" podStartSLOduration=3.107298675 podStartE2EDuration="47.160648377s" podCreationTimestamp="2025-12-05 12:07:48 +0000 UTC" firstStartedPulling="2025-12-05 12:07:49.264428131 +0000 UTC m=+1153.757142854" lastFinishedPulling="2025-12-05 12:08:33.317777833 +0000 UTC m=+1197.810492556" observedRunningTime="2025-12-05 12:08:35.157803582 +0000 UTC m=+1199.650518305" watchObservedRunningTime="2025-12-05 12:08:35.160648377 +0000 UTC m=+1199.653363090" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.167831 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/ovsdbserver-sb-0" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.190803 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-ck8ft" podStartSLOduration=3.797928784 podStartE2EDuration="48.190782174s" podCreationTimestamp="2025-12-05 12:07:47 +0000 UTC" firstStartedPulling="2025-12-05 12:07:48.944071943 +0000 UTC m=+1153.436786666" lastFinishedPulling="2025-12-05 12:08:33.336925323 +0000 UTC m=+1197.829640056" observedRunningTime="2025-12-05 12:08:35.187239516 +0000 UTC m=+1199.679954239" watchObservedRunningTime="2025-12-05 12:08:35.190782174 +0000 UTC m=+1199.683496907" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.215696 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.221497 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.235234 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44mck\" (UniqueName: \"kubernetes.io/projected/7fb039a0-4bf2-48bc-8725-c2fb821e19e5-kube-api-access-44mck\") pod \"dnsmasq-dns-7cb5889db5-dbfcb\" (UID: \"7fb039a0-4bf2-48bc-8725-c2fb821e19e5\") " pod="openstack/dnsmasq-dns-7cb5889db5-dbfcb" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.235346 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fb039a0-4bf2-48bc-8725-c2fb821e19e5-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-dbfcb\" (UID: \"7fb039a0-4bf2-48bc-8725-c2fb821e19e5\") " pod="openstack/dnsmasq-dns-7cb5889db5-dbfcb" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.237075 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fb039a0-4bf2-48bc-8725-c2fb821e19e5-config\") pod \"dnsmasq-dns-7cb5889db5-dbfcb\" (UID: \"7fb039a0-4bf2-48bc-8725-c2fb821e19e5\") " pod="openstack/dnsmasq-dns-7cb5889db5-dbfcb" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.237458 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fb039a0-4bf2-48bc-8725-c2fb821e19e5-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-dbfcb\" (UID: \"7fb039a0-4bf2-48bc-8725-c2fb821e19e5\") " pod="openstack/dnsmasq-dns-7cb5889db5-dbfcb" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.238237 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fb039a0-4bf2-48bc-8725-c2fb821e19e5-config\") pod \"dnsmasq-dns-7cb5889db5-dbfcb\" (UID: \"7fb039a0-4bf2-48bc-8725-c2fb821e19e5\") " pod="openstack/dnsmasq-dns-7cb5889db5-dbfcb" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.257100 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44mck\" (UniqueName: \"kubernetes.io/projected/7fb039a0-4bf2-48bc-8725-c2fb821e19e5-kube-api-access-44mck\") pod \"dnsmasq-dns-7cb5889db5-dbfcb\" (UID: \"7fb039a0-4bf2-48bc-8725-c2fb821e19e5\") " pod="openstack/dnsmasq-dns-7cb5889db5-dbfcb" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.354579 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-dbfcb" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.562030 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gp47r"] Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.603865 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d65f699f-qx8fd"] Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.606992 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d65f699f-qx8fd" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.614399 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.661885 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68f63de8-38ac-41e4-8c77-0cdddb57b631-ovsdbserver-nb\") pod \"dnsmasq-dns-57d65f699f-qx8fd\" (UID: \"68f63de8-38ac-41e4-8c77-0cdddb57b631\") " pod="openstack/dnsmasq-dns-57d65f699f-qx8fd" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.661958 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68f63de8-38ac-41e4-8c77-0cdddb57b631-dns-svc\") pod \"dnsmasq-dns-57d65f699f-qx8fd\" (UID: \"68f63de8-38ac-41e4-8c77-0cdddb57b631\") " pod="openstack/dnsmasq-dns-57d65f699f-qx8fd" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.662197 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9dj4\" (UniqueName: \"kubernetes.io/projected/68f63de8-38ac-41e4-8c77-0cdddb57b631-kube-api-access-s9dj4\") pod \"dnsmasq-dns-57d65f699f-qx8fd\" (UID: \"68f63de8-38ac-41e4-8c77-0cdddb57b631\") " pod="openstack/dnsmasq-dns-57d65f699f-qx8fd" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.662218 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68f63de8-38ac-41e4-8c77-0cdddb57b631-config\") pod \"dnsmasq-dns-57d65f699f-qx8fd\" (UID: \"68f63de8-38ac-41e4-8c77-0cdddb57b631\") " pod="openstack/dnsmasq-dns-57d65f699f-qx8fd" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.664336 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d65f699f-qx8fd"] Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.670656 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-ck8ft" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.763258 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8638355d-3010-48ff-97d5-fc24b34e4743-config\") pod \"8638355d-3010-48ff-97d5-fc24b34e4743\" (UID: \"8638355d-3010-48ff-97d5-fc24b34e4743\") " Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.763357 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsl2r\" (UniqueName: \"kubernetes.io/projected/8638355d-3010-48ff-97d5-fc24b34e4743-kube-api-access-wsl2r\") pod \"8638355d-3010-48ff-97d5-fc24b34e4743\" (UID: \"8638355d-3010-48ff-97d5-fc24b34e4743\") " Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.763421 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8638355d-3010-48ff-97d5-fc24b34e4743-dns-svc\") pod \"8638355d-3010-48ff-97d5-fc24b34e4743\" (UID: \"8638355d-3010-48ff-97d5-fc24b34e4743\") " Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.763739 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68f63de8-38ac-41e4-8c77-0cdddb57b631-ovsdbserver-nb\") pod \"dnsmasq-dns-57d65f699f-qx8fd\" (UID: \"68f63de8-38ac-41e4-8c77-0cdddb57b631\") " pod="openstack/dnsmasq-dns-57d65f699f-qx8fd" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.763794 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68f63de8-38ac-41e4-8c77-0cdddb57b631-dns-svc\") pod \"dnsmasq-dns-57d65f699f-qx8fd\" (UID: \"68f63de8-38ac-41e4-8c77-0cdddb57b631\") " pod="openstack/dnsmasq-dns-57d65f699f-qx8fd" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.763856 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9dj4\" (UniqueName: \"kubernetes.io/projected/68f63de8-38ac-41e4-8c77-0cdddb57b631-kube-api-access-s9dj4\") pod \"dnsmasq-dns-57d65f699f-qx8fd\" (UID: \"68f63de8-38ac-41e4-8c77-0cdddb57b631\") " pod="openstack/dnsmasq-dns-57d65f699f-qx8fd" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.763879 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68f63de8-38ac-41e4-8c77-0cdddb57b631-config\") pod \"dnsmasq-dns-57d65f699f-qx8fd\" (UID: \"68f63de8-38ac-41e4-8c77-0cdddb57b631\") " pod="openstack/dnsmasq-dns-57d65f699f-qx8fd" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.766160 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-87zsg"] Dec 05 12:08:35 crc kubenswrapper[4763]: E1205 12:08:35.766634 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8638355d-3010-48ff-97d5-fc24b34e4743" containerName="init" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.766650 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8638355d-3010-48ff-97d5-fc24b34e4743" containerName="init" Dec 05 12:08:35 crc kubenswrapper[4763]: E1205 12:08:35.766679 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8638355d-3010-48ff-97d5-fc24b34e4743" containerName="dnsmasq-dns" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.766684 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8638355d-3010-48ff-97d5-fc24b34e4743" 
containerName="dnsmasq-dns" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.766900 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8638355d-3010-48ff-97d5-fc24b34e4743" containerName="dnsmasq-dns" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.768670 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-87zsg" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.774150 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68f63de8-38ac-41e4-8c77-0cdddb57b631-dns-svc\") pod \"dnsmasq-dns-57d65f699f-qx8fd\" (UID: \"68f63de8-38ac-41e4-8c77-0cdddb57b631\") " pod="openstack/dnsmasq-dns-57d65f699f-qx8fd" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.781153 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.782597 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68f63de8-38ac-41e4-8c77-0cdddb57b631-config\") pod \"dnsmasq-dns-57d65f699f-qx8fd\" (UID: \"68f63de8-38ac-41e4-8c77-0cdddb57b631\") " pod="openstack/dnsmasq-dns-57d65f699f-qx8fd" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.791494 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8638355d-3010-48ff-97d5-fc24b34e4743-kube-api-access-wsl2r" (OuterVolumeSpecName: "kube-api-access-wsl2r") pod "8638355d-3010-48ff-97d5-fc24b34e4743" (UID: "8638355d-3010-48ff-97d5-fc24b34e4743"). InnerVolumeSpecName "kube-api-access-wsl2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.817653 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.819113 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68f63de8-38ac-41e4-8c77-0cdddb57b631-ovsdbserver-nb\") pod \"dnsmasq-dns-57d65f699f-qx8fd\" (UID: \"68f63de8-38ac-41e4-8c77-0cdddb57b631\") " pod="openstack/dnsmasq-dns-57d65f699f-qx8fd" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.866210 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6shhb\" (UniqueName: \"kubernetes.io/projected/08357e6b-d21e-4b50-8af2-22ddc7398fbc-kube-api-access-6shhb\") pod \"ovn-controller-metrics-87zsg\" (UID: \"08357e6b-d21e-4b50-8af2-22ddc7398fbc\") " pod="openstack/ovn-controller-metrics-87zsg" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.866304 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08357e6b-d21e-4b50-8af2-22ddc7398fbc-combined-ca-bundle\") pod \"ovn-controller-metrics-87zsg\" (UID: \"08357e6b-d21e-4b50-8af2-22ddc7398fbc\") " pod="openstack/ovn-controller-metrics-87zsg" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.866335 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/08357e6b-d21e-4b50-8af2-22ddc7398fbc-ovs-rundir\") pod \"ovn-controller-metrics-87zsg\" (UID: \"08357e6b-d21e-4b50-8af2-22ddc7398fbc\") " 
pod="openstack/ovn-controller-metrics-87zsg" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.866431 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/08357e6b-d21e-4b50-8af2-22ddc7398fbc-ovn-rundir\") pod \"ovn-controller-metrics-87zsg\" (UID: \"08357e6b-d21e-4b50-8af2-22ddc7398fbc\") " pod="openstack/ovn-controller-metrics-87zsg" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.866506 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08357e6b-d21e-4b50-8af2-22ddc7398fbc-config\") pod \"ovn-controller-metrics-87zsg\" (UID: \"08357e6b-d21e-4b50-8af2-22ddc7398fbc\") " pod="openstack/ovn-controller-metrics-87zsg" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.866600 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/08357e6b-d21e-4b50-8af2-22ddc7398fbc-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-87zsg\" (UID: \"08357e6b-d21e-4b50-8af2-22ddc7398fbc\") " pod="openstack/ovn-controller-metrics-87zsg" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.866826 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsl2r\" (UniqueName: \"kubernetes.io/projected/8638355d-3010-48ff-97d5-fc24b34e4743-kube-api-access-wsl2r\") on node \"crc\" DevicePath \"\"" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.868336 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-87zsg"] Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.897265 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9dj4\" (UniqueName: \"kubernetes.io/projected/68f63de8-38ac-41e4-8c77-0cdddb57b631-kube-api-access-s9dj4\") pod \"dnsmasq-dns-57d65f699f-qx8fd\" (UID: \"68f63de8-38ac-41e4-8c77-0cdddb57b631\") " pod="openstack/dnsmasq-dns-57d65f699f-qx8fd" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.955526 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d65f699f-qx8fd" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.956739 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8638355d-3010-48ff-97d5-fc24b34e4743-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8638355d-3010-48ff-97d5-fc24b34e4743" (UID: "8638355d-3010-48ff-97d5-fc24b34e4743"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.976605 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6shhb\" (UniqueName: \"kubernetes.io/projected/08357e6b-d21e-4b50-8af2-22ddc7398fbc-kube-api-access-6shhb\") pod \"ovn-controller-metrics-87zsg\" (UID: \"08357e6b-d21e-4b50-8af2-22ddc7398fbc\") " pod="openstack/ovn-controller-metrics-87zsg" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.976686 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08357e6b-d21e-4b50-8af2-22ddc7398fbc-combined-ca-bundle\") pod \"ovn-controller-metrics-87zsg\" (UID: \"08357e6b-d21e-4b50-8af2-22ddc7398fbc\") " pod="openstack/ovn-controller-metrics-87zsg" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.976708 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/08357e6b-d21e-4b50-8af2-22ddc7398fbc-ovs-rundir\") pod \"ovn-controller-metrics-87zsg\" (UID: \"08357e6b-d21e-4b50-8af2-22ddc7398fbc\") " pod="openstack/ovn-controller-metrics-87zsg" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.976728 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/08357e6b-d21e-4b50-8af2-22ddc7398fbc-ovn-rundir\") pod \"ovn-controller-metrics-87zsg\" (UID: \"08357e6b-d21e-4b50-8af2-22ddc7398fbc\") " pod="openstack/ovn-controller-metrics-87zsg" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.976770 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08357e6b-d21e-4b50-8af2-22ddc7398fbc-config\") pod \"ovn-controller-metrics-87zsg\" (UID: \"08357e6b-d21e-4b50-8af2-22ddc7398fbc\") " pod="openstack/ovn-controller-metrics-87zsg" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.976797 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/08357e6b-d21e-4b50-8af2-22ddc7398fbc-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-87zsg\" (UID: \"08357e6b-d21e-4b50-8af2-22ddc7398fbc\") " pod="openstack/ovn-controller-metrics-87zsg" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.976865 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8638355d-3010-48ff-97d5-fc24b34e4743-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.977455 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/08357e6b-d21e-4b50-8af2-22ddc7398fbc-ovs-rundir\") pod \"ovn-controller-metrics-87zsg\" (UID: \"08357e6b-d21e-4b50-8af2-22ddc7398fbc\") " pod="openstack/ovn-controller-metrics-87zsg" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.979906 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/08357e6b-d21e-4b50-8af2-22ddc7398fbc-ovn-rundir\") pod \"ovn-controller-metrics-87zsg\" (UID: \"08357e6b-d21e-4b50-8af2-22ddc7398fbc\") " pod="openstack/ovn-controller-metrics-87zsg" Dec 05 12:08:35 crc kubenswrapper[4763]: I1205 12:08:35.980545 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/08357e6b-d21e-4b50-8af2-22ddc7398fbc-config\") pod \"ovn-controller-metrics-87zsg\" (UID: \"08357e6b-d21e-4b50-8af2-22ddc7398fbc\") " pod="openstack/ovn-controller-metrics-87zsg" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.006865 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08357e6b-d21e-4b50-8af2-22ddc7398fbc-combined-ca-bundle\") pod \"ovn-controller-metrics-87zsg\" (UID: \"08357e6b-d21e-4b50-8af2-22ddc7398fbc\") " pod="openstack/ovn-controller-metrics-87zsg" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.008497 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8638355d-3010-48ff-97d5-fc24b34e4743-config" (OuterVolumeSpecName: "config") pod "8638355d-3010-48ff-97d5-fc24b34e4743" (UID: "8638355d-3010-48ff-97d5-fc24b34e4743"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.017686 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/08357e6b-d21e-4b50-8af2-22ddc7398fbc-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-87zsg\" (UID: \"08357e6b-d21e-4b50-8af2-22ddc7398fbc\") " pod="openstack/ovn-controller-metrics-87zsg" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.019660 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6shhb\" (UniqueName: \"kubernetes.io/projected/08357e6b-d21e-4b50-8af2-22ddc7398fbc-kube-api-access-6shhb\") pod \"ovn-controller-metrics-87zsg\" (UID: \"08357e6b-d21e-4b50-8af2-22ddc7398fbc\") " pod="openstack/ovn-controller-metrics-87zsg" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.050236 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-dbfcb"] Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.085889 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.086466 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8638355d-3010-48ff-97d5-fc24b34e4743-config\") on node \"crc\" DevicePath \"\"" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.101062 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.155851 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.156107 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.156257 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-ql9bh" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.156142 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-87zsg" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.156308 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.176337 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.188130 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1851124e-2722-4628-8e5b-63edb828d64a-lock\") pod \"swift-storage-0\" (UID: \"1851124e-2722-4628-8e5b-63edb828d64a\") " pod="openstack/swift-storage-0" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.188187 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"1851124e-2722-4628-8e5b-63edb828d64a\") " pod="openstack/swift-storage-0" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.188223 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1851124e-2722-4628-8e5b-63edb828d64a-etc-swift\") pod \"swift-storage-0\" (UID: \"1851124e-2722-4628-8e5b-63edb828d64a\") " pod="openstack/swift-storage-0" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.190433 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cslq7\" (UniqueName: \"kubernetes.io/projected/1851124e-2722-4628-8e5b-63edb828d64a-kube-api-access-cslq7\") pod \"swift-storage-0\" (UID: \"1851124e-2722-4628-8e5b-63edb828d64a\") " pod="openstack/swift-storage-0" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.190673 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1851124e-2722-4628-8e5b-63edb828d64a-cache\") pod \"swift-storage-0\" (UID: \"1851124e-2722-4628-8e5b-63edb828d64a\") " pod="openstack/swift-storage-0" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.208716 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-dbfcb"] Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.233187 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-2sdtf"] Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.235594 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-2sdtf" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.240625 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-2sdtf"] Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.241222 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.245080 4763 generic.go:334] "Generic (PLEG): container finished" podID="8638355d-3010-48ff-97d5-fc24b34e4743" containerID="2925124dd15ddff64e52a493d35d599047b9715a95b3c55e6d1e567ab5576b44" exitCode=0 Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.245176 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-ck8ft" event={"ID":"8638355d-3010-48ff-97d5-fc24b34e4743","Type":"ContainerDied","Data":"2925124dd15ddff64e52a493d35d599047b9715a95b3c55e6d1e567ab5576b44"} Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.245205 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-ck8ft" event={"ID":"8638355d-3010-48ff-97d5-fc24b34e4743","Type":"ContainerDied","Data":"3f4a1231125075e2a61937dab49961f6f2f86d3c1fc835f8efa57effe9b3849d"} Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.245228 4763 scope.go:117] "RemoveContainer" containerID="2925124dd15ddff64e52a493d35d599047b9715a95b3c55e6d1e567ab5576b44" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.245375 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-ck8ft" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.250140 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-dbfcb" event={"ID":"7fb039a0-4bf2-48bc-8725-c2fb821e19e5","Type":"ContainerStarted","Data":"6eda0400489da403065cafd357b9a9b6edc37c1bed13d3448294d27d2ca6c091"} Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.251181 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.321691 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.324183 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb42f35f-9a55-470e-b238-98c4c0a5b455-config\") pod \"dnsmasq-dns-b8fbc5445-2sdtf\" (UID: \"bb42f35f-9a55-470e-b238-98c4c0a5b455\") " pod="openstack/dnsmasq-dns-b8fbc5445-2sdtf" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.324337 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cslq7\" (UniqueName: \"kubernetes.io/projected/1851124e-2722-4628-8e5b-63edb828d64a-kube-api-access-cslq7\") pod \"swift-storage-0\" (UID: \"1851124e-2722-4628-8e5b-63edb828d64a\") " pod="openstack/swift-storage-0" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.324464 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgshz\" (UniqueName: \"kubernetes.io/projected/bb42f35f-9a55-470e-b238-98c4c0a5b455-kube-api-access-lgshz\") pod \"dnsmasq-dns-b8fbc5445-2sdtf\" (UID: \"bb42f35f-9a55-470e-b238-98c4c0a5b455\") " pod="openstack/dnsmasq-dns-b8fbc5445-2sdtf" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.324510 4763 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb42f35f-9a55-470e-b238-98c4c0a5b455-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-2sdtf\" (UID: \"bb42f35f-9a55-470e-b238-98c4c0a5b455\") " pod="openstack/dnsmasq-dns-b8fbc5445-2sdtf" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.324732 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb42f35f-9a55-470e-b238-98c4c0a5b455-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-2sdtf\" (UID: \"bb42f35f-9a55-470e-b238-98c4c0a5b455\") " pod="openstack/dnsmasq-dns-b8fbc5445-2sdtf" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.324839 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1851124e-2722-4628-8e5b-63edb828d64a-cache\") pod \"swift-storage-0\" (UID: \"1851124e-2722-4628-8e5b-63edb828d64a\") " pod="openstack/swift-storage-0" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.325732 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1851124e-2722-4628-8e5b-63edb828d64a-cache\") pod \"swift-storage-0\" (UID: \"1851124e-2722-4628-8e5b-63edb828d64a\") " pod="openstack/swift-storage-0" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.326672 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1851124e-2722-4628-8e5b-63edb828d64a-lock\") pod \"swift-storage-0\" (UID: \"1851124e-2722-4628-8e5b-63edb828d64a\") " pod="openstack/swift-storage-0" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.329958 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb42f35f-9a55-470e-b238-98c4c0a5b455-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-2sdtf\" (UID: \"bb42f35f-9a55-470e-b238-98c4c0a5b455\") " pod="openstack/dnsmasq-dns-b8fbc5445-2sdtf" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.330067 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"1851124e-2722-4628-8e5b-63edb828d64a\") " pod="openstack/swift-storage-0" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.330110 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1851124e-2722-4628-8e5b-63edb828d64a-etc-swift\") pod \"swift-storage-0\" (UID: \"1851124e-2722-4628-8e5b-63edb828d64a\") " pod="openstack/swift-storage-0" Dec 05 12:08:36 crc kubenswrapper[4763]: E1205 12:08:36.330483 4763 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 12:08:36 crc kubenswrapper[4763]: E1205 12:08:36.330506 4763 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 12:08:36 crc kubenswrapper[4763]: E1205 12:08:36.330602 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1851124e-2722-4628-8e5b-63edb828d64a-etc-swift podName:1851124e-2722-4628-8e5b-63edb828d64a nodeName:}" failed. 
No retries permitted until 2025-12-05 12:08:36.830578045 +0000 UTC m=+1201.323292778 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1851124e-2722-4628-8e5b-63edb828d64a-etc-swift") pod "swift-storage-0" (UID: "1851124e-2722-4628-8e5b-63edb828d64a") : configmap "swift-ring-files" not found Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.330747 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"1851124e-2722-4628-8e5b-63edb828d64a\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/swift-storage-0" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.327122 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1851124e-2722-4628-8e5b-63edb828d64a-lock\") pod \"swift-storage-0\" (UID: \"1851124e-2722-4628-8e5b-63edb828d64a\") " pod="openstack/swift-storage-0" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.343994 4763 scope.go:117] "RemoveContainer" containerID="8e9cc70e9f4a805b5572e39956935e53881250c03da7549bcd3bc90d6c8c62f7" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.361905 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cslq7\" (UniqueName: \"kubernetes.io/projected/1851124e-2722-4628-8e5b-63edb828d64a-kube-api-access-cslq7\") pod \"swift-storage-0\" (UID: \"1851124e-2722-4628-8e5b-63edb828d64a\") " pod="openstack/swift-storage-0" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.390055 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"1851124e-2722-4628-8e5b-63edb828d64a\") " pod="openstack/swift-storage-0" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.424967 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-ck8ft"] Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.429780 4763 scope.go:117] "RemoveContainer" containerID="2925124dd15ddff64e52a493d35d599047b9715a95b3c55e6d1e567ab5576b44" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.433860 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb42f35f-9a55-470e-b238-98c4c0a5b455-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-2sdtf\" (UID: \"bb42f35f-9a55-470e-b238-98c4c0a5b455\") " pod="openstack/dnsmasq-dns-b8fbc5445-2sdtf" Dec 05 12:08:36 crc kubenswrapper[4763]: E1205 12:08:36.434719 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2925124dd15ddff64e52a493d35d599047b9715a95b3c55e6d1e567ab5576b44\": container with ID starting with 2925124dd15ddff64e52a493d35d599047b9715a95b3c55e6d1e567ab5576b44 not found: ID does not exist" containerID="2925124dd15ddff64e52a493d35d599047b9715a95b3c55e6d1e567ab5576b44" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.434827 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2925124dd15ddff64e52a493d35d599047b9715a95b3c55e6d1e567ab5576b44"} err="failed to get container status \"2925124dd15ddff64e52a493d35d599047b9715a95b3c55e6d1e567ab5576b44\": rpc error: code = NotFound desc = could not find container 
\"2925124dd15ddff64e52a493d35d599047b9715a95b3c55e6d1e567ab5576b44\": container with ID starting with 2925124dd15ddff64e52a493d35d599047b9715a95b3c55e6d1e567ab5576b44 not found: ID does not exist" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.434875 4763 scope.go:117] "RemoveContainer" containerID="8e9cc70e9f4a805b5572e39956935e53881250c03da7549bcd3bc90d6c8c62f7" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.437425 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb42f35f-9a55-470e-b238-98c4c0a5b455-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-2sdtf\" (UID: \"bb42f35f-9a55-470e-b238-98c4c0a5b455\") " pod="openstack/dnsmasq-dns-b8fbc5445-2sdtf" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.437970 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb42f35f-9a55-470e-b238-98c4c0a5b455-config\") pod \"dnsmasq-dns-b8fbc5445-2sdtf\" (UID: \"bb42f35f-9a55-470e-b238-98c4c0a5b455\") " pod="openstack/dnsmasq-dns-b8fbc5445-2sdtf" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.438163 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgshz\" (UniqueName: \"kubernetes.io/projected/bb42f35f-9a55-470e-b238-98c4c0a5b455-kube-api-access-lgshz\") pod \"dnsmasq-dns-b8fbc5445-2sdtf\" (UID: \"bb42f35f-9a55-470e-b238-98c4c0a5b455\") " pod="openstack/dnsmasq-dns-b8fbc5445-2sdtf" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.438538 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb42f35f-9a55-470e-b238-98c4c0a5b455-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-2sdtf\" (UID: \"bb42f35f-9a55-470e-b238-98c4c0a5b455\") " pod="openstack/dnsmasq-dns-b8fbc5445-2sdtf" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.438679 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb42f35f-9a55-470e-b238-98c4c0a5b455-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-2sdtf\" (UID: \"bb42f35f-9a55-470e-b238-98c4c0a5b455\") " pod="openstack/dnsmasq-dns-b8fbc5445-2sdtf" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.438982 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb42f35f-9a55-470e-b238-98c4c0a5b455-config\") pod \"dnsmasq-dns-b8fbc5445-2sdtf\" (UID: \"bb42f35f-9a55-470e-b238-98c4c0a5b455\") " pod="openstack/dnsmasq-dns-b8fbc5445-2sdtf" Dec 05 12:08:36 crc kubenswrapper[4763]: E1205 12:08:36.438986 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e9cc70e9f4a805b5572e39956935e53881250c03da7549bcd3bc90d6c8c62f7\": container with ID starting with 8e9cc70e9f4a805b5572e39956935e53881250c03da7549bcd3bc90d6c8c62f7 not found: ID does not exist" containerID="8e9cc70e9f4a805b5572e39956935e53881250c03da7549bcd3bc90d6c8c62f7" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.439071 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e9cc70e9f4a805b5572e39956935e53881250c03da7549bcd3bc90d6c8c62f7"} err="failed to get container status \"8e9cc70e9f4a805b5572e39956935e53881250c03da7549bcd3bc90d6c8c62f7\": rpc error: code = NotFound desc = could not find container 
\"8e9cc70e9f4a805b5572e39956935e53881250c03da7549bcd3bc90d6c8c62f7\": container with ID starting with 8e9cc70e9f4a805b5572e39956935e53881250c03da7549bcd3bc90d6c8c62f7 not found: ID does not exist" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.439290 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb42f35f-9a55-470e-b238-98c4c0a5b455-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-2sdtf\" (UID: \"bb42f35f-9a55-470e-b238-98c4c0a5b455\") " pod="openstack/dnsmasq-dns-b8fbc5445-2sdtf" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.439678 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb42f35f-9a55-470e-b238-98c4c0a5b455-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-2sdtf\" (UID: \"bb42f35f-9a55-470e-b238-98c4c0a5b455\") " pod="openstack/dnsmasq-dns-b8fbc5445-2sdtf" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.446129 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-ck8ft"] Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.484515 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-pzxb5"] Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.486371 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-pzxb5" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.498627 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.503963 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.504462 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.508510 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgshz\" (UniqueName: \"kubernetes.io/projected/bb42f35f-9a55-470e-b238-98c4c0a5b455-kube-api-access-lgshz\") pod \"dnsmasq-dns-b8fbc5445-2sdtf\" (UID: \"bb42f35f-9a55-470e-b238-98c4c0a5b455\") " pod="openstack/dnsmasq-dns-b8fbc5445-2sdtf" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.531833 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-pzxb5"] Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.545709 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a38e41f6-6247-4c91-abba-0bc65d1c2127-dispersionconf\") pod \"swift-ring-rebalance-pzxb5\" (UID: \"a38e41f6-6247-4c91-abba-0bc65d1c2127\") " pod="openstack/swift-ring-rebalance-pzxb5" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.549613 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a38e41f6-6247-4c91-abba-0bc65d1c2127-ring-data-devices\") pod \"swift-ring-rebalance-pzxb5\" (UID: \"a38e41f6-6247-4c91-abba-0bc65d1c2127\") " pod="openstack/swift-ring-rebalance-pzxb5" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.549644 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a38e41f6-6247-4c91-abba-0bc65d1c2127-combined-ca-bundle\") pod \"swift-ring-rebalance-pzxb5\" (UID: \"a38e41f6-6247-4c91-abba-0bc65d1c2127\") " pod="openstack/swift-ring-rebalance-pzxb5" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.549707 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a38e41f6-6247-4c91-abba-0bc65d1c2127-etc-swift\") pod \"swift-ring-rebalance-pzxb5\" (UID: \"a38e41f6-6247-4c91-abba-0bc65d1c2127\") " pod="openstack/swift-ring-rebalance-pzxb5" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.549777 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a38e41f6-6247-4c91-abba-0bc65d1c2127-swiftconf\") pod \"swift-ring-rebalance-pzxb5\" (UID: \"a38e41f6-6247-4c91-abba-0bc65d1c2127\") " pod="openstack/swift-ring-rebalance-pzxb5" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.549851 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8fzs\" (UniqueName: \"kubernetes.io/projected/a38e41f6-6247-4c91-abba-0bc65d1c2127-kube-api-access-n8fzs\") pod \"swift-ring-rebalance-pzxb5\" (UID: \"a38e41f6-6247-4c91-abba-0bc65d1c2127\") " pod="openstack/swift-ring-rebalance-pzxb5" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.549925 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a38e41f6-6247-4c91-abba-0bc65d1c2127-scripts\") pod \"swift-ring-rebalance-pzxb5\" (UID: \"a38e41f6-6247-4c91-abba-0bc65d1c2127\") " pod="openstack/swift-ring-rebalance-pzxb5" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.625696 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-2sdtf" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.650954 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a38e41f6-6247-4c91-abba-0bc65d1c2127-ring-data-devices\") pod \"swift-ring-rebalance-pzxb5\" (UID: \"a38e41f6-6247-4c91-abba-0bc65d1c2127\") " pod="openstack/swift-ring-rebalance-pzxb5" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.651007 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a38e41f6-6247-4c91-abba-0bc65d1c2127-combined-ca-bundle\") pod \"swift-ring-rebalance-pzxb5\" (UID: \"a38e41f6-6247-4c91-abba-0bc65d1c2127\") " pod="openstack/swift-ring-rebalance-pzxb5" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.651037 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a38e41f6-6247-4c91-abba-0bc65d1c2127-etc-swift\") pod \"swift-ring-rebalance-pzxb5\" (UID: \"a38e41f6-6247-4c91-abba-0bc65d1c2127\") " pod="openstack/swift-ring-rebalance-pzxb5" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.651064 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a38e41f6-6247-4c91-abba-0bc65d1c2127-swiftconf\") pod \"swift-ring-rebalance-pzxb5\" (UID: \"a38e41f6-6247-4c91-abba-0bc65d1c2127\") " pod="openstack/swift-ring-rebalance-pzxb5" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.651126 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8fzs\" (UniqueName: \"kubernetes.io/projected/a38e41f6-6247-4c91-abba-0bc65d1c2127-kube-api-access-n8fzs\") pod \"swift-ring-rebalance-pzxb5\" (UID: \"a38e41f6-6247-4c91-abba-0bc65d1c2127\") " pod="openstack/swift-ring-rebalance-pzxb5" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.651186 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a38e41f6-6247-4c91-abba-0bc65d1c2127-scripts\") pod \"swift-ring-rebalance-pzxb5\" (UID: \"a38e41f6-6247-4c91-abba-0bc65d1c2127\") " pod="openstack/swift-ring-rebalance-pzxb5" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.651221 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a38e41f6-6247-4c91-abba-0bc65d1c2127-dispersionconf\") pod \"swift-ring-rebalance-pzxb5\" (UID: \"a38e41f6-6247-4c91-abba-0bc65d1c2127\") " pod="openstack/swift-ring-rebalance-pzxb5" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.651751 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a38e41f6-6247-4c91-abba-0bc65d1c2127-etc-swift\") pod \"swift-ring-rebalance-pzxb5\" (UID: \"a38e41f6-6247-4c91-abba-0bc65d1c2127\") " pod="openstack/swift-ring-rebalance-pzxb5" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.652123 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a38e41f6-6247-4c91-abba-0bc65d1c2127-ring-data-devices\") pod \"swift-ring-rebalance-pzxb5\" (UID: \"a38e41f6-6247-4c91-abba-0bc65d1c2127\") " pod="openstack/swift-ring-rebalance-pzxb5" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 
12:08:36.652209 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a38e41f6-6247-4c91-abba-0bc65d1c2127-scripts\") pod \"swift-ring-rebalance-pzxb5\" (UID: \"a38e41f6-6247-4c91-abba-0bc65d1c2127\") " pod="openstack/swift-ring-rebalance-pzxb5" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.657429 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a38e41f6-6247-4c91-abba-0bc65d1c2127-combined-ca-bundle\") pod \"swift-ring-rebalance-pzxb5\" (UID: \"a38e41f6-6247-4c91-abba-0bc65d1c2127\") " pod="openstack/swift-ring-rebalance-pzxb5" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.660503 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a38e41f6-6247-4c91-abba-0bc65d1c2127-swiftconf\") pod \"swift-ring-rebalance-pzxb5\" (UID: \"a38e41f6-6247-4c91-abba-0bc65d1c2127\") " pod="openstack/swift-ring-rebalance-pzxb5" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.665342 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a38e41f6-6247-4c91-abba-0bc65d1c2127-dispersionconf\") pod \"swift-ring-rebalance-pzxb5\" (UID: \"a38e41f6-6247-4c91-abba-0bc65d1c2127\") " pod="openstack/swift-ring-rebalance-pzxb5" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.683191 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8fzs\" (UniqueName: \"kubernetes.io/projected/a38e41f6-6247-4c91-abba-0bc65d1c2127-kube-api-access-n8fzs\") pod \"swift-ring-rebalance-pzxb5\" (UID: \"a38e41f6-6247-4c91-abba-0bc65d1c2127\") " pod="openstack/swift-ring-rebalance-pzxb5" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.693605 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d65f699f-qx8fd"] Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.853732 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1851124e-2722-4628-8e5b-63edb828d64a-etc-swift\") pod \"swift-storage-0\" (UID: \"1851124e-2722-4628-8e5b-63edb828d64a\") " pod="openstack/swift-storage-0" Dec 05 12:08:36 crc kubenswrapper[4763]: E1205 12:08:36.854031 4763 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 12:08:36 crc kubenswrapper[4763]: E1205 12:08:36.854048 4763 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 12:08:36 crc kubenswrapper[4763]: E1205 12:08:36.854099 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1851124e-2722-4628-8e5b-63edb828d64a-etc-swift podName:1851124e-2722-4628-8e5b-63edb828d64a nodeName:}" failed. No retries permitted until 2025-12-05 12:08:37.854080724 +0000 UTC m=+1202.346795447 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1851124e-2722-4628-8e5b-63edb828d64a-etc-swift") pod "swift-storage-0" (UID: "1851124e-2722-4628-8e5b-63edb828d64a") : configmap "swift-ring-files" not found Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.854438 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-pzxb5" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.879896 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.881585 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.899353 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.899669 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.900055 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-hh2tm" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.900290 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.904281 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-87zsg"] Dec 05 12:08:36 crc kubenswrapper[4763]: I1205 12:08:36.920190 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.058813 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c-config\") pod \"ovn-northd-0\" (UID: \"f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c\") " pod="openstack/ovn-northd-0" Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.058852 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp4b4\" (UniqueName: \"kubernetes.io/projected/f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c-kube-api-access-pp4b4\") pod \"ovn-northd-0\" (UID: \"f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c\") " pod="openstack/ovn-northd-0" Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.058882 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c-scripts\") pod \"ovn-northd-0\" (UID: \"f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c\") " pod="openstack/ovn-northd-0" Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.058897 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c\") " pod="openstack/ovn-northd-0" Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.058960 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c\") " pod="openstack/ovn-northd-0" Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.058996 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c\") " pod="openstack/ovn-northd-0" Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.059022 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c\") " pod="openstack/ovn-northd-0" Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.160239 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp4b4\" (UniqueName: \"kubernetes.io/projected/f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c-kube-api-access-pp4b4\") pod \"ovn-northd-0\" (UID: \"f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c\") " pod="openstack/ovn-northd-0" Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.160332 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c-scripts\") pod \"ovn-northd-0\" (UID: \"f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c\") " pod="openstack/ovn-northd-0" Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.160352 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c\") " pod="openstack/ovn-northd-0" Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.160481 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c\") " pod="openstack/ovn-northd-0" Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.160522 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c\") " pod="openstack/ovn-northd-0" Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.160576 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c\") " pod="openstack/ovn-northd-0" Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.160652 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c-config\") pod \"ovn-northd-0\" (UID: \"f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c\") " pod="openstack/ovn-northd-0" Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.161221 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c-scripts\") pod \"ovn-northd-0\" (UID: \"f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c\") " pod="openstack/ovn-northd-0" Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.162094 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c-ovn-rundir\") pod 
\"ovn-northd-0\" (UID: \"f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c\") " pod="openstack/ovn-northd-0" Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.162288 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c-config\") pod \"ovn-northd-0\" (UID: \"f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c\") " pod="openstack/ovn-northd-0" Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.169484 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c\") " pod="openstack/ovn-northd-0" Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.170947 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c\") " pod="openstack/ovn-northd-0" Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.172712 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c\") " pod="openstack/ovn-northd-0" Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.181210 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp4b4\" (UniqueName: \"kubernetes.io/projected/f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c-kube-api-access-pp4b4\") pod \"ovn-northd-0\" (UID: \"f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c\") " pod="openstack/ovn-northd-0" Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.218673 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-2sdtf"] Dec 05 12:08:37 crc kubenswrapper[4763]: W1205 12:08:37.227046 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb42f35f_9a55_470e_b238_98c4c0a5b455.slice/crio-7b6f314b8912193f852e1b435f3b35b928ebf1a8a6ee2f7a14ea25288b957158 WatchSource:0}: Error finding container 7b6f314b8912193f852e1b435f3b35b928ebf1a8a6ee2f7a14ea25288b957158: Status 404 returned error can't find the container with id 7b6f314b8912193f852e1b435f3b35b928ebf1a8a6ee2f7a14ea25288b957158 Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.235600 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.282975 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-2sdtf" event={"ID":"bb42f35f-9a55-470e-b238-98c4c0a5b455","Type":"ContainerStarted","Data":"7b6f314b8912193f852e1b435f3b35b928ebf1a8a6ee2f7a14ea25288b957158"} Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.285937 4763 generic.go:334] "Generic (PLEG): container finished" podID="68f63de8-38ac-41e4-8c77-0cdddb57b631" containerID="d9225f3507077d4505fdbdad764f2c5bfe8984a1f38515e57c9d73ed78b5c4c3" exitCode=0 Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.286008 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d65f699f-qx8fd" event={"ID":"68f63de8-38ac-41e4-8c77-0cdddb57b631","Type":"ContainerDied","Data":"d9225f3507077d4505fdbdad764f2c5bfe8984a1f38515e57c9d73ed78b5c4c3"} Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.286027 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d65f699f-qx8fd" event={"ID":"68f63de8-38ac-41e4-8c77-0cdddb57b631","Type":"ContainerStarted","Data":"991b797ae0e7c075115e7d0b7527fc850d091c2b951c939c44d9effb6b34646d"} Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.288499 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-87zsg" event={"ID":"08357e6b-d21e-4b50-8af2-22ddc7398fbc","Type":"ContainerStarted","Data":"01121ebd976b7bddc3fe603bd4b996d069f62b4754398c2ebf9ffe03ddc3a935"} Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.288546 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-87zsg" event={"ID":"08357e6b-d21e-4b50-8af2-22ddc7398fbc","Type":"ContainerStarted","Data":"b9c2d8617e9ca95320b4359f85f8ba57cda3afd50ca39e40a1ae62c68d465809"} Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.294126 4763 generic.go:334] "Generic (PLEG): container finished" podID="7fb039a0-4bf2-48bc-8725-c2fb821e19e5" containerID="c4713a2fe10d85fbd7e6289ba9a786d124e70cf7c15b74c0dd3bedf603bf8373" exitCode=0 Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.294209 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-dbfcb" event={"ID":"7fb039a0-4bf2-48bc-8725-c2fb821e19e5","Type":"ContainerDied","Data":"c4713a2fe10d85fbd7e6289ba9a786d124e70cf7c15b74c0dd3bedf603bf8373"} Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.295132 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-gp47r" podUID="2de9b0c2-f628-4ae1-9b10-f62e7360eb7a" containerName="dnsmasq-dns" containerID="cri-o://facc28983fe6b16c0a777f8ccad31cc1214fa0bd9b3694d6d792cf2d4972cda1" gracePeriod=10 Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.387522 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-87zsg" podStartSLOduration=2.387508075 podStartE2EDuration="2.387508075s" podCreationTimestamp="2025-12-05 12:08:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:08:37.385306922 +0000 UTC m=+1201.878021645" watchObservedRunningTime="2025-12-05 12:08:37.387508075 +0000 UTC m=+1201.880222788" Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.456239 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-pzxb5"] Dec 05 
12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.547892 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.547953 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.547993 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.548610 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6e6e3cfeab8af452b7eac351a2125ef9c911ea4fcd52b1f8631b40c9322e72b2"} pod="openshift-machine-config-operator/machine-config-daemon-xpgln" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.548661 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" containerID="cri-o://6e6e3cfeab8af452b7eac351a2125ef9c911ea4fcd52b1f8631b40c9322e72b2" gracePeriod=600 Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.726931 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-dbfcb" Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.803019 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8638355d-3010-48ff-97d5-fc24b34e4743" path="/var/lib/kubelet/pods/8638355d-3010-48ff-97d5-fc24b34e4743/volumes" Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.872658 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gp47r" Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.907256 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fb039a0-4bf2-48bc-8725-c2fb821e19e5-config\") pod \"7fb039a0-4bf2-48bc-8725-c2fb821e19e5\" (UID: \"7fb039a0-4bf2-48bc-8725-c2fb821e19e5\") " Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.907396 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44mck\" (UniqueName: \"kubernetes.io/projected/7fb039a0-4bf2-48bc-8725-c2fb821e19e5-kube-api-access-44mck\") pod \"7fb039a0-4bf2-48bc-8725-c2fb821e19e5\" (UID: \"7fb039a0-4bf2-48bc-8725-c2fb821e19e5\") " Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.907423 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fb039a0-4bf2-48bc-8725-c2fb821e19e5-dns-svc\") pod \"7fb039a0-4bf2-48bc-8725-c2fb821e19e5\" (UID: \"7fb039a0-4bf2-48bc-8725-c2fb821e19e5\") " Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.907813 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1851124e-2722-4628-8e5b-63edb828d64a-etc-swift\") pod \"swift-storage-0\" (UID: \"1851124e-2722-4628-8e5b-63edb828d64a\") " pod="openstack/swift-storage-0" Dec 05 12:08:37 crc kubenswrapper[4763]: E1205 12:08:37.910395 4763 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 12:08:37 crc kubenswrapper[4763]: E1205 12:08:37.910602 4763 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 12:08:37 crc kubenswrapper[4763]: E1205 12:08:37.910717 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1851124e-2722-4628-8e5b-63edb828d64a-etc-swift podName:1851124e-2722-4628-8e5b-63edb828d64a nodeName:}" failed. No retries permitted until 2025-12-05 12:08:39.910701224 +0000 UTC m=+1204.403415947 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1851124e-2722-4628-8e5b-63edb828d64a-etc-swift") pod "swift-storage-0" (UID: "1851124e-2722-4628-8e5b-63edb828d64a") : configmap "swift-ring-files" not found Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.917172 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fb039a0-4bf2-48bc-8725-c2fb821e19e5-kube-api-access-44mck" (OuterVolumeSpecName: "kube-api-access-44mck") pod "7fb039a0-4bf2-48bc-8725-c2fb821e19e5" (UID: "7fb039a0-4bf2-48bc-8725-c2fb821e19e5"). InnerVolumeSpecName "kube-api-access-44mck". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.932731 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.933316 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fb039a0-4bf2-48bc-8725-c2fb821e19e5-config" (OuterVolumeSpecName: "config") pod "7fb039a0-4bf2-48bc-8725-c2fb821e19e5" (UID: "7fb039a0-4bf2-48bc-8725-c2fb821e19e5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:08:37 crc kubenswrapper[4763]: I1205 12:08:37.943703 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fb039a0-4bf2-48bc-8725-c2fb821e19e5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7fb039a0-4bf2-48bc-8725-c2fb821e19e5" (UID: "7fb039a0-4bf2-48bc-8725-c2fb821e19e5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:08:37 crc kubenswrapper[4763]: W1205 12:08:37.980026 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0842eda_d10a_4bb6_9f7d_a2d83fd62e2c.slice/crio-09968a0187614287d2804adfd8067d093cf7776675c28f14bb1babb1ccad221f WatchSource:0}: Error finding container 09968a0187614287d2804adfd8067d093cf7776675c28f14bb1babb1ccad221f: Status 404 returned error can't find the container with id 09968a0187614287d2804adfd8067d093cf7776675c28f14bb1babb1ccad221f Dec 05 12:08:38 crc kubenswrapper[4763]: I1205 12:08:38.013973 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2de9b0c2-f628-4ae1-9b10-f62e7360eb7a-dns-svc\") pod \"2de9b0c2-f628-4ae1-9b10-f62e7360eb7a\" (UID: \"2de9b0c2-f628-4ae1-9b10-f62e7360eb7a\") " Dec 05 12:08:38 crc kubenswrapper[4763]: I1205 12:08:38.014127 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8l6h\" (UniqueName: \"kubernetes.io/projected/2de9b0c2-f628-4ae1-9b10-f62e7360eb7a-kube-api-access-k8l6h\") pod \"2de9b0c2-f628-4ae1-9b10-f62e7360eb7a\" (UID: \"2de9b0c2-f628-4ae1-9b10-f62e7360eb7a\") " Dec 05 12:08:38 crc kubenswrapper[4763]: I1205 12:08:38.014235 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2de9b0c2-f628-4ae1-9b10-f62e7360eb7a-config\") pod \"2de9b0c2-f628-4ae1-9b10-f62e7360eb7a\" (UID: \"2de9b0c2-f628-4ae1-9b10-f62e7360eb7a\") " Dec 05 12:08:38 crc kubenswrapper[4763]: I1205 12:08:38.015111 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44mck\" (UniqueName: \"kubernetes.io/projected/7fb039a0-4bf2-48bc-8725-c2fb821e19e5-kube-api-access-44mck\") on node \"crc\" DevicePath \"\"" Dec 05 12:08:38 crc kubenswrapper[4763]: I1205 12:08:38.015133 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fb039a0-4bf2-48bc-8725-c2fb821e19e5-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 12:08:38 crc kubenswrapper[4763]: I1205 12:08:38.015144 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fb039a0-4bf2-48bc-8725-c2fb821e19e5-config\") on node \"crc\" DevicePath \"\"" Dec 05 12:08:38 crc kubenswrapper[4763]: I1205 12:08:38.020825 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2de9b0c2-f628-4ae1-9b10-f62e7360eb7a-kube-api-access-k8l6h" (OuterVolumeSpecName: "kube-api-access-k8l6h") pod "2de9b0c2-f628-4ae1-9b10-f62e7360eb7a" (UID: "2de9b0c2-f628-4ae1-9b10-f62e7360eb7a"). InnerVolumeSpecName "kube-api-access-k8l6h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:08:38 crc kubenswrapper[4763]: I1205 12:08:38.063592 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2de9b0c2-f628-4ae1-9b10-f62e7360eb7a-config" (OuterVolumeSpecName: "config") pod "2de9b0c2-f628-4ae1-9b10-f62e7360eb7a" (UID: "2de9b0c2-f628-4ae1-9b10-f62e7360eb7a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:08:38 crc kubenswrapper[4763]: I1205 12:08:38.066564 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2de9b0c2-f628-4ae1-9b10-f62e7360eb7a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2de9b0c2-f628-4ae1-9b10-f62e7360eb7a" (UID: "2de9b0c2-f628-4ae1-9b10-f62e7360eb7a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:08:38 crc kubenswrapper[4763]: I1205 12:08:38.116670 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2de9b0c2-f628-4ae1-9b10-f62e7360eb7a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 12:08:38 crc kubenswrapper[4763]: I1205 12:08:38.116713 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8l6h\" (UniqueName: \"kubernetes.io/projected/2de9b0c2-f628-4ae1-9b10-f62e7360eb7a-kube-api-access-k8l6h\") on node \"crc\" DevicePath \"\"" Dec 05 12:08:38 crc kubenswrapper[4763]: I1205 12:08:38.116730 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2de9b0c2-f628-4ae1-9b10-f62e7360eb7a-config\") on node \"crc\" DevicePath \"\"" Dec 05 12:08:38 crc kubenswrapper[4763]: I1205 12:08:38.315541 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-pzxb5" event={"ID":"a38e41f6-6247-4c91-abba-0bc65d1c2127","Type":"ContainerStarted","Data":"e638ef0763633cea8fea65cd7e393b2e6f4be9cc45a1510f55079f9db18fb1a5"} Dec 05 12:08:38 crc kubenswrapper[4763]: I1205 12:08:38.323876 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-dbfcb" Dec 05 12:08:38 crc kubenswrapper[4763]: I1205 12:08:38.324551 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-dbfcb" event={"ID":"7fb039a0-4bf2-48bc-8725-c2fb821e19e5","Type":"ContainerDied","Data":"6eda0400489da403065cafd357b9a9b6edc37c1bed13d3448294d27d2ca6c091"} Dec 05 12:08:38 crc kubenswrapper[4763]: I1205 12:08:38.324623 4763 scope.go:117] "RemoveContainer" containerID="c4713a2fe10d85fbd7e6289ba9a786d124e70cf7c15b74c0dd3bedf603bf8373" Dec 05 12:08:38 crc kubenswrapper[4763]: I1205 12:08:38.346833 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c","Type":"ContainerStarted","Data":"09968a0187614287d2804adfd8067d093cf7776675c28f14bb1babb1ccad221f"} Dec 05 12:08:38 crc kubenswrapper[4763]: I1205 12:08:38.352960 4763 generic.go:334] "Generic (PLEG): container finished" podID="bb42f35f-9a55-470e-b238-98c4c0a5b455" containerID="a89a166661ebf000a1a2380745010fa8ebff12ca3d85a6997b358d8c69c7b882" exitCode=0 Dec 05 12:08:38 crc kubenswrapper[4763]: I1205 12:08:38.353030 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-2sdtf" event={"ID":"bb42f35f-9a55-470e-b238-98c4c0a5b455","Type":"ContainerDied","Data":"a89a166661ebf000a1a2380745010fa8ebff12ca3d85a6997b358d8c69c7b882"} Dec 05 12:08:38 crc kubenswrapper[4763]: I1205 12:08:38.364608 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d65f699f-qx8fd" event={"ID":"68f63de8-38ac-41e4-8c77-0cdddb57b631","Type":"ContainerStarted","Data":"f5d935368097de481342626f23a8da9508335c4824129e823d0bd0f7688a9759"} Dec 05 12:08:38 crc kubenswrapper[4763]: I1205 12:08:38.364678 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d65f699f-qx8fd" Dec 05 12:08:38 crc kubenswrapper[4763]: I1205 12:08:38.380740 4763 generic.go:334] "Generic (PLEG): container finished" podID="96338136-6831-49d0-9eb9-77d1205c6afb" containerID="6e6e3cfeab8af452b7eac351a2125ef9c911ea4fcd52b1f8631b40c9322e72b2" exitCode=0 Dec 05 12:08:38 crc kubenswrapper[4763]: I1205 12:08:38.380903 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerDied","Data":"6e6e3cfeab8af452b7eac351a2125ef9c911ea4fcd52b1f8631b40c9322e72b2"} Dec 05 12:08:38 crc kubenswrapper[4763]: I1205 12:08:38.380948 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerStarted","Data":"5c8f5e57fa75e813c5cdc2f19d0235194d315983bfff446fbbe3434d7a817539"} Dec 05 12:08:38 crc kubenswrapper[4763]: I1205 12:08:38.394853 4763 scope.go:117] "RemoveContainer" containerID="d9f5daa13f390f2b68ce52c3ddbc0360f2ce72002e23d581fe40bd421b3cff77" Dec 05 12:08:38 crc kubenswrapper[4763]: I1205 12:08:38.407842 4763 generic.go:334] "Generic (PLEG): container finished" podID="2de9b0c2-f628-4ae1-9b10-f62e7360eb7a" containerID="facc28983fe6b16c0a777f8ccad31cc1214fa0bd9b3694d6d792cf2d4972cda1" exitCode=0 Dec 05 12:08:38 crc kubenswrapper[4763]: I1205 12:08:38.409107 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gp47r" Dec 05 12:08:38 crc kubenswrapper[4763]: I1205 12:08:38.409422 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gp47r" event={"ID":"2de9b0c2-f628-4ae1-9b10-f62e7360eb7a","Type":"ContainerDied","Data":"facc28983fe6b16c0a777f8ccad31cc1214fa0bd9b3694d6d792cf2d4972cda1"} Dec 05 12:08:38 crc kubenswrapper[4763]: I1205 12:08:38.409496 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gp47r" event={"ID":"2de9b0c2-f628-4ae1-9b10-f62e7360eb7a","Type":"ContainerDied","Data":"e54766f591ff9b580c40f46cd996b7a605672f9baae3dfb86120be19fa13ab34"} Dec 05 12:08:38 crc kubenswrapper[4763]: I1205 12:08:38.431736 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-dbfcb"] Dec 05 12:08:38 crc kubenswrapper[4763]: I1205 12:08:38.441495 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-dbfcb"] Dec 05 12:08:38 crc kubenswrapper[4763]: I1205 12:08:38.485419 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d65f699f-qx8fd" podStartSLOduration=3.485393845 podStartE2EDuration="3.485393845s" podCreationTimestamp="2025-12-05 12:08:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:08:38.46370458 +0000 UTC m=+1202.956419303" watchObservedRunningTime="2025-12-05 12:08:38.485393845 +0000 UTC m=+1202.978108578" Dec 05 12:08:38 crc kubenswrapper[4763]: I1205 12:08:38.537909 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gp47r"] Dec 05 12:08:38 crc kubenswrapper[4763]: I1205 12:08:38.538498 4763 scope.go:117] "RemoveContainer" containerID="facc28983fe6b16c0a777f8ccad31cc1214fa0bd9b3694d6d792cf2d4972cda1" Dec 05 12:08:38 crc kubenswrapper[4763]: I1205 12:08:38.544891 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gp47r"] Dec 05 12:08:38 crc kubenswrapper[4763]: I1205 12:08:38.630941 4763 scope.go:117] "RemoveContainer" containerID="92c42e9828aa17372ed1732123d4bd158bcdf50178087a57de133be7c3b17b16" Dec 05 12:08:38 crc kubenswrapper[4763]: I1205 12:08:38.704110 4763 scope.go:117] "RemoveContainer" containerID="facc28983fe6b16c0a777f8ccad31cc1214fa0bd9b3694d6d792cf2d4972cda1" Dec 05 12:08:38 crc kubenswrapper[4763]: E1205 12:08:38.706503 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"facc28983fe6b16c0a777f8ccad31cc1214fa0bd9b3694d6d792cf2d4972cda1\": container with ID starting with facc28983fe6b16c0a777f8ccad31cc1214fa0bd9b3694d6d792cf2d4972cda1 not found: ID does not exist" containerID="facc28983fe6b16c0a777f8ccad31cc1214fa0bd9b3694d6d792cf2d4972cda1" Dec 05 12:08:38 crc kubenswrapper[4763]: I1205 12:08:38.706548 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"facc28983fe6b16c0a777f8ccad31cc1214fa0bd9b3694d6d792cf2d4972cda1"} err="failed to get container status \"facc28983fe6b16c0a777f8ccad31cc1214fa0bd9b3694d6d792cf2d4972cda1\": rpc error: code = NotFound desc = could not find container \"facc28983fe6b16c0a777f8ccad31cc1214fa0bd9b3694d6d792cf2d4972cda1\": container with ID starting with facc28983fe6b16c0a777f8ccad31cc1214fa0bd9b3694d6d792cf2d4972cda1 not found: ID does not exist" Dec 05 12:08:38 crc kubenswrapper[4763]: 
I1205 12:08:38.706608 4763 scope.go:117] "RemoveContainer" containerID="92c42e9828aa17372ed1732123d4bd158bcdf50178087a57de133be7c3b17b16" Dec 05 12:08:38 crc kubenswrapper[4763]: E1205 12:08:38.706930 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92c42e9828aa17372ed1732123d4bd158bcdf50178087a57de133be7c3b17b16\": container with ID starting with 92c42e9828aa17372ed1732123d4bd158bcdf50178087a57de133be7c3b17b16 not found: ID does not exist" containerID="92c42e9828aa17372ed1732123d4bd158bcdf50178087a57de133be7c3b17b16" Dec 05 12:08:38 crc kubenswrapper[4763]: I1205 12:08:38.706958 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92c42e9828aa17372ed1732123d4bd158bcdf50178087a57de133be7c3b17b16"} err="failed to get container status \"92c42e9828aa17372ed1732123d4bd158bcdf50178087a57de133be7c3b17b16\": rpc error: code = NotFound desc = could not find container \"92c42e9828aa17372ed1732123d4bd158bcdf50178087a57de133be7c3b17b16\": container with ID starting with 92c42e9828aa17372ed1732123d4bd158bcdf50178087a57de133be7c3b17b16 not found: ID does not exist" Dec 05 12:08:39 crc kubenswrapper[4763]: I1205 12:08:39.430345 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-2sdtf" event={"ID":"bb42f35f-9a55-470e-b238-98c4c0a5b455","Type":"ContainerStarted","Data":"3ff5dba6344684bbb6b452f86ef96347b491be9212296ebebccae7232c3eabfe"} Dec 05 12:08:39 crc kubenswrapper[4763]: I1205 12:08:39.430822 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-2sdtf" Dec 05 12:08:39 crc kubenswrapper[4763]: I1205 12:08:39.797960 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2de9b0c2-f628-4ae1-9b10-f62e7360eb7a" path="/var/lib/kubelet/pods/2de9b0c2-f628-4ae1-9b10-f62e7360eb7a/volumes" Dec 05 12:08:39 crc kubenswrapper[4763]: I1205 12:08:39.799113 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fb039a0-4bf2-48bc-8725-c2fb821e19e5" path="/var/lib/kubelet/pods/7fb039a0-4bf2-48bc-8725-c2fb821e19e5/volumes" Dec 05 12:08:39 crc kubenswrapper[4763]: I1205 12:08:39.965087 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1851124e-2722-4628-8e5b-63edb828d64a-etc-swift\") pod \"swift-storage-0\" (UID: \"1851124e-2722-4628-8e5b-63edb828d64a\") " pod="openstack/swift-storage-0" Dec 05 12:08:39 crc kubenswrapper[4763]: E1205 12:08:39.965362 4763 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 12:08:39 crc kubenswrapper[4763]: E1205 12:08:39.965411 4763 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 12:08:39 crc kubenswrapper[4763]: E1205 12:08:39.965499 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1851124e-2722-4628-8e5b-63edb828d64a-etc-swift podName:1851124e-2722-4628-8e5b-63edb828d64a nodeName:}" failed. No retries permitted until 2025-12-05 12:08:43.965472281 +0000 UTC m=+1208.458187004 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1851124e-2722-4628-8e5b-63edb828d64a-etc-swift") pod "swift-storage-0" (UID: "1851124e-2722-4628-8e5b-63edb828d64a") : configmap "swift-ring-files" not found Dec 05 12:08:41 crc kubenswrapper[4763]: I1205 12:08:41.092365 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 05 12:08:41 crc kubenswrapper[4763]: I1205 12:08:41.092516 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 05 12:08:41 crc kubenswrapper[4763]: I1205 12:08:41.177847 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 05 12:08:41 crc kubenswrapper[4763]: I1205 12:08:41.196490 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-2sdtf" podStartSLOduration=5.19647116 podStartE2EDuration="5.19647116s" podCreationTimestamp="2025-12-05 12:08:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:08:39.457683077 +0000 UTC m=+1203.950397820" watchObservedRunningTime="2025-12-05 12:08:41.19647116 +0000 UTC m=+1205.689185883" Dec 05 12:08:41 crc kubenswrapper[4763]: I1205 12:08:41.541856 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.241641 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-j9qhw"] Dec 05 12:08:42 crc kubenswrapper[4763]: E1205 12:08:42.242182 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2de9b0c2-f628-4ae1-9b10-f62e7360eb7a" containerName="dnsmasq-dns" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.242207 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de9b0c2-f628-4ae1-9b10-f62e7360eb7a" containerName="dnsmasq-dns" Dec 05 12:08:42 crc kubenswrapper[4763]: E1205 12:08:42.242230 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2de9b0c2-f628-4ae1-9b10-f62e7360eb7a" containerName="init" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.242241 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de9b0c2-f628-4ae1-9b10-f62e7360eb7a" containerName="init" Dec 05 12:08:42 crc kubenswrapper[4763]: E1205 12:08:42.242281 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fb039a0-4bf2-48bc-8725-c2fb821e19e5" containerName="init" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.242293 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fb039a0-4bf2-48bc-8725-c2fb821e19e5" containerName="init" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.242578 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="2de9b0c2-f628-4ae1-9b10-f62e7360eb7a" containerName="dnsmasq-dns" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.242628 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fb039a0-4bf2-48bc-8725-c2fb821e19e5" containerName="init" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.243533 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-j9qhw" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.251410 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-j9qhw"] Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.307426 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e09da9a3-6f2b-4b62-8953-acb7ea6a258c-operator-scripts\") pod \"keystone-db-create-j9qhw\" (UID: \"e09da9a3-6f2b-4b62-8953-acb7ea6a258c\") " pod="openstack/keystone-db-create-j9qhw" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.307528 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9mnx\" (UniqueName: \"kubernetes.io/projected/e09da9a3-6f2b-4b62-8953-acb7ea6a258c-kube-api-access-r9mnx\") pod \"keystone-db-create-j9qhw\" (UID: \"e09da9a3-6f2b-4b62-8953-acb7ea6a258c\") " pod="openstack/keystone-db-create-j9qhw" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.334367 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-37d5-account-create-update-d25r6"] Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.335350 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-37d5-account-create-update-d25r6" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.338315 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.345875 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-37d5-account-create-update-d25r6"] Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.367101 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.367211 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.409464 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9mnx\" (UniqueName: \"kubernetes.io/projected/e09da9a3-6f2b-4b62-8953-acb7ea6a258c-kube-api-access-r9mnx\") pod \"keystone-db-create-j9qhw\" (UID: \"e09da9a3-6f2b-4b62-8953-acb7ea6a258c\") " pod="openstack/keystone-db-create-j9qhw" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.409649 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ad4cf77-6dbb-4cd3-b404-01f3d5752403-operator-scripts\") pod \"keystone-37d5-account-create-update-d25r6\" (UID: \"3ad4cf77-6dbb-4cd3-b404-01f3d5752403\") " pod="openstack/keystone-37d5-account-create-update-d25r6" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.409729 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e09da9a3-6f2b-4b62-8953-acb7ea6a258c-operator-scripts\") pod \"keystone-db-create-j9qhw\" (UID: \"e09da9a3-6f2b-4b62-8953-acb7ea6a258c\") " pod="openstack/keystone-db-create-j9qhw" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.409837 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ffhq\" (UniqueName: 
\"kubernetes.io/projected/3ad4cf77-6dbb-4cd3-b404-01f3d5752403-kube-api-access-5ffhq\") pod \"keystone-37d5-account-create-update-d25r6\" (UID: \"3ad4cf77-6dbb-4cd3-b404-01f3d5752403\") " pod="openstack/keystone-37d5-account-create-update-d25r6" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.411664 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e09da9a3-6f2b-4b62-8953-acb7ea6a258c-operator-scripts\") pod \"keystone-db-create-j9qhw\" (UID: \"e09da9a3-6f2b-4b62-8953-acb7ea6a258c\") " pod="openstack/keystone-db-create-j9qhw" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.432872 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9mnx\" (UniqueName: \"kubernetes.io/projected/e09da9a3-6f2b-4b62-8953-acb7ea6a258c-kube-api-access-r9mnx\") pod \"keystone-db-create-j9qhw\" (UID: \"e09da9a3-6f2b-4b62-8953-acb7ea6a258c\") " pod="openstack/keystone-db-create-j9qhw" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.453655 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-vxwjr"] Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.454756 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-vxwjr" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.462816 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-vxwjr"] Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.463700 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.513018 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ad4cf77-6dbb-4cd3-b404-01f3d5752403-operator-scripts\") pod \"keystone-37d5-account-create-update-d25r6\" (UID: \"3ad4cf77-6dbb-4cd3-b404-01f3d5752403\") " pod="openstack/keystone-37d5-account-create-update-d25r6" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.513504 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ffhq\" (UniqueName: \"kubernetes.io/projected/3ad4cf77-6dbb-4cd3-b404-01f3d5752403-kube-api-access-5ffhq\") pod \"keystone-37d5-account-create-update-d25r6\" (UID: \"3ad4cf77-6dbb-4cd3-b404-01f3d5752403\") " pod="openstack/keystone-37d5-account-create-update-d25r6" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.514124 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ad4cf77-6dbb-4cd3-b404-01f3d5752403-operator-scripts\") pod \"keystone-37d5-account-create-update-d25r6\" (UID: \"3ad4cf77-6dbb-4cd3-b404-01f3d5752403\") " pod="openstack/keystone-37d5-account-create-update-d25r6" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.533206 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-01df-account-create-update-9485v"] Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.534584 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-01df-account-create-update-9485v" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.536990 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.541860 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-01df-account-create-update-9485v"] Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.544018 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ffhq\" (UniqueName: \"kubernetes.io/projected/3ad4cf77-6dbb-4cd3-b404-01f3d5752403-kube-api-access-5ffhq\") pod \"keystone-37d5-account-create-update-d25r6\" (UID: \"3ad4cf77-6dbb-4cd3-b404-01f3d5752403\") " pod="openstack/keystone-37d5-account-create-update-d25r6" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.605931 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.607189 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-j9qhw" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.615412 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qz49\" (UniqueName: \"kubernetes.io/projected/74cdf9d1-2d3a-4822-8353-508112d2bf7d-kube-api-access-7qz49\") pod \"placement-db-create-vxwjr\" (UID: \"74cdf9d1-2d3a-4822-8353-508112d2bf7d\") " pod="openstack/placement-db-create-vxwjr" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.615495 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74cdf9d1-2d3a-4822-8353-508112d2bf7d-operator-scripts\") pod \"placement-db-create-vxwjr\" (UID: \"74cdf9d1-2d3a-4822-8353-508112d2bf7d\") " pod="openstack/placement-db-create-vxwjr" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.669798 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-37d5-account-create-update-d25r6" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.722634 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-phxtr"] Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.723749 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-phxtr" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.730404 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c72e8b76-2c58-49de-af50-45474900f16f-operator-scripts\") pod \"glance-db-create-phxtr\" (UID: \"c72e8b76-2c58-49de-af50-45474900f16f\") " pod="openstack/glance-db-create-phxtr" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.730465 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qz49\" (UniqueName: \"kubernetes.io/projected/74cdf9d1-2d3a-4822-8353-508112d2bf7d-kube-api-access-7qz49\") pod \"placement-db-create-vxwjr\" (UID: \"74cdf9d1-2d3a-4822-8353-508112d2bf7d\") " pod="openstack/placement-db-create-vxwjr" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.730497 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74cdf9d1-2d3a-4822-8353-508112d2bf7d-operator-scripts\") pod \"placement-db-create-vxwjr\" (UID: \"74cdf9d1-2d3a-4822-8353-508112d2bf7d\") " pod="openstack/placement-db-create-vxwjr" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.730559 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcggl\" (UniqueName: \"kubernetes.io/projected/c72e8b76-2c58-49de-af50-45474900f16f-kube-api-access-wcggl\") pod \"glance-db-create-phxtr\" (UID: \"c72e8b76-2c58-49de-af50-45474900f16f\") " pod="openstack/glance-db-create-phxtr" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.730581 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfj22\" (UniqueName: \"kubernetes.io/projected/b6b89b04-af02-4afb-bdb4-99c97bcfc9e0-kube-api-access-mfj22\") pod \"placement-01df-account-create-update-9485v\" (UID: \"b6b89b04-af02-4afb-bdb4-99c97bcfc9e0\") " pod="openstack/placement-01df-account-create-update-9485v" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.730616 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6b89b04-af02-4afb-bdb4-99c97bcfc9e0-operator-scripts\") pod \"placement-01df-account-create-update-9485v\" (UID: \"b6b89b04-af02-4afb-bdb4-99c97bcfc9e0\") " pod="openstack/placement-01df-account-create-update-9485v" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.731516 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74cdf9d1-2d3a-4822-8353-508112d2bf7d-operator-scripts\") pod \"placement-db-create-vxwjr\" (UID: \"74cdf9d1-2d3a-4822-8353-508112d2bf7d\") " pod="openstack/placement-db-create-vxwjr" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.732699 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-phxtr"] Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.753235 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qz49\" (UniqueName: \"kubernetes.io/projected/74cdf9d1-2d3a-4822-8353-508112d2bf7d-kube-api-access-7qz49\") pod \"placement-db-create-vxwjr\" (UID: \"74cdf9d1-2d3a-4822-8353-508112d2bf7d\") " pod="openstack/placement-db-create-vxwjr" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.784798 4763 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-vxwjr" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.835709 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcggl\" (UniqueName: \"kubernetes.io/projected/c72e8b76-2c58-49de-af50-45474900f16f-kube-api-access-wcggl\") pod \"glance-db-create-phxtr\" (UID: \"c72e8b76-2c58-49de-af50-45474900f16f\") " pod="openstack/glance-db-create-phxtr" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.835796 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfj22\" (UniqueName: \"kubernetes.io/projected/b6b89b04-af02-4afb-bdb4-99c97bcfc9e0-kube-api-access-mfj22\") pod \"placement-01df-account-create-update-9485v\" (UID: \"b6b89b04-af02-4afb-bdb4-99c97bcfc9e0\") " pod="openstack/placement-01df-account-create-update-9485v" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.836179 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-fc84-account-create-update-7nn47"] Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.836204 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6b89b04-af02-4afb-bdb4-99c97bcfc9e0-operator-scripts\") pod \"placement-01df-account-create-update-9485v\" (UID: \"b6b89b04-af02-4afb-bdb4-99c97bcfc9e0\") " pod="openstack/placement-01df-account-create-update-9485v" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.836984 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6b89b04-af02-4afb-bdb4-99c97bcfc9e0-operator-scripts\") pod \"placement-01df-account-create-update-9485v\" (UID: \"b6b89b04-af02-4afb-bdb4-99c97bcfc9e0\") " pod="openstack/placement-01df-account-create-update-9485v" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.837132 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c72e8b76-2c58-49de-af50-45474900f16f-operator-scripts\") pod \"glance-db-create-phxtr\" (UID: \"c72e8b76-2c58-49de-af50-45474900f16f\") " pod="openstack/glance-db-create-phxtr" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.837216 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-fc84-account-create-update-7nn47" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.837901 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c72e8b76-2c58-49de-af50-45474900f16f-operator-scripts\") pod \"glance-db-create-phxtr\" (UID: \"c72e8b76-2c58-49de-af50-45474900f16f\") " pod="openstack/glance-db-create-phxtr" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.840905 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.849534 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-fc84-account-create-update-7nn47"] Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.860456 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcggl\" (UniqueName: \"kubernetes.io/projected/c72e8b76-2c58-49de-af50-45474900f16f-kube-api-access-wcggl\") pod \"glance-db-create-phxtr\" (UID: \"c72e8b76-2c58-49de-af50-45474900f16f\") " pod="openstack/glance-db-create-phxtr" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.864034 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfj22\" (UniqueName: \"kubernetes.io/projected/b6b89b04-af02-4afb-bdb4-99c97bcfc9e0-kube-api-access-mfj22\") pod \"placement-01df-account-create-update-9485v\" (UID: \"b6b89b04-af02-4afb-bdb4-99c97bcfc9e0\") " pod="openstack/placement-01df-account-create-update-9485v" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.883159 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-01df-account-create-update-9485v" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.938253 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad6e2524-1d54-4e4e-834d-2176cb504743-operator-scripts\") pod \"glance-fc84-account-create-update-7nn47\" (UID: \"ad6e2524-1d54-4e4e-834d-2176cb504743\") " pod="openstack/glance-fc84-account-create-update-7nn47" Dec 05 12:08:42 crc kubenswrapper[4763]: I1205 12:08:42.938380 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9crd\" (UniqueName: \"kubernetes.io/projected/ad6e2524-1d54-4e4e-834d-2176cb504743-kube-api-access-d9crd\") pod \"glance-fc84-account-create-update-7nn47\" (UID: \"ad6e2524-1d54-4e4e-834d-2176cb504743\") " pod="openstack/glance-fc84-account-create-update-7nn47" Dec 05 12:08:43 crc kubenswrapper[4763]: I1205 12:08:43.039746 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad6e2524-1d54-4e4e-834d-2176cb504743-operator-scripts\") pod \"glance-fc84-account-create-update-7nn47\" (UID: \"ad6e2524-1d54-4e4e-834d-2176cb504743\") " pod="openstack/glance-fc84-account-create-update-7nn47" Dec 05 12:08:43 crc kubenswrapper[4763]: I1205 12:08:43.039943 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9crd\" (UniqueName: \"kubernetes.io/projected/ad6e2524-1d54-4e4e-834d-2176cb504743-kube-api-access-d9crd\") pod \"glance-fc84-account-create-update-7nn47\" (UID: \"ad6e2524-1d54-4e4e-834d-2176cb504743\") " pod="openstack/glance-fc84-account-create-update-7nn47" Dec 05 12:08:43 crc kubenswrapper[4763]: 
I1205 12:08:43.040866 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad6e2524-1d54-4e4e-834d-2176cb504743-operator-scripts\") pod \"glance-fc84-account-create-update-7nn47\" (UID: \"ad6e2524-1d54-4e4e-834d-2176cb504743\") " pod="openstack/glance-fc84-account-create-update-7nn47" Dec 05 12:08:43 crc kubenswrapper[4763]: I1205 12:08:43.051356 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-phxtr" Dec 05 12:08:43 crc kubenswrapper[4763]: I1205 12:08:43.062695 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9crd\" (UniqueName: \"kubernetes.io/projected/ad6e2524-1d54-4e4e-834d-2176cb504743-kube-api-access-d9crd\") pod \"glance-fc84-account-create-update-7nn47\" (UID: \"ad6e2524-1d54-4e4e-834d-2176cb504743\") " pod="openstack/glance-fc84-account-create-update-7nn47" Dec 05 12:08:43 crc kubenswrapper[4763]: I1205 12:08:43.221040 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-fc84-account-create-update-7nn47" Dec 05 12:08:44 crc kubenswrapper[4763]: I1205 12:08:44.056100 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1851124e-2722-4628-8e5b-63edb828d64a-etc-swift\") pod \"swift-storage-0\" (UID: \"1851124e-2722-4628-8e5b-63edb828d64a\") " pod="openstack/swift-storage-0" Dec 05 12:08:44 crc kubenswrapper[4763]: E1205 12:08:44.056367 4763 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 12:08:44 crc kubenswrapper[4763]: E1205 12:08:44.056410 4763 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 12:08:44 crc kubenswrapper[4763]: E1205 12:08:44.056483 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1851124e-2722-4628-8e5b-63edb828d64a-etc-swift podName:1851124e-2722-4628-8e5b-63edb828d64a nodeName:}" failed. No retries permitted until 2025-12-05 12:08:52.056459853 +0000 UTC m=+1216.549174576 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1851124e-2722-4628-8e5b-63edb828d64a-etc-swift") pod "swift-storage-0" (UID: "1851124e-2722-4628-8e5b-63edb828d64a") : configmap "swift-ring-files" not found Dec 05 12:08:44 crc kubenswrapper[4763]: I1205 12:08:44.728385 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-create-s9b6d"] Dec 05 12:08:44 crc kubenswrapper[4763]: I1205 12:08:44.729755 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-s9b6d" Dec 05 12:08:44 crc kubenswrapper[4763]: I1205 12:08:44.736239 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-s9b6d"] Dec 05 12:08:44 crc kubenswrapper[4763]: I1205 12:08:44.843867 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-6476-account-create-update-g45vx"] Dec 05 12:08:44 crc kubenswrapper[4763]: I1205 12:08:44.845061 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-6476-account-create-update-g45vx" Dec 05 12:08:44 crc kubenswrapper[4763]: I1205 12:08:44.847489 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-db-secret" Dec 05 12:08:44 crc kubenswrapper[4763]: I1205 12:08:44.857673 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-6476-account-create-update-g45vx"] Dec 05 12:08:44 crc kubenswrapper[4763]: I1205 12:08:44.869902 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw4sv\" (UniqueName: \"kubernetes.io/projected/8e8af298-3905-4251-b35c-77f7a535aafb-kube-api-access-lw4sv\") pod \"watcher-db-create-s9b6d\" (UID: \"8e8af298-3905-4251-b35c-77f7a535aafb\") " pod="openstack/watcher-db-create-s9b6d" Dec 05 12:08:44 crc kubenswrapper[4763]: I1205 12:08:44.870040 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e8af298-3905-4251-b35c-77f7a535aafb-operator-scripts\") pod \"watcher-db-create-s9b6d\" (UID: \"8e8af298-3905-4251-b35c-77f7a535aafb\") " pod="openstack/watcher-db-create-s9b6d" Dec 05 12:08:44 crc kubenswrapper[4763]: I1205 12:08:44.972063 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw4sv\" (UniqueName: \"kubernetes.io/projected/8e8af298-3905-4251-b35c-77f7a535aafb-kube-api-access-lw4sv\") pod \"watcher-db-create-s9b6d\" (UID: \"8e8af298-3905-4251-b35c-77f7a535aafb\") " pod="openstack/watcher-db-create-s9b6d" Dec 05 12:08:44 crc kubenswrapper[4763]: I1205 12:08:44.972167 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/752707c1-f306-4d60-bd81-7c77a2df4e4f-operator-scripts\") pod \"watcher-6476-account-create-update-g45vx\" (UID: \"752707c1-f306-4d60-bd81-7c77a2df4e4f\") " pod="openstack/watcher-6476-account-create-update-g45vx" Dec 05 12:08:44 crc kubenswrapper[4763]: I1205 12:08:44.972374 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd4xc\" (UniqueName: \"kubernetes.io/projected/752707c1-f306-4d60-bd81-7c77a2df4e4f-kube-api-access-vd4xc\") pod \"watcher-6476-account-create-update-g45vx\" (UID: \"752707c1-f306-4d60-bd81-7c77a2df4e4f\") " pod="openstack/watcher-6476-account-create-update-g45vx" Dec 05 12:08:44 crc kubenswrapper[4763]: I1205 12:08:44.972420 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e8af298-3905-4251-b35c-77f7a535aafb-operator-scripts\") pod \"watcher-db-create-s9b6d\" (UID: \"8e8af298-3905-4251-b35c-77f7a535aafb\") " pod="openstack/watcher-db-create-s9b6d" Dec 05 12:08:44 crc kubenswrapper[4763]: I1205 12:08:44.973236 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e8af298-3905-4251-b35c-77f7a535aafb-operator-scripts\") pod \"watcher-db-create-s9b6d\" (UID: \"8e8af298-3905-4251-b35c-77f7a535aafb\") " pod="openstack/watcher-db-create-s9b6d" Dec 05 12:08:44 crc kubenswrapper[4763]: I1205 12:08:44.990577 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw4sv\" (UniqueName: \"kubernetes.io/projected/8e8af298-3905-4251-b35c-77f7a535aafb-kube-api-access-lw4sv\") pod 
\"watcher-db-create-s9b6d\" (UID: \"8e8af298-3905-4251-b35c-77f7a535aafb\") " pod="openstack/watcher-db-create-s9b6d" Dec 05 12:08:45 crc kubenswrapper[4763]: I1205 12:08:45.063608 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-s9b6d" Dec 05 12:08:45 crc kubenswrapper[4763]: I1205 12:08:45.073717 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/752707c1-f306-4d60-bd81-7c77a2df4e4f-operator-scripts\") pod \"watcher-6476-account-create-update-g45vx\" (UID: \"752707c1-f306-4d60-bd81-7c77a2df4e4f\") " pod="openstack/watcher-6476-account-create-update-g45vx" Dec 05 12:08:45 crc kubenswrapper[4763]: I1205 12:08:45.073802 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd4xc\" (UniqueName: \"kubernetes.io/projected/752707c1-f306-4d60-bd81-7c77a2df4e4f-kube-api-access-vd4xc\") pod \"watcher-6476-account-create-update-g45vx\" (UID: \"752707c1-f306-4d60-bd81-7c77a2df4e4f\") " pod="openstack/watcher-6476-account-create-update-g45vx" Dec 05 12:08:45 crc kubenswrapper[4763]: I1205 12:08:45.074441 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/752707c1-f306-4d60-bd81-7c77a2df4e4f-operator-scripts\") pod \"watcher-6476-account-create-update-g45vx\" (UID: \"752707c1-f306-4d60-bd81-7c77a2df4e4f\") " pod="openstack/watcher-6476-account-create-update-g45vx" Dec 05 12:08:45 crc kubenswrapper[4763]: I1205 12:08:45.095476 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd4xc\" (UniqueName: \"kubernetes.io/projected/752707c1-f306-4d60-bd81-7c77a2df4e4f-kube-api-access-vd4xc\") pod \"watcher-6476-account-create-update-g45vx\" (UID: \"752707c1-f306-4d60-bd81-7c77a2df4e4f\") " pod="openstack/watcher-6476-account-create-update-g45vx" Dec 05 12:08:45 crc kubenswrapper[4763]: I1205 12:08:45.160204 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-6476-account-create-update-g45vx" Dec 05 12:08:45 crc kubenswrapper[4763]: I1205 12:08:45.957939 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d65f699f-qx8fd" Dec 05 12:08:46 crc kubenswrapper[4763]: I1205 12:08:46.627753 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-2sdtf" Dec 05 12:08:46 crc kubenswrapper[4763]: I1205 12:08:46.682273 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d65f699f-qx8fd"] Dec 05 12:08:46 crc kubenswrapper[4763]: I1205 12:08:46.682543 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d65f699f-qx8fd" podUID="68f63de8-38ac-41e4-8c77-0cdddb57b631" containerName="dnsmasq-dns" containerID="cri-o://f5d935368097de481342626f23a8da9508335c4824129e823d0bd0f7688a9759" gracePeriod=10 Dec 05 12:08:49 crc kubenswrapper[4763]: E1205 12:08:49.278949 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:17ea20be390a94ab39f5cdd7f0cbc2498046eebcf77fe3dec9aa288d5c2cf46b" Dec 05 12:08:49 crc kubenswrapper[4763]: E1205 12:08:49.280475 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus,Image:registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:17ea20be390a94ab39f5cdd7f0cbc2498046eebcf77fe3dec9aa288d5c2cf46b,Command:[],Args:[--config.file=/etc/prometheus/config_out/prometheus.env.yaml --web.enable-lifecycle --web.enable-remote-write-receiver --web.route-prefix=/ --storage.tsdb.retention.time=24h --storage.tsdb.path=/prometheus --web.config.file=/etc/prometheus/web_config/web-config.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:web,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-out,ReadOnly:true,MountPath:/etc/prometheus/config_out,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tls-assets,ReadOnly:true,MountPath:/etc/prometheus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-db,ReadOnly:false,MountPath:/prometheus,SubPath:prometheus-db,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-rulefiles-0,ReadOnly:false,MountPath:/etc/prometheus/rules/prometheus-metric-storage-rulefiles-0,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:web-config,ReadOnly:true,MountPath:/etc/prometheus/web_config/web-config.yaml,SubPath:web-config.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hlbjb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/-/healthy,Port:{1 0 
web},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:3,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/-/ready,Port:{1 0 web},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:3,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/-/ready,Port:{1 0 web},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:3,PeriodSeconds:15,SuccessThreshold:1,FailureThreshold:60,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod prometheus-metric-storage-0_openstack(9d77e8a1-26ef-4525-b427-0a29a9b7a0fc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 12:08:49 crc kubenswrapper[4763]: I1205 12:08:49.539300 4763 generic.go:334] "Generic (PLEG): container finished" podID="342a4872-4478-4b3a-a984-7fd457348435" containerID="e5d4bd0ecdf9f2d64e7b76f1cc29940eb5ff28a3be38447d8b2ea6c36682c0b2" exitCode=0 Dec 05 12:08:49 crc kubenswrapper[4763]: I1205 12:08:49.539389 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"342a4872-4478-4b3a-a984-7fd457348435","Type":"ContainerDied","Data":"e5d4bd0ecdf9f2d64e7b76f1cc29940eb5ff28a3be38447d8b2ea6c36682c0b2"} Dec 05 12:08:49 crc kubenswrapper[4763]: I1205 12:08:49.550832 4763 generic.go:334] "Generic (PLEG): container finished" podID="68f63de8-38ac-41e4-8c77-0cdddb57b631" containerID="f5d935368097de481342626f23a8da9508335c4824129e823d0bd0f7688a9759" exitCode=0 Dec 05 12:08:49 crc kubenswrapper[4763]: I1205 12:08:49.550916 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d65f699f-qx8fd" event={"ID":"68f63de8-38ac-41e4-8c77-0cdddb57b631","Type":"ContainerDied","Data":"f5d935368097de481342626f23a8da9508335c4824129e823d0bd0f7688a9759"} Dec 05 12:08:49 crc kubenswrapper[4763]: I1205 12:08:49.553433 4763 generic.go:334] "Generic (PLEG): container finished" podID="85c70640-8bf7-419d-a96f-69ac3278710c" containerID="a066dfaf62c7d9eb5aa71115e560824faff700d177e6a7a635728162a869c5e8" exitCode=0 Dec 05 12:08:49 crc kubenswrapper[4763]: I1205 12:08:49.553473 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"85c70640-8bf7-419d-a96f-69ac3278710c","Type":"ContainerDied","Data":"a066dfaf62c7d9eb5aa71115e560824faff700d177e6a7a635728162a869c5e8"} Dec 05 12:08:50 crc kubenswrapper[4763]: E1205 12:08:50.401363 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified" Dec 05 12:08:50 crc kubenswrapper[4763]: E1205 12:08:50.401931 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovn-northd,Image:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,Command:[/usr/bin/ovn-northd],Args:[-vfile:off -vconsole:info --n-threads=1 --ovnnb-db=ssl:ovsdbserver-nb-0.openstack.svc.cluster.local:6641 --ovnsb-db=ssl:ovsdbserver-sb-0.openstack.svc.cluster.local:6642 --certificate=/etc/pki/tls/certs/ovndb.crt --private-key=/etc/pki/tls/private/ovndb.key --ca-cert=/etc/pki/tls/certs/ovndbca.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n88h68ch8h575h677hb4h5dchdfh5d4h55dh54h68bhf9hf4h67ch66fh667h8fh687h98h4h548h586h578h656h58fh5fch5f5h84h5bdh97h7fq,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:certs,Value:n6dhd4h565h696hbh596h557h64bhc9h668h586h686h577hb9h5d4h65bh66ch564h677h58fh545h664h5f9h65bh5dch597h58fh655h57chc5h7bhcdq,ValueFrom:nil,},EnvVar{Name:certs_metrics,Value:n647hf4hd5h578h696h56fh5fhc8h547h59dh67bh549h6ch67bh67ch85h5bfh56h5bch59bh5bch5b7h9bh645h5d9h69h684h88h5fch55bh65bhd8q,ValueFrom:nil,},EnvVar{Name:ovnnorthd-config,Value:n5c8h7ch56bh8dh8hc4h5dch9dh68h6bhb7h598h549h5dbh66fh6bh5b4h5cch5d6h55ch57fhfch588h89h5ddh5d6h65bh65bh8dhc4h67dh569q,ValueFrom:nil,},EnvVar{Name:ovnnorthd-scripts,Value:n664hd8h66ch58dh64hc9h66bhd4h558h697h67bh557hdch664h567h669h555h696h556h556h5fh5bh569hbh665h9dh4h9bh564hc8h5b7h5c4q,ValueFrom:nil,},EnvVar{Name:tls-ca-bundle.pem,Value:n5fdhd7h5cbh696hcfh5b5hb6h68hc5h5d6hcbhddh666hcfh597h9ch68bh566h598h57bh77h59ch54dhb6h9ch596h5b8h75h5c7h65dh5fch66fq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-northd-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-northd-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-northd-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pp4b4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/status_check.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/status_check.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:5,SuccessThreshold:1,Failu
reThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-northd-0_openstack(f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 12:08:51 crc kubenswrapper[4763]: E1205 12:08:51.225922 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified" Dec 05 12:08:51 crc kubenswrapper[4763]: E1205 12:08:51.226118 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:swift-ring-rebalance,Image:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,Command:[/usr/local/bin/swift-ring-tool all],Args:[],WorkingDir:/etc/swift,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CM_NAME,Value:swift-ring-files,ValueFrom:nil,},EnvVar{Name:NAMESPACE,Value:openstack,ValueFrom:nil,},EnvVar{Name:OWNER_APIVERSION,Value:swift.openstack.org/v1beta1,ValueFrom:nil,},EnvVar{Name:OWNER_KIND,Value:SwiftRing,ValueFrom:nil,},EnvVar{Name:OWNER_NAME,Value:swift-ring,ValueFrom:nil,},EnvVar{Name:OWNER_UID,Value:b30b998a-7da6-4723-952c-65e6754f6e25,ValueFrom:nil,},EnvVar{Name:SWIFT_MIN_PART_HOURS,Value:1,ValueFrom:nil,},EnvVar{Name:SWIFT_PART_POWER,Value:10,ValueFrom:nil,},EnvVar{Name:SWIFT_REPLICAS,Value:1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/swift-ring-tool,SubPath:swift-ring-tool,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:swiftconf,ReadOnly:true,MountPath:/etc/swift/swift.conf,SubPath:swift.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-swift,ReadOnly:false,MountPath:/etc/swift,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ring-data-devices,ReadOnly:true,MountPath:/var/lib/config-data/ring-devices,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dispersionconf,ReadOnly:true,MountPath:/etc/swift/dispersion.conf,SubPath:dispersion.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n8fzs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:
*42445,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-ring-rebalance-pzxb5_openstack(a38e41f6-6247-4c91-abba-0bc65d1c2127): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 12:08:51 crc kubenswrapper[4763]: E1205 12:08:51.227695 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"swift-ring-rebalance\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/swift-ring-rebalance-pzxb5" podUID="a38e41f6-6247-4c91-abba-0bc65d1c2127" Dec 05 12:08:51 crc kubenswrapper[4763]: I1205 12:08:51.298881 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d65f699f-qx8fd" Dec 05 12:08:51 crc kubenswrapper[4763]: I1205 12:08:51.488778 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9dj4\" (UniqueName: \"kubernetes.io/projected/68f63de8-38ac-41e4-8c77-0cdddb57b631-kube-api-access-s9dj4\") pod \"68f63de8-38ac-41e4-8c77-0cdddb57b631\" (UID: \"68f63de8-38ac-41e4-8c77-0cdddb57b631\") " Dec 05 12:08:51 crc kubenswrapper[4763]: I1205 12:08:51.489122 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68f63de8-38ac-41e4-8c77-0cdddb57b631-config\") pod \"68f63de8-38ac-41e4-8c77-0cdddb57b631\" (UID: \"68f63de8-38ac-41e4-8c77-0cdddb57b631\") " Dec 05 12:08:51 crc kubenswrapper[4763]: I1205 12:08:51.489249 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68f63de8-38ac-41e4-8c77-0cdddb57b631-ovsdbserver-nb\") pod \"68f63de8-38ac-41e4-8c77-0cdddb57b631\" (UID: \"68f63de8-38ac-41e4-8c77-0cdddb57b631\") " Dec 05 12:08:51 crc kubenswrapper[4763]: I1205 12:08:51.489290 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68f63de8-38ac-41e4-8c77-0cdddb57b631-dns-svc\") pod \"68f63de8-38ac-41e4-8c77-0cdddb57b631\" (UID: \"68f63de8-38ac-41e4-8c77-0cdddb57b631\") " Dec 05 12:08:51 crc kubenswrapper[4763]: I1205 12:08:51.563520 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68f63de8-38ac-41e4-8c77-0cdddb57b631-kube-api-access-s9dj4" (OuterVolumeSpecName: "kube-api-access-s9dj4") pod "68f63de8-38ac-41e4-8c77-0cdddb57b631" (UID: "68f63de8-38ac-41e4-8c77-0cdddb57b631"). InnerVolumeSpecName "kube-api-access-s9dj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:08:51 crc kubenswrapper[4763]: I1205 12:08:51.589095 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68f63de8-38ac-41e4-8c77-0cdddb57b631-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "68f63de8-38ac-41e4-8c77-0cdddb57b631" (UID: "68f63de8-38ac-41e4-8c77-0cdddb57b631"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:08:51 crc kubenswrapper[4763]: I1205 12:08:51.590647 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68f63de8-38ac-41e4-8c77-0cdddb57b631-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 12:08:51 crc kubenswrapper[4763]: I1205 12:08:51.590675 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9dj4\" (UniqueName: \"kubernetes.io/projected/68f63de8-38ac-41e4-8c77-0cdddb57b631-kube-api-access-s9dj4\") on node \"crc\" DevicePath \"\"" Dec 05 12:08:51 crc kubenswrapper[4763]: I1205 12:08:51.613425 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d65f699f-qx8fd" Dec 05 12:08:51 crc kubenswrapper[4763]: I1205 12:08:51.613868 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d65f699f-qx8fd" event={"ID":"68f63de8-38ac-41e4-8c77-0cdddb57b631","Type":"ContainerDied","Data":"991b797ae0e7c075115e7d0b7527fc850d091c2b951c939c44d9effb6b34646d"} Dec 05 12:08:51 crc kubenswrapper[4763]: I1205 12:08:51.613909 4763 scope.go:117] "RemoveContainer" containerID="f5d935368097de481342626f23a8da9508335c4824129e823d0bd0f7688a9759" Dec 05 12:08:51 crc kubenswrapper[4763]: I1205 12:08:51.615958 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68f63de8-38ac-41e4-8c77-0cdddb57b631-config" (OuterVolumeSpecName: "config") pod "68f63de8-38ac-41e4-8c77-0cdddb57b631" (UID: "68f63de8-38ac-41e4-8c77-0cdddb57b631"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:08:51 crc kubenswrapper[4763]: E1205 12:08:51.628535 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"swift-ring-rebalance\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified\\\"\"" pod="openstack/swift-ring-rebalance-pzxb5" podUID="a38e41f6-6247-4c91-abba-0bc65d1c2127" Dec 05 12:08:51 crc kubenswrapper[4763]: I1205 12:08:51.638527 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68f63de8-38ac-41e4-8c77-0cdddb57b631-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "68f63de8-38ac-41e4-8c77-0cdddb57b631" (UID: "68f63de8-38ac-41e4-8c77-0cdddb57b631"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:08:51 crc kubenswrapper[4763]: I1205 12:08:51.692204 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68f63de8-38ac-41e4-8c77-0cdddb57b631-config\") on node \"crc\" DevicePath \"\"" Dec 05 12:08:51 crc kubenswrapper[4763]: I1205 12:08:51.692236 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68f63de8-38ac-41e4-8c77-0cdddb57b631-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 12:08:51 crc kubenswrapper[4763]: I1205 12:08:51.705557 4763 scope.go:117] "RemoveContainer" containerID="d9225f3507077d4505fdbdad764f2c5bfe8984a1f38515e57c9d73ed78b5c4c3" Dec 05 12:08:51 crc kubenswrapper[4763]: I1205 12:08:51.938100 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d65f699f-qx8fd"] Dec 05 12:08:51 crc kubenswrapper[4763]: I1205 12:08:51.946934 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d65f699f-qx8fd"] Dec 05 12:08:52 crc kubenswrapper[4763]: I1205 12:08:52.100007 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1851124e-2722-4628-8e5b-63edb828d64a-etc-swift\") pod \"swift-storage-0\" (UID: \"1851124e-2722-4628-8e5b-63edb828d64a\") " pod="openstack/swift-storage-0" Dec 05 12:08:52 crc kubenswrapper[4763]: E1205 12:08:52.100181 4763 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 12:08:52 crc kubenswrapper[4763]: E1205 12:08:52.100195 4763 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 12:08:52 crc kubenswrapper[4763]: E1205 12:08:52.100265 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1851124e-2722-4628-8e5b-63edb828d64a-etc-swift podName:1851124e-2722-4628-8e5b-63edb828d64a nodeName:}" failed. No retries permitted until 2025-12-05 12:09:08.100250539 +0000 UTC m=+1232.592965262 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1851124e-2722-4628-8e5b-63edb828d64a-etc-swift") pod "swift-storage-0" (UID: "1851124e-2722-4628-8e5b-63edb828d64a") : configmap "swift-ring-files" not found Dec 05 12:08:52 crc kubenswrapper[4763]: I1205 12:08:52.263985 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-s9b6d"] Dec 05 12:08:52 crc kubenswrapper[4763]: I1205 12:08:52.274976 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-phxtr"] Dec 05 12:08:52 crc kubenswrapper[4763]: I1205 12:08:52.287943 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-j9qhw"] Dec 05 12:08:52 crc kubenswrapper[4763]: I1205 12:08:52.303128 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-37d5-account-create-update-d25r6"] Dec 05 12:08:52 crc kubenswrapper[4763]: I1205 12:08:52.313694 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-fc84-account-create-update-7nn47"] Dec 05 12:08:52 crc kubenswrapper[4763]: I1205 12:08:52.325636 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-6476-account-create-update-g45vx"] Dec 05 12:08:52 crc kubenswrapper[4763]: I1205 12:08:52.339835 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-01df-account-create-update-9485v"] Dec 05 12:08:52 crc kubenswrapper[4763]: I1205 12:08:52.353817 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-vxwjr"] Dec 05 12:08:52 crc kubenswrapper[4763]: I1205 12:08:52.621046 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"342a4872-4478-4b3a-a984-7fd457348435","Type":"ContainerStarted","Data":"1a787be5d24ee0521b1c21b4227dad5d8d4cc5fa74f51d2fcc376b30b77fa118"} Dec 05 12:08:52 crc kubenswrapper[4763]: I1205 12:08:52.622178 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-37d5-account-create-update-d25r6" event={"ID":"3ad4cf77-6dbb-4cd3-b404-01f3d5752403","Type":"ContainerStarted","Data":"f39dd11b4886f9384c02e1a8ed2f44ede52eca51c24534d6e5cfd5be28ece024"} Dec 05 12:08:52 crc kubenswrapper[4763]: I1205 12:08:52.623221 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-6476-account-create-update-g45vx" event={"ID":"752707c1-f306-4d60-bd81-7c77a2df4e4f","Type":"ContainerStarted","Data":"ea57a668f98179a896498905d92424375d1700e049821cc5c94c286c48057cb0"} Dec 05 12:08:52 crc kubenswrapper[4763]: I1205 12:08:52.623952 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-s9b6d" event={"ID":"8e8af298-3905-4251-b35c-77f7a535aafb","Type":"ContainerStarted","Data":"da470bdfd573bd7fb298b01f1edd33531db0b4f24f1a23662bb647c3ffd854bb"} Dec 05 12:08:52 crc kubenswrapper[4763]: I1205 12:08:52.624941 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-fc84-account-create-update-7nn47" event={"ID":"ad6e2524-1d54-4e4e-834d-2176cb504743","Type":"ContainerStarted","Data":"1cba2b11f66a45ea4329b480735eccb521fb2a2fb2173d5001acdccba4eb2155"} Dec 05 12:08:52 crc kubenswrapper[4763]: I1205 12:08:52.626294 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-phxtr" event={"ID":"c72e8b76-2c58-49de-af50-45474900f16f","Type":"ContainerStarted","Data":"7a7518f0db19b1c449d993b9fa7b0c998e165de024016053ee30a7597e7e05f2"} Dec 05 12:08:52 crc kubenswrapper[4763]: I1205 
Dec 05 12:08:52 crc kubenswrapper[4763]: I1205 12:08:52.630039 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-j9qhw" event={"ID":"e09da9a3-6f2b-4b62-8953-acb7ea6a258c","Type":"ContainerStarted","Data":"079c0a95341e5ecf093d07c653998c51d39ed4171530bb8b85c875f83079766e"}
Dec 05 12:08:52 crc kubenswrapper[4763]: I1205 12:08:52.631247 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-01df-account-create-update-9485v" event={"ID":"b6b89b04-af02-4afb-bdb4-99c97bcfc9e0","Type":"ContainerStarted","Data":"6d33faaf071f1fd39e93ba598de8d4a887d174d0718a9ceefcb537bbecfca685"}
Dec 05 12:08:53 crc kubenswrapper[4763]: E1205 12:08:53.110661 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-northd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-northd-0" podUID="f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c"
Dec 05 12:08:53 crc kubenswrapper[4763]: I1205 12:08:53.643888 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-j9qhw" event={"ID":"e09da9a3-6f2b-4b62-8953-acb7ea6a258c","Type":"ContainerStarted","Data":"21fa3b64b385872573ce93d0868fee7e62e4de13d2252ff5ec4066471e7743a5"}
Dec 05 12:08:53 crc kubenswrapper[4763]: I1205 12:08:53.647711 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"85c70640-8bf7-419d-a96f-69ac3278710c","Type":"ContainerStarted","Data":"a6d6aef1f2859e2fa062d3ea2ad9ed014fe71beeda34557c6f7694d8291c17f1"}
Dec 05 12:08:53 crc kubenswrapper[4763]: I1205 12:08:53.648554 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Dec 05 12:08:53 crc kubenswrapper[4763]: I1205 12:08:53.651093 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c","Type":"ContainerStarted","Data":"9a290f1fe289b65a779e9198345f81ddbec9e946d3e693898bd529e2c6a499d9"}
Dec 05 12:08:53 crc kubenswrapper[4763]: I1205 12:08:53.653150 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-37d5-account-create-update-d25r6" event={"ID":"3ad4cf77-6dbb-4cd3-b404-01f3d5752403","Type":"ContainerStarted","Data":"201149cd29cb763761c5882b79c4b30e8667a3b48f04fb8e287ba4a84cd99a61"}
Dec 05 12:08:53 crc kubenswrapper[4763]: E1205 12:08:53.654280 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-northd\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified\\\"\"" pod="openstack/ovn-northd-0" podUID="f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c"
Dec 05 12:08:53 crc kubenswrapper[4763]: I1205 12:08:53.655531 4763 generic.go:334] "Generic (PLEG): container finished" podID="c72e8b76-2c58-49de-af50-45474900f16f" containerID="abb865d962bbe48de053d972465a8cc2d32cc4b9093d72fd8daf4004e00b2abf" exitCode=0
Dec 05 12:08:53 crc kubenswrapper[4763]: I1205 12:08:53.655654 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-phxtr" event={"ID":"c72e8b76-2c58-49de-af50-45474900f16f","Type":"ContainerDied","Data":"abb865d962bbe48de053d972465a8cc2d32cc4b9093d72fd8daf4004e00b2abf"}
Dec 05 12:08:53 crc kubenswrapper[4763]: I1205 12:08:53.659715 4763 generic.go:334] "Generic (PLEG): container finished" podID="752707c1-f306-4d60-bd81-7c77a2df4e4f" containerID="b37db4a707dd304747aa3e3a87a820f4ca28a2adfc885c6c0c7414daff659a18" exitCode=0
Dec 05 12:08:53 crc kubenswrapper[4763]: I1205 12:08:53.659801 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-6476-account-create-update-g45vx" event={"ID":"752707c1-f306-4d60-bd81-7c77a2df4e4f","Type":"ContainerDied","Data":"b37db4a707dd304747aa3e3a87a820f4ca28a2adfc885c6c0c7414daff659a18"}
Dec 05 12:08:53 crc kubenswrapper[4763]: I1205 12:08:53.663727 4763 generic.go:334] "Generic (PLEG): container finished" podID="8e8af298-3905-4251-b35c-77f7a535aafb" containerID="5efa931585839bf4c48d07944cf9cd309aff2021e9aa21607020150e540929cf" exitCode=0
Dec 05 12:08:53 crc kubenswrapper[4763]: I1205 12:08:53.663972 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-s9b6d" event={"ID":"8e8af298-3905-4251-b35c-77f7a535aafb","Type":"ContainerDied","Data":"5efa931585839bf4c48d07944cf9cd309aff2021e9aa21607020150e540929cf"}
Dec 05 12:08:53 crc kubenswrapper[4763]: I1205 12:08:53.667457 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-vxwjr" event={"ID":"74cdf9d1-2d3a-4822-8353-508112d2bf7d","Type":"ContainerStarted","Data":"3d75b35f9f557f50be0c25d534acfcb5408c8076f7f8ed83462a0d41dcdf6fe9"}
Dec 05 12:08:53 crc kubenswrapper[4763]: I1205 12:08:53.667894 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-j9qhw" podStartSLOduration=11.667870361 podStartE2EDuration="11.667870361s" podCreationTimestamp="2025-12-05 12:08:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:08:53.665527903 +0000 UTC m=+1218.158242626" watchObservedRunningTime="2025-12-05 12:08:53.667870361 +0000 UTC m=+1218.160585104"
Dec 05 12:08:53 crc kubenswrapper[4763]: I1205 12:08:53.669951 4763 generic.go:334] "Generic (PLEG): container finished" podID="ad6e2524-1d54-4e4e-834d-2176cb504743" containerID="83b5ac1a78712d269f1d141bf38dc6972407fa66b10e8a8e74a3c544f116da69" exitCode=0
Dec 05 12:08:53 crc kubenswrapper[4763]: I1205 12:08:53.670032 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-fc84-account-create-update-7nn47" event={"ID":"ad6e2524-1d54-4e4e-834d-2176cb504743","Type":"ContainerDied","Data":"83b5ac1a78712d269f1d141bf38dc6972407fa66b10e8a8e74a3c544f116da69"}
Dec 05 12:08:53 crc kubenswrapper[4763]: I1205 12:08:53.678420 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-01df-account-create-update-9485v" event={"ID":"b6b89b04-af02-4afb-bdb4-99c97bcfc9e0","Type":"ContainerStarted","Data":"c44db6b55f8f94c4b0660d5143fadba09775e0a72ef44566ee26011bd4287cd8"}
Dec 05 12:08:53 crc kubenswrapper[4763]: I1205 12:08:53.678507 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Dec 05 12:08:53 crc kubenswrapper[4763]: I1205 12:08:53.695104 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=40.315672575 podStartE2EDuration="1m5.69508073s" podCreationTimestamp="2025-12-05 12:07:48 +0000 UTC" firstStartedPulling="2025-12-05 12:07:50.276729979 +0000 UTC m=+1154.769444702" lastFinishedPulling="2025-12-05 12:08:15.656138134 +0000 UTC m=+1180.148852857" observedRunningTime="2025-12-05 12:08:53.693005501 +0000 UTC m=+1218.185720234" watchObservedRunningTime="2025-12-05 12:08:53.69508073 +0000 UTC m=+1218.187795463"
Dec 05 12:08:53 crc kubenswrapper[4763]: I1205 12:08:53.741452 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-37d5-account-create-update-d25r6" podStartSLOduration=11.74143646 podStartE2EDuration="11.74143646s" podCreationTimestamp="2025-12-05 12:08:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:08:53.737398485 +0000 UTC m=+1218.230113228" watchObservedRunningTime="2025-12-05 12:08:53.74143646 +0000 UTC m=+1218.234151183"
Dec 05 12:08:53 crc kubenswrapper[4763]: I1205 12:08:53.800151 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68f63de8-38ac-41e4-8c77-0cdddb57b631" path="/var/lib/kubelet/pods/68f63de8-38ac-41e4-8c77-0cdddb57b631/volumes"
Dec 05 12:08:53 crc kubenswrapper[4763]: I1205 12:08:53.818485 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=42.054228654 podStartE2EDuration="1m6.818466225s" podCreationTimestamp="2025-12-05 12:07:47 +0000 UTC" firstStartedPulling="2025-12-05 12:07:50.006507657 +0000 UTC m=+1154.499222380" lastFinishedPulling="2025-12-05 12:08:14.770745158 +0000 UTC m=+1179.263459951" observedRunningTime="2025-12-05 12:08:53.812659411 +0000 UTC m=+1218.305374154" watchObservedRunningTime="2025-12-05 12:08:53.818466225 +0000 UTC m=+1218.311180948"
Dec 05 12:08:53 crc kubenswrapper[4763]: I1205 12:08:53.839490 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-01df-account-create-update-9485v" podStartSLOduration=11.839472627 podStartE2EDuration="11.839472627s" podCreationTimestamp="2025-12-05 12:08:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:08:53.833827899 +0000 UTC m=+1218.326542622" watchObservedRunningTime="2025-12-05 12:08:53.839472627 +0000 UTC m=+1218.332187360"
Dec 05 12:08:54 crc kubenswrapper[4763]: I1205 12:08:54.696397 4763 generic.go:334] "Generic (PLEG): container finished" podID="3ad4cf77-6dbb-4cd3-b404-01f3d5752403" containerID="201149cd29cb763761c5882b79c4b30e8667a3b48f04fb8e287ba4a84cd99a61" exitCode=0
Dec 05 12:08:54 crc kubenswrapper[4763]: I1205 12:08:54.696968 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-37d5-account-create-update-d25r6" event={"ID":"3ad4cf77-6dbb-4cd3-b404-01f3d5752403","Type":"ContainerDied","Data":"201149cd29cb763761c5882b79c4b30e8667a3b48f04fb8e287ba4a84cd99a61"}
Dec 05 12:08:54 crc kubenswrapper[4763]: I1205 12:08:54.699257 4763 generic.go:334] "Generic (PLEG): container finished" podID="74cdf9d1-2d3a-4822-8353-508112d2bf7d" containerID="3d75b35f9f557f50be0c25d534acfcb5408c8076f7f8ed83462a0d41dcdf6fe9" exitCode=0
Dec 05 12:08:54 crc kubenswrapper[4763]: I1205 12:08:54.699303 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-vxwjr" event={"ID":"74cdf9d1-2d3a-4822-8353-508112d2bf7d","Type":"ContainerDied","Data":"3d75b35f9f557f50be0c25d534acfcb5408c8076f7f8ed83462a0d41dcdf6fe9"}
Dec 05 12:08:54 crc kubenswrapper[4763]: I1205 12:08:54.701736 4763 generic.go:334] "Generic (PLEG): container finished" podID="e09da9a3-6f2b-4b62-8953-acb7ea6a258c" containerID="21fa3b64b385872573ce93d0868fee7e62e4de13d2252ff5ec4066471e7743a5" exitCode=0
Dec 05 12:08:54 crc kubenswrapper[4763]: I1205 12:08:54.701798 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-j9qhw" event={"ID":"e09da9a3-6f2b-4b62-8953-acb7ea6a258c","Type":"ContainerDied","Data":"21fa3b64b385872573ce93d0868fee7e62e4de13d2252ff5ec4066471e7743a5"}
Dec 05 12:08:54 crc kubenswrapper[4763]: I1205 12:08:54.704291 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc","Type":"ContainerStarted","Data":"4ca686e20e5eb1bf421130ea665b4cf9f3e9ae722261772a65c247dd39e26c24"}
Dec 05 12:08:54 crc kubenswrapper[4763]: I1205 12:08:54.706513 4763 generic.go:334] "Generic (PLEG): container finished" podID="b6b89b04-af02-4afb-bdb4-99c97bcfc9e0" containerID="c44db6b55f8f94c4b0660d5143fadba09775e0a72ef44566ee26011bd4287cd8" exitCode=0
Dec 05 12:08:54 crc kubenswrapper[4763]: I1205 12:08:54.706613 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-01df-account-create-update-9485v" event={"ID":"b6b89b04-af02-4afb-bdb4-99c97bcfc9e0","Type":"ContainerDied","Data":"c44db6b55f8f94c4b0660d5143fadba09775e0a72ef44566ee26011bd4287cd8"}
Dec 05 12:08:54 crc kubenswrapper[4763]: E1205 12:08:54.710276 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-northd\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified\\\"\"" pod="openstack/ovn-northd-0" podUID="f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c"
Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.234312 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-fc84-account-create-update-7nn47"
Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.356825 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9crd\" (UniqueName: \"kubernetes.io/projected/ad6e2524-1d54-4e4e-834d-2176cb504743-kube-api-access-d9crd\") pod \"ad6e2524-1d54-4e4e-834d-2176cb504743\" (UID: \"ad6e2524-1d54-4e4e-834d-2176cb504743\") "
Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.356931 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad6e2524-1d54-4e4e-834d-2176cb504743-operator-scripts\") pod \"ad6e2524-1d54-4e4e-834d-2176cb504743\" (UID: \"ad6e2524-1d54-4e4e-834d-2176cb504743\") "
Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.359060 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad6e2524-1d54-4e4e-834d-2176cb504743-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ad6e2524-1d54-4e4e-834d-2176cb504743" (UID: "ad6e2524-1d54-4e4e-834d-2176cb504743"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.371152 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad6e2524-1d54-4e4e-834d-2176cb504743-kube-api-access-d9crd" (OuterVolumeSpecName: "kube-api-access-d9crd") pod "ad6e2524-1d54-4e4e-834d-2176cb504743" (UID: "ad6e2524-1d54-4e4e-834d-2176cb504743"). InnerVolumeSpecName "kube-api-access-d9crd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.397866 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-vxwjr"
Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.401399 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-phxtr"
Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.412461 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-6476-account-create-update-g45vx"
Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.445204 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-s9b6d"
Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.459853 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9crd\" (UniqueName: \"kubernetes.io/projected/ad6e2524-1d54-4e4e-834d-2176cb504743-kube-api-access-d9crd\") on node \"crc\" DevicePath \"\""
Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.459891 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad6e2524-1d54-4e4e-834d-2176cb504743-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.561043 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74cdf9d1-2d3a-4822-8353-508112d2bf7d-operator-scripts\") pod \"74cdf9d1-2d3a-4822-8353-508112d2bf7d\" (UID: \"74cdf9d1-2d3a-4822-8353-508112d2bf7d\") "
Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.561097 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c72e8b76-2c58-49de-af50-45474900f16f-operator-scripts\") pod \"c72e8b76-2c58-49de-af50-45474900f16f\" (UID: \"c72e8b76-2c58-49de-af50-45474900f16f\") "
Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.561302 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qz49\" (UniqueName: \"kubernetes.io/projected/74cdf9d1-2d3a-4822-8353-508112d2bf7d-kube-api-access-7qz49\") pod \"74cdf9d1-2d3a-4822-8353-508112d2bf7d\" (UID: \"74cdf9d1-2d3a-4822-8353-508112d2bf7d\") "
Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.561348 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lw4sv\" (UniqueName: \"kubernetes.io/projected/8e8af298-3905-4251-b35c-77f7a535aafb-kube-api-access-lw4sv\") pod \"8e8af298-3905-4251-b35c-77f7a535aafb\" (UID: \"8e8af298-3905-4251-b35c-77f7a535aafb\") "
Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.561408 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e8af298-3905-4251-b35c-77f7a535aafb-operator-scripts\") pod \"8e8af298-3905-4251-b35c-77f7a535aafb\" (UID: \"8e8af298-3905-4251-b35c-77f7a535aafb\") "
Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.561497 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/752707c1-f306-4d60-bd81-7c77a2df4e4f-operator-scripts\") pod \"752707c1-f306-4d60-bd81-7c77a2df4e4f\" (UID: \"752707c1-f306-4d60-bd81-7c77a2df4e4f\") "
Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.561544 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vd4xc\" (UniqueName: \"kubernetes.io/projected/752707c1-f306-4d60-bd81-7c77a2df4e4f-kube-api-access-vd4xc\") pod \"752707c1-f306-4d60-bd81-7c77a2df4e4f\" (UID: \"752707c1-f306-4d60-bd81-7c77a2df4e4f\") "
Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.561578 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74cdf9d1-2d3a-4822-8353-508112d2bf7d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "74cdf9d1-2d3a-4822-8353-508112d2bf7d" (UID: "74cdf9d1-2d3a-4822-8353-508112d2bf7d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.561595 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcggl\" (UniqueName: \"kubernetes.io/projected/c72e8b76-2c58-49de-af50-45474900f16f-kube-api-access-wcggl\") pod \"c72e8b76-2c58-49de-af50-45474900f16f\" (UID: \"c72e8b76-2c58-49de-af50-45474900f16f\") "
Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.561690 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c72e8b76-2c58-49de-af50-45474900f16f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c72e8b76-2c58-49de-af50-45474900f16f" (UID: "c72e8b76-2c58-49de-af50-45474900f16f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.562143 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e8af298-3905-4251-b35c-77f7a535aafb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8e8af298-3905-4251-b35c-77f7a535aafb" (UID: "8e8af298-3905-4251-b35c-77f7a535aafb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.562318 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/752707c1-f306-4d60-bd81-7c77a2df4e4f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "752707c1-f306-4d60-bd81-7c77a2df4e4f" (UID: "752707c1-f306-4d60-bd81-7c77a2df4e4f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.562629 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e8af298-3905-4251-b35c-77f7a535aafb-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.562665 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/752707c1-f306-4d60-bd81-7c77a2df4e4f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.562679 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74cdf9d1-2d3a-4822-8353-508112d2bf7d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.562691 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c72e8b76-2c58-49de-af50-45474900f16f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.565038 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74cdf9d1-2d3a-4822-8353-508112d2bf7d-kube-api-access-7qz49" (OuterVolumeSpecName: "kube-api-access-7qz49") pod "74cdf9d1-2d3a-4822-8353-508112d2bf7d" (UID: "74cdf9d1-2d3a-4822-8353-508112d2bf7d"). InnerVolumeSpecName "kube-api-access-7qz49". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.565638 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/752707c1-f306-4d60-bd81-7c77a2df4e4f-kube-api-access-vd4xc" (OuterVolumeSpecName: "kube-api-access-vd4xc") pod "752707c1-f306-4d60-bd81-7c77a2df4e4f" (UID: "752707c1-f306-4d60-bd81-7c77a2df4e4f"). InnerVolumeSpecName "kube-api-access-vd4xc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.565732 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e8af298-3905-4251-b35c-77f7a535aafb-kube-api-access-lw4sv" (OuterVolumeSpecName: "kube-api-access-lw4sv") pod "8e8af298-3905-4251-b35c-77f7a535aafb" (UID: "8e8af298-3905-4251-b35c-77f7a535aafb"). InnerVolumeSpecName "kube-api-access-lw4sv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.566493 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c72e8b76-2c58-49de-af50-45474900f16f-kube-api-access-wcggl" (OuterVolumeSpecName: "kube-api-access-wcggl") pod "c72e8b76-2c58-49de-af50-45474900f16f" (UID: "c72e8b76-2c58-49de-af50-45474900f16f"). InnerVolumeSpecName "kube-api-access-wcggl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.663992 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qz49\" (UniqueName: \"kubernetes.io/projected/74cdf9d1-2d3a-4822-8353-508112d2bf7d-kube-api-access-7qz49\") on node \"crc\" DevicePath \"\"" Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.664028 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lw4sv\" (UniqueName: \"kubernetes.io/projected/8e8af298-3905-4251-b35c-77f7a535aafb-kube-api-access-lw4sv\") on node \"crc\" DevicePath \"\"" Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.664042 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vd4xc\" (UniqueName: \"kubernetes.io/projected/752707c1-f306-4d60-bd81-7c77a2df4e4f-kube-api-access-vd4xc\") on node \"crc\" DevicePath \"\"" Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.664053 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcggl\" (UniqueName: \"kubernetes.io/projected/c72e8b76-2c58-49de-af50-45474900f16f-kube-api-access-wcggl\") on node \"crc\" DevicePath \"\"" Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.715571 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-6476-account-create-update-g45vx" event={"ID":"752707c1-f306-4d60-bd81-7c77a2df4e4f","Type":"ContainerDied","Data":"ea57a668f98179a896498905d92424375d1700e049821cc5c94c286c48057cb0"} Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.715621 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea57a668f98179a896498905d92424375d1700e049821cc5c94c286c48057cb0" Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.715657 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-6476-account-create-update-g45vx" Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.717224 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-s9b6d" Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.717223 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-s9b6d" event={"ID":"8e8af298-3905-4251-b35c-77f7a535aafb","Type":"ContainerDied","Data":"da470bdfd573bd7fb298b01f1edd33531db0b4f24f1a23662bb647c3ffd854bb"} Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.717295 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da470bdfd573bd7fb298b01f1edd33531db0b4f24f1a23662bb647c3ffd854bb" Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.718825 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-vxwjr" event={"ID":"74cdf9d1-2d3a-4822-8353-508112d2bf7d","Type":"ContainerDied","Data":"1375c23e2222a03ef9ba97ddb5779c58ab87d7eb23e4f303d5e3b203b99e4e78"} Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.718863 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1375c23e2222a03ef9ba97ddb5779c58ab87d7eb23e4f303d5e3b203b99e4e78" Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.718866 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-vxwjr" Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.720793 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-fc84-account-create-update-7nn47" event={"ID":"ad6e2524-1d54-4e4e-834d-2176cb504743","Type":"ContainerDied","Data":"1cba2b11f66a45ea4329b480735eccb521fb2a2fb2173d5001acdccba4eb2155"} Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.720819 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cba2b11f66a45ea4329b480735eccb521fb2a2fb2173d5001acdccba4eb2155" Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.720862 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-fc84-account-create-update-7nn47" Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.723233 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-phxtr" event={"ID":"c72e8b76-2c58-49de-af50-45474900f16f","Type":"ContainerDied","Data":"7a7518f0db19b1c449d993b9fa7b0c998e165de024016053ee30a7597e7e05f2"} Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.723266 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a7518f0db19b1c449d993b9fa7b0c998e165de024016053ee30a7597e7e05f2" Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.723340 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-phxtr" Dec 05 12:08:55 crc kubenswrapper[4763]: I1205 12:08:55.957421 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d65f699f-qx8fd" podUID="68f63de8-38ac-41e4-8c77-0cdddb57b631" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.116:5353: i/o timeout" Dec 05 12:08:56 crc kubenswrapper[4763]: I1205 12:08:56.088550 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-01df-account-create-update-9485v" Dec 05 12:08:56 crc kubenswrapper[4763]: I1205 12:08:56.178678 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfj22\" (UniqueName: \"kubernetes.io/projected/b6b89b04-af02-4afb-bdb4-99c97bcfc9e0-kube-api-access-mfj22\") pod \"b6b89b04-af02-4afb-bdb4-99c97bcfc9e0\" (UID: \"b6b89b04-af02-4afb-bdb4-99c97bcfc9e0\") " Dec 05 12:08:56 crc kubenswrapper[4763]: I1205 12:08:56.178939 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6b89b04-af02-4afb-bdb4-99c97bcfc9e0-operator-scripts\") pod \"b6b89b04-af02-4afb-bdb4-99c97bcfc9e0\" (UID: \"b6b89b04-af02-4afb-bdb4-99c97bcfc9e0\") " Dec 05 12:08:56 crc kubenswrapper[4763]: I1205 12:08:56.180657 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6b89b04-af02-4afb-bdb4-99c97bcfc9e0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b6b89b04-af02-4afb-bdb4-99c97bcfc9e0" (UID: "b6b89b04-af02-4afb-bdb4-99c97bcfc9e0"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:08:56 crc kubenswrapper[4763]: I1205 12:08:56.181977 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6b89b04-af02-4afb-bdb4-99c97bcfc9e0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:08:56 crc kubenswrapper[4763]: I1205 12:08:56.183304 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6b89b04-af02-4afb-bdb4-99c97bcfc9e0-kube-api-access-mfj22" (OuterVolumeSpecName: "kube-api-access-mfj22") pod "b6b89b04-af02-4afb-bdb4-99c97bcfc9e0" (UID: "b6b89b04-af02-4afb-bdb4-99c97bcfc9e0"). InnerVolumeSpecName "kube-api-access-mfj22". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:08:56 crc kubenswrapper[4763]: I1205 12:08:56.240836 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-37d5-account-create-update-d25r6" Dec 05 12:08:56 crc kubenswrapper[4763]: I1205 12:08:56.246859 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-j9qhw" Dec 05 12:08:56 crc kubenswrapper[4763]: I1205 12:08:56.284187 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfj22\" (UniqueName: \"kubernetes.io/projected/b6b89b04-af02-4afb-bdb4-99c97bcfc9e0-kube-api-access-mfj22\") on node \"crc\" DevicePath \"\"" Dec 05 12:08:56 crc kubenswrapper[4763]: I1205 12:08:56.385593 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e09da9a3-6f2b-4b62-8953-acb7ea6a258c-operator-scripts\") pod \"e09da9a3-6f2b-4b62-8953-acb7ea6a258c\" (UID: \"e09da9a3-6f2b-4b62-8953-acb7ea6a258c\") " Dec 05 12:08:56 crc kubenswrapper[4763]: I1205 12:08:56.385657 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ffhq\" (UniqueName: \"kubernetes.io/projected/3ad4cf77-6dbb-4cd3-b404-01f3d5752403-kube-api-access-5ffhq\") pod \"3ad4cf77-6dbb-4cd3-b404-01f3d5752403\" (UID: \"3ad4cf77-6dbb-4cd3-b404-01f3d5752403\") " Dec 05 12:08:56 crc kubenswrapper[4763]: I1205 12:08:56.385810 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9mnx\" (UniqueName: \"kubernetes.io/projected/e09da9a3-6f2b-4b62-8953-acb7ea6a258c-kube-api-access-r9mnx\") pod \"e09da9a3-6f2b-4b62-8953-acb7ea6a258c\" (UID: \"e09da9a3-6f2b-4b62-8953-acb7ea6a258c\") " Dec 05 12:08:56 crc kubenswrapper[4763]: I1205 12:08:56.385871 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ad4cf77-6dbb-4cd3-b404-01f3d5752403-operator-scripts\") pod \"3ad4cf77-6dbb-4cd3-b404-01f3d5752403\" (UID: \"3ad4cf77-6dbb-4cd3-b404-01f3d5752403\") " Dec 05 12:08:56 crc kubenswrapper[4763]: I1205 12:08:56.386656 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ad4cf77-6dbb-4cd3-b404-01f3d5752403-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3ad4cf77-6dbb-4cd3-b404-01f3d5752403" (UID: "3ad4cf77-6dbb-4cd3-b404-01f3d5752403"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:08:56 crc kubenswrapper[4763]: I1205 12:08:56.388032 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e09da9a3-6f2b-4b62-8953-acb7ea6a258c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e09da9a3-6f2b-4b62-8953-acb7ea6a258c" (UID: "e09da9a3-6f2b-4b62-8953-acb7ea6a258c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:08:56 crc kubenswrapper[4763]: I1205 12:08:56.390062 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ad4cf77-6dbb-4cd3-b404-01f3d5752403-kube-api-access-5ffhq" (OuterVolumeSpecName: "kube-api-access-5ffhq") pod "3ad4cf77-6dbb-4cd3-b404-01f3d5752403" (UID: "3ad4cf77-6dbb-4cd3-b404-01f3d5752403"). InnerVolumeSpecName "kube-api-access-5ffhq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:08:56 crc kubenswrapper[4763]: I1205 12:08:56.390442 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e09da9a3-6f2b-4b62-8953-acb7ea6a258c-kube-api-access-r9mnx" (OuterVolumeSpecName: "kube-api-access-r9mnx") pod "e09da9a3-6f2b-4b62-8953-acb7ea6a258c" (UID: "e09da9a3-6f2b-4b62-8953-acb7ea6a258c"). InnerVolumeSpecName "kube-api-access-r9mnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:08:56 crc kubenswrapper[4763]: I1205 12:08:56.489116 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e09da9a3-6f2b-4b62-8953-acb7ea6a258c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:08:56 crc kubenswrapper[4763]: I1205 12:08:56.489158 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ffhq\" (UniqueName: \"kubernetes.io/projected/3ad4cf77-6dbb-4cd3-b404-01f3d5752403-kube-api-access-5ffhq\") on node \"crc\" DevicePath \"\"" Dec 05 12:08:56 crc kubenswrapper[4763]: I1205 12:08:56.489172 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9mnx\" (UniqueName: \"kubernetes.io/projected/e09da9a3-6f2b-4b62-8953-acb7ea6a258c-kube-api-access-r9mnx\") on node \"crc\" DevicePath \"\"" Dec 05 12:08:56 crc kubenswrapper[4763]: I1205 12:08:56.489185 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ad4cf77-6dbb-4cd3-b404-01f3d5752403-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:08:56 crc kubenswrapper[4763]: I1205 12:08:56.732173 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-01df-account-create-update-9485v" event={"ID":"b6b89b04-af02-4afb-bdb4-99c97bcfc9e0","Type":"ContainerDied","Data":"6d33faaf071f1fd39e93ba598de8d4a887d174d0718a9ceefcb537bbecfca685"} Dec 05 12:08:56 crc kubenswrapper[4763]: I1205 12:08:56.732212 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d33faaf071f1fd39e93ba598de8d4a887d174d0718a9ceefcb537bbecfca685" Dec 05 12:08:56 crc kubenswrapper[4763]: I1205 12:08:56.732865 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-01df-account-create-update-9485v" Dec 05 12:08:56 crc kubenswrapper[4763]: I1205 12:08:56.733664 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-37d5-account-create-update-d25r6" event={"ID":"3ad4cf77-6dbb-4cd3-b404-01f3d5752403","Type":"ContainerDied","Data":"f39dd11b4886f9384c02e1a8ed2f44ede52eca51c24534d6e5cfd5be28ece024"} Dec 05 12:08:56 crc kubenswrapper[4763]: I1205 12:08:56.733685 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-37d5-account-create-update-d25r6" Dec 05 12:08:56 crc kubenswrapper[4763]: I1205 12:08:56.733691 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f39dd11b4886f9384c02e1a8ed2f44ede52eca51c24534d6e5cfd5be28ece024" Dec 05 12:08:56 crc kubenswrapper[4763]: I1205 12:08:56.734845 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-j9qhw" event={"ID":"e09da9a3-6f2b-4b62-8953-acb7ea6a258c","Type":"ContainerDied","Data":"079c0a95341e5ecf093d07c653998c51d39ed4171530bb8b85c875f83079766e"} Dec 05 12:08:56 crc kubenswrapper[4763]: I1205 12:08:56.734865 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="079c0a95341e5ecf093d07c653998c51d39ed4171530bb8b85c875f83079766e" Dec 05 12:08:56 crc kubenswrapper[4763]: I1205 12:08:56.734890 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-j9qhw" Dec 05 12:08:56 crc kubenswrapper[4763]: I1205 12:08:56.808316 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-6gw4w" podUID="c9acbf99-ec01-4de6-9d45-418664511586" containerName="ovn-controller" probeResult="failure" output=< Dec 05 12:08:56 crc kubenswrapper[4763]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 05 12:08:56 crc kubenswrapper[4763]: > Dec 05 12:08:56 crc kubenswrapper[4763]: I1205 12:08:56.881602 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-dzkm7" Dec 05 12:08:56 crc kubenswrapper[4763]: I1205 12:08:56.882923 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-dzkm7" Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.121802 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-6gw4w-config-52xsk"] Dec 05 12:08:57 crc kubenswrapper[4763]: E1205 12:08:57.122451 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e8af298-3905-4251-b35c-77f7a535aafb" containerName="mariadb-database-create" Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.122621 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e8af298-3905-4251-b35c-77f7a535aafb" containerName="mariadb-database-create" Dec 05 12:08:57 crc kubenswrapper[4763]: E1205 12:08:57.122715 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68f63de8-38ac-41e4-8c77-0cdddb57b631" containerName="dnsmasq-dns" Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.122820 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="68f63de8-38ac-41e4-8c77-0cdddb57b631" containerName="dnsmasq-dns" Dec 05 12:08:57 crc kubenswrapper[4763]: E1205 12:08:57.122923 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c72e8b76-2c58-49de-af50-45474900f16f" containerName="mariadb-database-create" Dec 05 12:08:57 crc 
kubenswrapper[4763]: I1205 12:08:57.123001 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c72e8b76-2c58-49de-af50-45474900f16f" containerName="mariadb-database-create" Dec 05 12:08:57 crc kubenswrapper[4763]: E1205 12:08:57.123110 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68f63de8-38ac-41e4-8c77-0cdddb57b631" containerName="init" Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.123198 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="68f63de8-38ac-41e4-8c77-0cdddb57b631" containerName="init" Dec 05 12:08:57 crc kubenswrapper[4763]: E1205 12:08:57.123300 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74cdf9d1-2d3a-4822-8353-508112d2bf7d" containerName="mariadb-database-create" Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.123376 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="74cdf9d1-2d3a-4822-8353-508112d2bf7d" containerName="mariadb-database-create" Dec 05 12:08:57 crc kubenswrapper[4763]: E1205 12:08:57.123447 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e09da9a3-6f2b-4b62-8953-acb7ea6a258c" containerName="mariadb-database-create" Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.123512 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e09da9a3-6f2b-4b62-8953-acb7ea6a258c" containerName="mariadb-database-create" Dec 05 12:08:57 crc kubenswrapper[4763]: E1205 12:08:57.123567 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6b89b04-af02-4afb-bdb4-99c97bcfc9e0" containerName="mariadb-account-create-update" Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.123622 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6b89b04-af02-4afb-bdb4-99c97bcfc9e0" containerName="mariadb-account-create-update" Dec 05 12:08:57 crc kubenswrapper[4763]: E1205 12:08:57.123691 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad6e2524-1d54-4e4e-834d-2176cb504743" containerName="mariadb-account-create-update" Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.123796 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad6e2524-1d54-4e4e-834d-2176cb504743" containerName="mariadb-account-create-update" Dec 05 12:08:57 crc kubenswrapper[4763]: E1205 12:08:57.123935 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ad4cf77-6dbb-4cd3-b404-01f3d5752403" containerName="mariadb-account-create-update" Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.124040 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ad4cf77-6dbb-4cd3-b404-01f3d5752403" containerName="mariadb-account-create-update" Dec 05 12:08:57 crc kubenswrapper[4763]: E1205 12:08:57.124119 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="752707c1-f306-4d60-bd81-7c77a2df4e4f" containerName="mariadb-account-create-update" Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.124190 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="752707c1-f306-4d60-bd81-7c77a2df4e4f" containerName="mariadb-account-create-update" Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.124477 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad6e2524-1d54-4e4e-834d-2176cb504743" containerName="mariadb-account-create-update" Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.124569 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="752707c1-f306-4d60-bd81-7c77a2df4e4f" containerName="mariadb-account-create-update" Dec 05 12:08:57 crc 
Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.124667 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e09da9a3-6f2b-4b62-8953-acb7ea6a258c" containerName="mariadb-database-create"
Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.124781 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6b89b04-af02-4afb-bdb4-99c97bcfc9e0" containerName="mariadb-account-create-update"
Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.124874 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="c72e8b76-2c58-49de-af50-45474900f16f" containerName="mariadb-database-create"
Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.126983 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e8af298-3905-4251-b35c-77f7a535aafb" containerName="mariadb-database-create"
Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.127186 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ad4cf77-6dbb-4cd3-b404-01f3d5752403" containerName="mariadb-account-create-update"
Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.127293 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="68f63de8-38ac-41e4-8c77-0cdddb57b631" containerName="dnsmasq-dns"
Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.127375 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="74cdf9d1-2d3a-4822-8353-508112d2bf7d" containerName="mariadb-database-create"
Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.128123 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6gw4w-config-52xsk"
Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.129778 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.142782 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6gw4w-config-52xsk"]
Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.320482 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/92548acb-045e-4d21-a684-e52051972a6f-var-log-ovn\") pod \"ovn-controller-6gw4w-config-52xsk\" (UID: \"92548acb-045e-4d21-a684-e52051972a6f\") " pod="openstack/ovn-controller-6gw4w-config-52xsk"
Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.320945 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/92548acb-045e-4d21-a684-e52051972a6f-additional-scripts\") pod \"ovn-controller-6gw4w-config-52xsk\" (UID: \"92548acb-045e-4d21-a684-e52051972a6f\") " pod="openstack/ovn-controller-6gw4w-config-52xsk"
Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.321067 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92548acb-045e-4d21-a684-e52051972a6f-scripts\") pod \"ovn-controller-6gw4w-config-52xsk\" (UID: \"92548acb-045e-4d21-a684-e52051972a6f\") " pod="openstack/ovn-controller-6gw4w-config-52xsk"
Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.321098 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/92548acb-045e-4d21-a684-e52051972a6f-var-run\") pod \"ovn-controller-6gw4w-config-52xsk\" (UID: \"92548acb-045e-4d21-a684-e52051972a6f\") " pod="openstack/ovn-controller-6gw4w-config-52xsk"
Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.321126 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9td8\" (UniqueName: \"kubernetes.io/projected/92548acb-045e-4d21-a684-e52051972a6f-kube-api-access-w9td8\") pod \"ovn-controller-6gw4w-config-52xsk\" (UID: \"92548acb-045e-4d21-a684-e52051972a6f\") " pod="openstack/ovn-controller-6gw4w-config-52xsk"
Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.321152 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/92548acb-045e-4d21-a684-e52051972a6f-var-run-ovn\") pod \"ovn-controller-6gw4w-config-52xsk\" (UID: \"92548acb-045e-4d21-a684-e52051972a6f\") " pod="openstack/ovn-controller-6gw4w-config-52xsk"
Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.422787 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/92548acb-045e-4d21-a684-e52051972a6f-additional-scripts\") pod \"ovn-controller-6gw4w-config-52xsk\" (UID: \"92548acb-045e-4d21-a684-e52051972a6f\") " pod="openstack/ovn-controller-6gw4w-config-52xsk"
Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.422912 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92548acb-045e-4d21-a684-e52051972a6f-scripts\") pod \"ovn-controller-6gw4w-config-52xsk\" (UID: \"92548acb-045e-4d21-a684-e52051972a6f\") " pod="openstack/ovn-controller-6gw4w-config-52xsk"
Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.422932 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/92548acb-045e-4d21-a684-e52051972a6f-var-run\") pod \"ovn-controller-6gw4w-config-52xsk\" (UID: \"92548acb-045e-4d21-a684-e52051972a6f\") " pod="openstack/ovn-controller-6gw4w-config-52xsk"
Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.422952 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9td8\" (UniqueName: \"kubernetes.io/projected/92548acb-045e-4d21-a684-e52051972a6f-kube-api-access-w9td8\") pod \"ovn-controller-6gw4w-config-52xsk\" (UID: \"92548acb-045e-4d21-a684-e52051972a6f\") " pod="openstack/ovn-controller-6gw4w-config-52xsk"
Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.422974 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/92548acb-045e-4d21-a684-e52051972a6f-var-run-ovn\") pod \"ovn-controller-6gw4w-config-52xsk\" (UID: \"92548acb-045e-4d21-a684-e52051972a6f\") " pod="openstack/ovn-controller-6gw4w-config-52xsk"
Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.423005 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/92548acb-045e-4d21-a684-e52051972a6f-var-log-ovn\") pod \"ovn-controller-6gw4w-config-52xsk\" (UID: \"92548acb-045e-4d21-a684-e52051972a6f\") " pod="openstack/ovn-controller-6gw4w-config-52xsk"
Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.423360 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/92548acb-045e-4d21-a684-e52051972a6f-var-log-ovn\") pod \"ovn-controller-6gw4w-config-52xsk\" (UID: \"92548acb-045e-4d21-a684-e52051972a6f\") " pod="openstack/ovn-controller-6gw4w-config-52xsk"
Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.423367 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/92548acb-045e-4d21-a684-e52051972a6f-var-run\") pod \"ovn-controller-6gw4w-config-52xsk\" (UID: \"92548acb-045e-4d21-a684-e52051972a6f\") " pod="openstack/ovn-controller-6gw4w-config-52xsk"
Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.423674 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/92548acb-045e-4d21-a684-e52051972a6f-var-run-ovn\") pod \"ovn-controller-6gw4w-config-52xsk\" (UID: \"92548acb-045e-4d21-a684-e52051972a6f\") " pod="openstack/ovn-controller-6gw4w-config-52xsk"
Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.423845 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/92548acb-045e-4d21-a684-e52051972a6f-additional-scripts\") pod \"ovn-controller-6gw4w-config-52xsk\" (UID: \"92548acb-045e-4d21-a684-e52051972a6f\") " pod="openstack/ovn-controller-6gw4w-config-52xsk"
Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.430368 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92548acb-045e-4d21-a684-e52051972a6f-scripts\") pod \"ovn-controller-6gw4w-config-52xsk\" (UID: \"92548acb-045e-4d21-a684-e52051972a6f\") " pod="openstack/ovn-controller-6gw4w-config-52xsk"
Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.441788 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9td8\" (UniqueName: \"kubernetes.io/projected/92548acb-045e-4d21-a684-e52051972a6f-kube-api-access-w9td8\") pod \"ovn-controller-6gw4w-config-52xsk\" (UID: \"92548acb-045e-4d21-a684-e52051972a6f\") " pod="openstack/ovn-controller-6gw4w-config-52xsk"
Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.507285 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6gw4w-config-52xsk"
Need to start a new one" pod="openstack/ovn-controller-6gw4w-config-52xsk" Dec 05 12:08:57 crc kubenswrapper[4763]: E1205 12:08:57.677228 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/prometheus-metric-storage-0" podUID="9d77e8a1-26ef-4525-b427-0a29a9b7a0fc" Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.751254 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc","Type":"ContainerStarted","Data":"eb621e9277be1610b485a0015c8f75be9dd554027d30b32d46abff3571cc4c29"} Dec 05 12:08:57 crc kubenswrapper[4763]: E1205 12:08:57.752720 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:17ea20be390a94ab39f5cdd7f0cbc2498046eebcf77fe3dec9aa288d5c2cf46b\\\"\"" pod="openstack/prometheus-metric-storage-0" podUID="9d77e8a1-26ef-4525-b427-0a29a9b7a0fc" Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.961979 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6gw4w-config-52xsk"] Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.976742 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-vh57r"] Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.980622 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-vh57r" Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.986183 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.986316 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-5tbdp" Dec 05 12:08:57 crc kubenswrapper[4763]: I1205 12:08:57.993709 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-vh57r"] Dec 05 12:08:58 crc kubenswrapper[4763]: I1205 12:08:58.135044 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0af45fe0-0c3c-4394-82fb-334e1f6e7cb1-config-data\") pod \"glance-db-sync-vh57r\" (UID: \"0af45fe0-0c3c-4394-82fb-334e1f6e7cb1\") " pod="openstack/glance-db-sync-vh57r" Dec 05 12:08:58 crc kubenswrapper[4763]: I1205 12:08:58.135351 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0af45fe0-0c3c-4394-82fb-334e1f6e7cb1-combined-ca-bundle\") pod \"glance-db-sync-vh57r\" (UID: \"0af45fe0-0c3c-4394-82fb-334e1f6e7cb1\") " pod="openstack/glance-db-sync-vh57r" Dec 05 12:08:58 crc kubenswrapper[4763]: I1205 12:08:58.135425 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dbp8\" (UniqueName: \"kubernetes.io/projected/0af45fe0-0c3c-4394-82fb-334e1f6e7cb1-kube-api-access-7dbp8\") pod \"glance-db-sync-vh57r\" (UID: \"0af45fe0-0c3c-4394-82fb-334e1f6e7cb1\") " pod="openstack/glance-db-sync-vh57r" Dec 05 12:08:58 crc kubenswrapper[4763]: I1205 12:08:58.135541 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0af45fe0-0c3c-4394-82fb-334e1f6e7cb1-db-sync-config-data\") pod \"glance-db-sync-vh57r\" (UID: \"0af45fe0-0c3c-4394-82fb-334e1f6e7cb1\") " pod="openstack/glance-db-sync-vh57r" Dec 05 12:08:58 crc kubenswrapper[4763]: I1205 12:08:58.238790 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0af45fe0-0c3c-4394-82fb-334e1f6e7cb1-db-sync-config-data\") pod \"glance-db-sync-vh57r\" (UID: \"0af45fe0-0c3c-4394-82fb-334e1f6e7cb1\") " pod="openstack/glance-db-sync-vh57r" Dec 05 12:08:58 crc kubenswrapper[4763]: I1205 12:08:58.238877 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0af45fe0-0c3c-4394-82fb-334e1f6e7cb1-config-data\") pod \"glance-db-sync-vh57r\" (UID: \"0af45fe0-0c3c-4394-82fb-334e1f6e7cb1\") " pod="openstack/glance-db-sync-vh57r" Dec 05 12:08:58 crc kubenswrapper[4763]: I1205 12:08:58.238900 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0af45fe0-0c3c-4394-82fb-334e1f6e7cb1-combined-ca-bundle\") pod \"glance-db-sync-vh57r\" (UID: \"0af45fe0-0c3c-4394-82fb-334e1f6e7cb1\") " pod="openstack/glance-db-sync-vh57r" Dec 05 12:08:58 crc kubenswrapper[4763]: I1205 12:08:58.238950 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dbp8\" (UniqueName: \"kubernetes.io/projected/0af45fe0-0c3c-4394-82fb-334e1f6e7cb1-kube-api-access-7dbp8\") pod \"glance-db-sync-vh57r\" (UID: \"0af45fe0-0c3c-4394-82fb-334e1f6e7cb1\") " pod="openstack/glance-db-sync-vh57r" Dec 05 12:08:58 crc kubenswrapper[4763]: I1205 12:08:58.245480 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0af45fe0-0c3c-4394-82fb-334e1f6e7cb1-combined-ca-bundle\") pod \"glance-db-sync-vh57r\" (UID: \"0af45fe0-0c3c-4394-82fb-334e1f6e7cb1\") " pod="openstack/glance-db-sync-vh57r" Dec 05 12:08:58 crc kubenswrapper[4763]: I1205 12:08:58.255585 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0af45fe0-0c3c-4394-82fb-334e1f6e7cb1-config-data\") pod \"glance-db-sync-vh57r\" (UID: \"0af45fe0-0c3c-4394-82fb-334e1f6e7cb1\") " pod="openstack/glance-db-sync-vh57r" Dec 05 12:08:58 crc kubenswrapper[4763]: I1205 12:08:58.257306 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0af45fe0-0c3c-4394-82fb-334e1f6e7cb1-db-sync-config-data\") pod \"glance-db-sync-vh57r\" (UID: \"0af45fe0-0c3c-4394-82fb-334e1f6e7cb1\") " pod="openstack/glance-db-sync-vh57r" Dec 05 12:08:58 crc kubenswrapper[4763]: I1205 12:08:58.257726 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dbp8\" (UniqueName: \"kubernetes.io/projected/0af45fe0-0c3c-4394-82fb-334e1f6e7cb1-kube-api-access-7dbp8\") pod \"glance-db-sync-vh57r\" (UID: \"0af45fe0-0c3c-4394-82fb-334e1f6e7cb1\") " pod="openstack/glance-db-sync-vh57r" Dec 05 12:08:58 crc kubenswrapper[4763]: I1205 12:08:58.324401 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-vh57r" Dec 05 12:08:58 crc kubenswrapper[4763]: I1205 12:08:58.763000 4763 generic.go:334] "Generic (PLEG): container finished" podID="92548acb-045e-4d21-a684-e52051972a6f" containerID="782737352cff36ff63ed021324bca8ccf3ce1fe162fe80c46e3a48b895be618b" exitCode=0 Dec 05 12:08:58 crc kubenswrapper[4763]: I1205 12:08:58.763653 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6gw4w-config-52xsk" event={"ID":"92548acb-045e-4d21-a684-e52051972a6f","Type":"ContainerDied","Data":"782737352cff36ff63ed021324bca8ccf3ce1fe162fe80c46e3a48b895be618b"} Dec 05 12:08:58 crc kubenswrapper[4763]: I1205 12:08:58.763996 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6gw4w-config-52xsk" event={"ID":"92548acb-045e-4d21-a684-e52051972a6f","Type":"ContainerStarted","Data":"49a917fe5d393187bfd97a77f14d9f3ae3ff4a31f40695e8481a367535d42793"} Dec 05 12:08:58 crc kubenswrapper[4763]: E1205 12:08:58.766961 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:17ea20be390a94ab39f5cdd7f0cbc2498046eebcf77fe3dec9aa288d5c2cf46b\\\"\"" pod="openstack/prometheus-metric-storage-0" podUID="9d77e8a1-26ef-4525-b427-0a29a9b7a0fc" Dec 05 12:08:58 crc kubenswrapper[4763]: I1205 12:08:58.876217 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-vh57r"] Dec 05 12:08:58 crc kubenswrapper[4763]: W1205 12:08:58.878375 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0af45fe0_0c3c_4394_82fb_334e1f6e7cb1.slice/crio-8824f4371f1c6e0fb9a1d46f14e7bdda180742d33b0d81e3ade5c033c716dc87 WatchSource:0}: Error finding container 8824f4371f1c6e0fb9a1d46f14e7bdda180742d33b0d81e3ade5c033c716dc87: Status 404 returned error can't find the container with id 8824f4371f1c6e0fb9a1d46f14e7bdda180742d33b0d81e3ade5c033c716dc87 Dec 05 12:08:59 crc kubenswrapper[4763]: I1205 12:08:59.773114 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vh57r" event={"ID":"0af45fe0-0c3c-4394-82fb-334e1f6e7cb1","Type":"ContainerStarted","Data":"8824f4371f1c6e0fb9a1d46f14e7bdda180742d33b0d81e3ade5c033c716dc87"} Dec 05 12:09:00 crc kubenswrapper[4763]: I1205 12:09:00.113895 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6gw4w-config-52xsk" Dec 05 12:09:00 crc kubenswrapper[4763]: I1205 12:09:00.273973 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9td8\" (UniqueName: \"kubernetes.io/projected/92548acb-045e-4d21-a684-e52051972a6f-kube-api-access-w9td8\") pod \"92548acb-045e-4d21-a684-e52051972a6f\" (UID: \"92548acb-045e-4d21-a684-e52051972a6f\") " Dec 05 12:09:00 crc kubenswrapper[4763]: I1205 12:09:00.274096 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92548acb-045e-4d21-a684-e52051972a6f-scripts\") pod \"92548acb-045e-4d21-a684-e52051972a6f\" (UID: \"92548acb-045e-4d21-a684-e52051972a6f\") " Dec 05 12:09:00 crc kubenswrapper[4763]: I1205 12:09:00.274184 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/92548acb-045e-4d21-a684-e52051972a6f-var-run-ovn\") pod \"92548acb-045e-4d21-a684-e52051972a6f\" (UID: \"92548acb-045e-4d21-a684-e52051972a6f\") " Dec 05 12:09:00 crc kubenswrapper[4763]: I1205 12:09:00.274203 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/92548acb-045e-4d21-a684-e52051972a6f-var-log-ovn\") pod \"92548acb-045e-4d21-a684-e52051972a6f\" (UID: \"92548acb-045e-4d21-a684-e52051972a6f\") " Dec 05 12:09:00 crc kubenswrapper[4763]: I1205 12:09:00.274233 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/92548acb-045e-4d21-a684-e52051972a6f-var-run\") pod \"92548acb-045e-4d21-a684-e52051972a6f\" (UID: \"92548acb-045e-4d21-a684-e52051972a6f\") " Dec 05 12:09:00 crc kubenswrapper[4763]: I1205 12:09:00.274253 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/92548acb-045e-4d21-a684-e52051972a6f-additional-scripts\") pod \"92548acb-045e-4d21-a684-e52051972a6f\" (UID: \"92548acb-045e-4d21-a684-e52051972a6f\") " Dec 05 12:09:00 crc kubenswrapper[4763]: I1205 12:09:00.275033 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/92548acb-045e-4d21-a684-e52051972a6f-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "92548acb-045e-4d21-a684-e52051972a6f" (UID: "92548acb-045e-4d21-a684-e52051972a6f"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:09:00 crc kubenswrapper[4763]: I1205 12:09:00.275181 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92548acb-045e-4d21-a684-e52051972a6f-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "92548acb-045e-4d21-a684-e52051972a6f" (UID: "92548acb-045e-4d21-a684-e52051972a6f"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:09:00 crc kubenswrapper[4763]: I1205 12:09:00.275227 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/92548acb-045e-4d21-a684-e52051972a6f-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "92548acb-045e-4d21-a684-e52051972a6f" (UID: "92548acb-045e-4d21-a684-e52051972a6f"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:09:00 crc kubenswrapper[4763]: I1205 12:09:00.275246 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/92548acb-045e-4d21-a684-e52051972a6f-var-run" (OuterVolumeSpecName: "var-run") pod "92548acb-045e-4d21-a684-e52051972a6f" (UID: "92548acb-045e-4d21-a684-e52051972a6f"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:09:00 crc kubenswrapper[4763]: I1205 12:09:00.275696 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92548acb-045e-4d21-a684-e52051972a6f-scripts" (OuterVolumeSpecName: "scripts") pod "92548acb-045e-4d21-a684-e52051972a6f" (UID: "92548acb-045e-4d21-a684-e52051972a6f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:09:00 crc kubenswrapper[4763]: I1205 12:09:00.280978 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92548acb-045e-4d21-a684-e52051972a6f-kube-api-access-w9td8" (OuterVolumeSpecName: "kube-api-access-w9td8") pod "92548acb-045e-4d21-a684-e52051972a6f" (UID: "92548acb-045e-4d21-a684-e52051972a6f"). InnerVolumeSpecName "kube-api-access-w9td8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:09:00 crc kubenswrapper[4763]: I1205 12:09:00.376581 4763 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/92548acb-045e-4d21-a684-e52051972a6f-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:00 crc kubenswrapper[4763]: I1205 12:09:00.376626 4763 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/92548acb-045e-4d21-a684-e52051972a6f-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:00 crc kubenswrapper[4763]: I1205 12:09:00.376639 4763 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/92548acb-045e-4d21-a684-e52051972a6f-var-run\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:00 crc kubenswrapper[4763]: I1205 12:09:00.376652 4763 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/92548acb-045e-4d21-a684-e52051972a6f-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:00 crc kubenswrapper[4763]: I1205 12:09:00.376664 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9td8\" (UniqueName: \"kubernetes.io/projected/92548acb-045e-4d21-a684-e52051972a6f-kube-api-access-w9td8\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:00 crc kubenswrapper[4763]: I1205 12:09:00.376676 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92548acb-045e-4d21-a684-e52051972a6f-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:00 crc kubenswrapper[4763]: I1205 12:09:00.780378 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6gw4w-config-52xsk" event={"ID":"92548acb-045e-4d21-a684-e52051972a6f","Type":"ContainerDied","Data":"49a917fe5d393187bfd97a77f14d9f3ae3ff4a31f40695e8481a367535d42793"} Dec 05 12:09:00 crc kubenswrapper[4763]: I1205 12:09:00.780414 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49a917fe5d393187bfd97a77f14d9f3ae3ff4a31f40695e8481a367535d42793" Dec 05 12:09:00 crc kubenswrapper[4763]: I1205 12:09:00.780437 4763 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6gw4w-config-52xsk" Dec 05 12:09:01 crc kubenswrapper[4763]: I1205 12:09:01.250444 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-6gw4w-config-52xsk"] Dec 05 12:09:01 crc kubenswrapper[4763]: I1205 12:09:01.261055 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-6gw4w-config-52xsk"] Dec 05 12:09:01 crc kubenswrapper[4763]: I1205 12:09:01.346730 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-6gw4w-config-s6bjl"] Dec 05 12:09:01 crc kubenswrapper[4763]: E1205 12:09:01.347133 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92548acb-045e-4d21-a684-e52051972a6f" containerName="ovn-config" Dec 05 12:09:01 crc kubenswrapper[4763]: I1205 12:09:01.347153 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="92548acb-045e-4d21-a684-e52051972a6f" containerName="ovn-config" Dec 05 12:09:01 crc kubenswrapper[4763]: I1205 12:09:01.347326 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="92548acb-045e-4d21-a684-e52051972a6f" containerName="ovn-config" Dec 05 12:09:01 crc kubenswrapper[4763]: I1205 12:09:01.347903 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6gw4w-config-s6bjl" Dec 05 12:09:01 crc kubenswrapper[4763]: I1205 12:09:01.350974 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 05 12:09:01 crc kubenswrapper[4763]: I1205 12:09:01.357160 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6gw4w-config-s6bjl"] Dec 05 12:09:01 crc kubenswrapper[4763]: I1205 12:09:01.497896 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f261232-6f96-464b-a44a-6423aaa9feb6-scripts\") pod \"ovn-controller-6gw4w-config-s6bjl\" (UID: \"2f261232-6f96-464b-a44a-6423aaa9feb6\") " pod="openstack/ovn-controller-6gw4w-config-s6bjl" Dec 05 12:09:01 crc kubenswrapper[4763]: I1205 12:09:01.497965 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2f261232-6f96-464b-a44a-6423aaa9feb6-var-run-ovn\") pod \"ovn-controller-6gw4w-config-s6bjl\" (UID: \"2f261232-6f96-464b-a44a-6423aaa9feb6\") " pod="openstack/ovn-controller-6gw4w-config-s6bjl" Dec 05 12:09:01 crc kubenswrapper[4763]: I1205 12:09:01.498054 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2f261232-6f96-464b-a44a-6423aaa9feb6-var-log-ovn\") pod \"ovn-controller-6gw4w-config-s6bjl\" (UID: \"2f261232-6f96-464b-a44a-6423aaa9feb6\") " pod="openstack/ovn-controller-6gw4w-config-s6bjl" Dec 05 12:09:01 crc kubenswrapper[4763]: I1205 12:09:01.498080 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2f261232-6f96-464b-a44a-6423aaa9feb6-additional-scripts\") pod \"ovn-controller-6gw4w-config-s6bjl\" (UID: \"2f261232-6f96-464b-a44a-6423aaa9feb6\") " pod="openstack/ovn-controller-6gw4w-config-s6bjl" Dec 05 12:09:01 crc kubenswrapper[4763]: I1205 12:09:01.498116 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-bsbnq\" (UniqueName: \"kubernetes.io/projected/2f261232-6f96-464b-a44a-6423aaa9feb6-kube-api-access-bsbnq\") pod \"ovn-controller-6gw4w-config-s6bjl\" (UID: \"2f261232-6f96-464b-a44a-6423aaa9feb6\") " pod="openstack/ovn-controller-6gw4w-config-s6bjl" Dec 05 12:09:01 crc kubenswrapper[4763]: I1205 12:09:01.498226 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2f261232-6f96-464b-a44a-6423aaa9feb6-var-run\") pod \"ovn-controller-6gw4w-config-s6bjl\" (UID: \"2f261232-6f96-464b-a44a-6423aaa9feb6\") " pod="openstack/ovn-controller-6gw4w-config-s6bjl" Dec 05 12:09:01 crc kubenswrapper[4763]: I1205 12:09:01.600445 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f261232-6f96-464b-a44a-6423aaa9feb6-scripts\") pod \"ovn-controller-6gw4w-config-s6bjl\" (UID: \"2f261232-6f96-464b-a44a-6423aaa9feb6\") " pod="openstack/ovn-controller-6gw4w-config-s6bjl" Dec 05 12:09:01 crc kubenswrapper[4763]: I1205 12:09:01.600727 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2f261232-6f96-464b-a44a-6423aaa9feb6-var-run-ovn\") pod \"ovn-controller-6gw4w-config-s6bjl\" (UID: \"2f261232-6f96-464b-a44a-6423aaa9feb6\") " pod="openstack/ovn-controller-6gw4w-config-s6bjl" Dec 05 12:09:01 crc kubenswrapper[4763]: I1205 12:09:01.600796 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2f261232-6f96-464b-a44a-6423aaa9feb6-var-log-ovn\") pod \"ovn-controller-6gw4w-config-s6bjl\" (UID: \"2f261232-6f96-464b-a44a-6423aaa9feb6\") " pod="openstack/ovn-controller-6gw4w-config-s6bjl" Dec 05 12:09:01 crc kubenswrapper[4763]: I1205 12:09:01.600821 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2f261232-6f96-464b-a44a-6423aaa9feb6-additional-scripts\") pod \"ovn-controller-6gw4w-config-s6bjl\" (UID: \"2f261232-6f96-464b-a44a-6423aaa9feb6\") " pod="openstack/ovn-controller-6gw4w-config-s6bjl" Dec 05 12:09:01 crc kubenswrapper[4763]: I1205 12:09:01.600860 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsbnq\" (UniqueName: \"kubernetes.io/projected/2f261232-6f96-464b-a44a-6423aaa9feb6-kube-api-access-bsbnq\") pod \"ovn-controller-6gw4w-config-s6bjl\" (UID: \"2f261232-6f96-464b-a44a-6423aaa9feb6\") " pod="openstack/ovn-controller-6gw4w-config-s6bjl" Dec 05 12:09:01 crc kubenswrapper[4763]: I1205 12:09:01.600913 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2f261232-6f96-464b-a44a-6423aaa9feb6-var-run\") pod \"ovn-controller-6gw4w-config-s6bjl\" (UID: \"2f261232-6f96-464b-a44a-6423aaa9feb6\") " pod="openstack/ovn-controller-6gw4w-config-s6bjl" Dec 05 12:09:01 crc kubenswrapper[4763]: I1205 12:09:01.601233 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2f261232-6f96-464b-a44a-6423aaa9feb6-var-run\") pod \"ovn-controller-6gw4w-config-s6bjl\" (UID: \"2f261232-6f96-464b-a44a-6423aaa9feb6\") " pod="openstack/ovn-controller-6gw4w-config-s6bjl" Dec 05 12:09:01 crc kubenswrapper[4763]: I1205 12:09:01.602155 4763 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2f261232-6f96-464b-a44a-6423aaa9feb6-var-log-ovn\") pod \"ovn-controller-6gw4w-config-s6bjl\" (UID: \"2f261232-6f96-464b-a44a-6423aaa9feb6\") " pod="openstack/ovn-controller-6gw4w-config-s6bjl" Dec 05 12:09:01 crc kubenswrapper[4763]: I1205 12:09:01.602257 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2f261232-6f96-464b-a44a-6423aaa9feb6-var-run-ovn\") pod \"ovn-controller-6gw4w-config-s6bjl\" (UID: \"2f261232-6f96-464b-a44a-6423aaa9feb6\") " pod="openstack/ovn-controller-6gw4w-config-s6bjl" Dec 05 12:09:01 crc kubenswrapper[4763]: I1205 12:09:01.602578 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2f261232-6f96-464b-a44a-6423aaa9feb6-additional-scripts\") pod \"ovn-controller-6gw4w-config-s6bjl\" (UID: \"2f261232-6f96-464b-a44a-6423aaa9feb6\") " pod="openstack/ovn-controller-6gw4w-config-s6bjl" Dec 05 12:09:01 crc kubenswrapper[4763]: I1205 12:09:01.603500 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f261232-6f96-464b-a44a-6423aaa9feb6-scripts\") pod \"ovn-controller-6gw4w-config-s6bjl\" (UID: \"2f261232-6f96-464b-a44a-6423aaa9feb6\") " pod="openstack/ovn-controller-6gw4w-config-s6bjl" Dec 05 12:09:01 crc kubenswrapper[4763]: I1205 12:09:01.622638 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsbnq\" (UniqueName: \"kubernetes.io/projected/2f261232-6f96-464b-a44a-6423aaa9feb6-kube-api-access-bsbnq\") pod \"ovn-controller-6gw4w-config-s6bjl\" (UID: \"2f261232-6f96-464b-a44a-6423aaa9feb6\") " pod="openstack/ovn-controller-6gw4w-config-s6bjl" Dec 05 12:09:01 crc kubenswrapper[4763]: I1205 12:09:01.663645 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6gw4w-config-s6bjl" Dec 05 12:09:01 crc kubenswrapper[4763]: I1205 12:09:01.806817 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92548acb-045e-4d21-a684-e52051972a6f" path="/var/lib/kubelet/pods/92548acb-045e-4d21-a684-e52051972a6f/volumes" Dec 05 12:09:01 crc kubenswrapper[4763]: I1205 12:09:01.809004 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-6gw4w" Dec 05 12:09:02 crc kubenswrapper[4763]: I1205 12:09:02.131627 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6gw4w-config-s6bjl"] Dec 05 12:09:02 crc kubenswrapper[4763]: W1205 12:09:02.135874 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f261232_6f96_464b_a44a_6423aaa9feb6.slice/crio-f090050051e2498ec0dabb71645737e0f950e3c33de90b0ce60edb56a61860a6 WatchSource:0}: Error finding container f090050051e2498ec0dabb71645737e0f950e3c33de90b0ce60edb56a61860a6: Status 404 returned error can't find the container with id f090050051e2498ec0dabb71645737e0f950e3c33de90b0ce60edb56a61860a6 Dec 05 12:09:02 crc kubenswrapper[4763]: I1205 12:09:02.809062 4763 generic.go:334] "Generic (PLEG): container finished" podID="2f261232-6f96-464b-a44a-6423aaa9feb6" containerID="1cf3dcc349b15d3a63b8082eea8f18ad7b5e6394db73867be341fb3080e88c6c" exitCode=0 Dec 05 12:09:02 crc kubenswrapper[4763]: I1205 12:09:02.809343 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6gw4w-config-s6bjl" event={"ID":"2f261232-6f96-464b-a44a-6423aaa9feb6","Type":"ContainerDied","Data":"1cf3dcc349b15d3a63b8082eea8f18ad7b5e6394db73867be341fb3080e88c6c"} Dec 05 12:09:02 crc kubenswrapper[4763]: I1205 12:09:02.809374 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6gw4w-config-s6bjl" event={"ID":"2f261232-6f96-464b-a44a-6423aaa9feb6","Type":"ContainerStarted","Data":"f090050051e2498ec0dabb71645737e0f950e3c33de90b0ce60edb56a61860a6"} Dec 05 12:09:04 crc kubenswrapper[4763]: I1205 12:09:04.182027 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6gw4w-config-s6bjl" Dec 05 12:09:04 crc kubenswrapper[4763]: I1205 12:09:04.355521 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2f261232-6f96-464b-a44a-6423aaa9feb6-additional-scripts\") pod \"2f261232-6f96-464b-a44a-6423aaa9feb6\" (UID: \"2f261232-6f96-464b-a44a-6423aaa9feb6\") " Dec 05 12:09:04 crc kubenswrapper[4763]: I1205 12:09:04.355946 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f261232-6f96-464b-a44a-6423aaa9feb6-scripts\") pod \"2f261232-6f96-464b-a44a-6423aaa9feb6\" (UID: \"2f261232-6f96-464b-a44a-6423aaa9feb6\") " Dec 05 12:09:04 crc kubenswrapper[4763]: I1205 12:09:04.356053 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsbnq\" (UniqueName: \"kubernetes.io/projected/2f261232-6f96-464b-a44a-6423aaa9feb6-kube-api-access-bsbnq\") pod \"2f261232-6f96-464b-a44a-6423aaa9feb6\" (UID: \"2f261232-6f96-464b-a44a-6423aaa9feb6\") " Dec 05 12:09:04 crc kubenswrapper[4763]: I1205 12:09:04.356083 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2f261232-6f96-464b-a44a-6423aaa9feb6-var-log-ovn\") pod \"2f261232-6f96-464b-a44a-6423aaa9feb6\" (UID: \"2f261232-6f96-464b-a44a-6423aaa9feb6\") " Dec 05 12:09:04 crc kubenswrapper[4763]: I1205 12:09:04.356114 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2f261232-6f96-464b-a44a-6423aaa9feb6-var-run\") pod \"2f261232-6f96-464b-a44a-6423aaa9feb6\" (UID: \"2f261232-6f96-464b-a44a-6423aaa9feb6\") " Dec 05 12:09:04 crc kubenswrapper[4763]: I1205 12:09:04.356157 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2f261232-6f96-464b-a44a-6423aaa9feb6-var-run-ovn\") pod \"2f261232-6f96-464b-a44a-6423aaa9feb6\" (UID: \"2f261232-6f96-464b-a44a-6423aaa9feb6\") " Dec 05 12:09:04 crc kubenswrapper[4763]: I1205 12:09:04.356650 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f261232-6f96-464b-a44a-6423aaa9feb6-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "2f261232-6f96-464b-a44a-6423aaa9feb6" (UID: "2f261232-6f96-464b-a44a-6423aaa9feb6"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:09:04 crc kubenswrapper[4763]: I1205 12:09:04.357189 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f261232-6f96-464b-a44a-6423aaa9feb6-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "2f261232-6f96-464b-a44a-6423aaa9feb6" (UID: "2f261232-6f96-464b-a44a-6423aaa9feb6"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:09:04 crc kubenswrapper[4763]: I1205 12:09:04.358029 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f261232-6f96-464b-a44a-6423aaa9feb6-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "2f261232-6f96-464b-a44a-6423aaa9feb6" (UID: "2f261232-6f96-464b-a44a-6423aaa9feb6"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:09:04 crc kubenswrapper[4763]: I1205 12:09:04.358137 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f261232-6f96-464b-a44a-6423aaa9feb6-var-run" (OuterVolumeSpecName: "var-run") pod "2f261232-6f96-464b-a44a-6423aaa9feb6" (UID: "2f261232-6f96-464b-a44a-6423aaa9feb6"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:09:04 crc kubenswrapper[4763]: I1205 12:09:04.358277 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f261232-6f96-464b-a44a-6423aaa9feb6-scripts" (OuterVolumeSpecName: "scripts") pod "2f261232-6f96-464b-a44a-6423aaa9feb6" (UID: "2f261232-6f96-464b-a44a-6423aaa9feb6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:09:04 crc kubenswrapper[4763]: I1205 12:09:04.378220 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f261232-6f96-464b-a44a-6423aaa9feb6-kube-api-access-bsbnq" (OuterVolumeSpecName: "kube-api-access-bsbnq") pod "2f261232-6f96-464b-a44a-6423aaa9feb6" (UID: "2f261232-6f96-464b-a44a-6423aaa9feb6"). InnerVolumeSpecName "kube-api-access-bsbnq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:09:04 crc kubenswrapper[4763]: I1205 12:09:04.458227 4763 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2f261232-6f96-464b-a44a-6423aaa9feb6-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:04 crc kubenswrapper[4763]: I1205 12:09:04.458257 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f261232-6f96-464b-a44a-6423aaa9feb6-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:04 crc kubenswrapper[4763]: I1205 12:09:04.458270 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsbnq\" (UniqueName: \"kubernetes.io/projected/2f261232-6f96-464b-a44a-6423aaa9feb6-kube-api-access-bsbnq\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:04 crc kubenswrapper[4763]: I1205 12:09:04.458284 4763 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2f261232-6f96-464b-a44a-6423aaa9feb6-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:04 crc kubenswrapper[4763]: I1205 12:09:04.458296 4763 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2f261232-6f96-464b-a44a-6423aaa9feb6-var-run\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:04 crc kubenswrapper[4763]: I1205 12:09:04.458307 4763 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2f261232-6f96-464b-a44a-6423aaa9feb6-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:04 crc kubenswrapper[4763]: I1205 12:09:04.827885 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6gw4w-config-s6bjl" event={"ID":"2f261232-6f96-464b-a44a-6423aaa9feb6","Type":"ContainerDied","Data":"f090050051e2498ec0dabb71645737e0f950e3c33de90b0ce60edb56a61860a6"} Dec 05 12:09:04 crc kubenswrapper[4763]: I1205 12:09:04.827940 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6gw4w-config-s6bjl" Dec 05 12:09:04 crc kubenswrapper[4763]: I1205 12:09:04.827958 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f090050051e2498ec0dabb71645737e0f950e3c33de90b0ce60edb56a61860a6" Dec 05 12:09:05 crc kubenswrapper[4763]: I1205 12:09:05.252674 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-6gw4w-config-s6bjl"] Dec 05 12:09:05 crc kubenswrapper[4763]: I1205 12:09:05.260005 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-6gw4w-config-s6bjl"] Dec 05 12:09:05 crc kubenswrapper[4763]: I1205 12:09:05.795806 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f261232-6f96-464b-a44a-6423aaa9feb6" path="/var/lib/kubelet/pods/2f261232-6f96-464b-a44a-6423aaa9feb6/volumes" Dec 05 12:09:08 crc kubenswrapper[4763]: I1205 12:09:08.124023 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1851124e-2722-4628-8e5b-63edb828d64a-etc-swift\") pod \"swift-storage-0\" (UID: \"1851124e-2722-4628-8e5b-63edb828d64a\") " pod="openstack/swift-storage-0" Dec 05 12:09:08 crc kubenswrapper[4763]: E1205 12:09:08.124474 4763 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 12:09:08 crc kubenswrapper[4763]: E1205 12:09:08.124523 4763 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 12:09:08 crc kubenswrapper[4763]: E1205 12:09:08.124625 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1851124e-2722-4628-8e5b-63edb828d64a-etc-swift podName:1851124e-2722-4628-8e5b-63edb828d64a nodeName:}" failed. No retries permitted until 2025-12-05 12:09:40.124588369 +0000 UTC m=+1264.617303092 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1851124e-2722-4628-8e5b-63edb828d64a-etc-swift") pod "swift-storage-0" (UID: "1851124e-2722-4628-8e5b-63edb828d64a") : configmap "swift-ring-files" not found Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.351974 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.651806 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-fdrsd"] Dec 05 12:09:09 crc kubenswrapper[4763]: E1205 12:09:09.652248 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f261232-6f96-464b-a44a-6423aaa9feb6" containerName="ovn-config" Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.652262 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f261232-6f96-464b-a44a-6423aaa9feb6" containerName="ovn-config" Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.652470 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f261232-6f96-464b-a44a-6423aaa9feb6" containerName="ovn-config" Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.653223 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-fdrsd" Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.687714 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-fdrsd"] Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.740873 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.754656 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tpkg\" (UniqueName: \"kubernetes.io/projected/e051d182-dc55-4454-95aa-558c2e183c88-kube-api-access-8tpkg\") pod \"barbican-db-create-fdrsd\" (UID: \"e051d182-dc55-4454-95aa-558c2e183c88\") " pod="openstack/barbican-db-create-fdrsd" Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.754725 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e051d182-dc55-4454-95aa-558c2e183c88-operator-scripts\") pod \"barbican-db-create-fdrsd\" (UID: \"e051d182-dc55-4454-95aa-558c2e183c88\") " pod="openstack/barbican-db-create-fdrsd" Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.770478 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-9c79-account-create-update-n7z54"] Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.771890 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-9c79-account-create-update-n7z54" Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.774561 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.806143 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-9c79-account-create-update-n7z54"] Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.808599 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-sync-wcz88"] Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.810021 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-wcz88" Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.812458 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-config-data" Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.818065 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-zcxcw" Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.829856 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-wcz88"] Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.851668 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-m5zs5"] Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.852976 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-m5zs5" Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.858638 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4046982d-ad27-468f-897a-167692d9ae49-combined-ca-bundle\") pod \"watcher-db-sync-wcz88\" (UID: \"4046982d-ad27-468f-897a-167692d9ae49\") " pod="openstack/watcher-db-sync-wcz88" Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.858692 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jhbq\" (UniqueName: \"kubernetes.io/projected/4046982d-ad27-468f-897a-167692d9ae49-kube-api-access-5jhbq\") pod \"watcher-db-sync-wcz88\" (UID: \"4046982d-ad27-468f-897a-167692d9ae49\") " pod="openstack/watcher-db-sync-wcz88" Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.858720 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tpkg\" (UniqueName: \"kubernetes.io/projected/e051d182-dc55-4454-95aa-558c2e183c88-kube-api-access-8tpkg\") pod \"barbican-db-create-fdrsd\" (UID: \"e051d182-dc55-4454-95aa-558c2e183c88\") " pod="openstack/barbican-db-create-fdrsd" Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.858741 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e051d182-dc55-4454-95aa-558c2e183c88-operator-scripts\") pod \"barbican-db-create-fdrsd\" (UID: \"e051d182-dc55-4454-95aa-558c2e183c88\") " pod="openstack/barbican-db-create-fdrsd" Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.858820 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4046982d-ad27-468f-897a-167692d9ae49-db-sync-config-data\") pod \"watcher-db-sync-wcz88\" (UID: \"4046982d-ad27-468f-897a-167692d9ae49\") " pod="openstack/watcher-db-sync-wcz88" Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.858874 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4046982d-ad27-468f-897a-167692d9ae49-config-data\") pod \"watcher-db-sync-wcz88\" (UID: \"4046982d-ad27-468f-897a-167692d9ae49\") " pod="openstack/watcher-db-sync-wcz88" Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.860078 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e051d182-dc55-4454-95aa-558c2e183c88-operator-scripts\") pod \"barbican-db-create-fdrsd\" (UID: \"e051d182-dc55-4454-95aa-558c2e183c88\") " pod="openstack/barbican-db-create-fdrsd" Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.889614 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-m5zs5"] Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.950748 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-d754-account-create-update-n6glx"] Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.954061 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-d754-account-create-update-n6glx" Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.956223 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-d754-account-create-update-n6glx"] Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.961644 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.961663 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b88h2\" (UniqueName: \"kubernetes.io/projected/d64503e5-6fd1-49fa-b025-7c00f8b245c3-kube-api-access-b88h2\") pod \"barbican-9c79-account-create-update-n7z54\" (UID: \"d64503e5-6fd1-49fa-b025-7c00f8b245c3\") " pod="openstack/barbican-9c79-account-create-update-n7z54" Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.962037 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4046982d-ad27-468f-897a-167692d9ae49-combined-ca-bundle\") pod \"watcher-db-sync-wcz88\" (UID: \"4046982d-ad27-468f-897a-167692d9ae49\") " pod="openstack/watcher-db-sync-wcz88" Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.962088 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jhbq\" (UniqueName: \"kubernetes.io/projected/4046982d-ad27-468f-897a-167692d9ae49-kube-api-access-5jhbq\") pod \"watcher-db-sync-wcz88\" (UID: \"4046982d-ad27-468f-897a-167692d9ae49\") " pod="openstack/watcher-db-sync-wcz88" Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.962198 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d64503e5-6fd1-49fa-b025-7c00f8b245c3-operator-scripts\") pod \"barbican-9c79-account-create-update-n7z54\" (UID: \"d64503e5-6fd1-49fa-b025-7c00f8b245c3\") " pod="openstack/barbican-9c79-account-create-update-n7z54" Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.962313 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4046982d-ad27-468f-897a-167692d9ae49-db-sync-config-data\") pod \"watcher-db-sync-wcz88\" (UID: \"4046982d-ad27-468f-897a-167692d9ae49\") " pod="openstack/watcher-db-sync-wcz88" Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.962464 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4046982d-ad27-468f-897a-167692d9ae49-config-data\") pod \"watcher-db-sync-wcz88\" (UID: \"4046982d-ad27-468f-897a-167692d9ae49\") " pod="openstack/watcher-db-sync-wcz88" Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.972154 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4046982d-ad27-468f-897a-167692d9ae49-config-data\") pod \"watcher-db-sync-wcz88\" (UID: \"4046982d-ad27-468f-897a-167692d9ae49\") " pod="openstack/watcher-db-sync-wcz88" Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.979342 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4046982d-ad27-468f-897a-167692d9ae49-db-sync-config-data\") pod \"watcher-db-sync-wcz88\" (UID: \"4046982d-ad27-468f-897a-167692d9ae49\") " pod="openstack/watcher-db-sync-wcz88" 
Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.983579 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4046982d-ad27-468f-897a-167692d9ae49-combined-ca-bundle\") pod \"watcher-db-sync-wcz88\" (UID: \"4046982d-ad27-468f-897a-167692d9ae49\") " pod="openstack/watcher-db-sync-wcz88"
Dec 05 12:09:09 crc kubenswrapper[4763]: I1205 12:09:09.997870 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jhbq\" (UniqueName: \"kubernetes.io/projected/4046982d-ad27-468f-897a-167692d9ae49-kube-api-access-5jhbq\") pod \"watcher-db-sync-wcz88\" (UID: \"4046982d-ad27-468f-897a-167692d9ae49\") " pod="openstack/watcher-db-sync-wcz88"
Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.031800 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-s2dcm"]
Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.035251 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-s2dcm"
Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.048737 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-s2dcm"]
Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.064412 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d64503e5-6fd1-49fa-b025-7c00f8b245c3-operator-scripts\") pod \"barbican-9c79-account-create-update-n7z54\" (UID: \"d64503e5-6fd1-49fa-b025-7c00f8b245c3\") " pod="openstack/barbican-9c79-account-create-update-n7z54"
Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.064491 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5cdb2955-6d03-4169-8765-22f61729881f-operator-scripts\") pod \"cinder-d754-account-create-update-n6glx\" (UID: \"5cdb2955-6d03-4169-8765-22f61729881f\") " pod="openstack/cinder-d754-account-create-update-n6glx"
Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.064525 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7mpv\" (UniqueName: \"kubernetes.io/projected/5cdb2955-6d03-4169-8765-22f61729881f-kube-api-access-v7mpv\") pod \"cinder-d754-account-create-update-n6glx\" (UID: \"5cdb2955-6d03-4169-8765-22f61729881f\") " pod="openstack/cinder-d754-account-create-update-n6glx"
Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.064567 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74c2f44d-7371-42a1-b73b-2e68ba45adf4-operator-scripts\") pod \"cinder-db-create-m5zs5\" (UID: \"74c2f44d-7371-42a1-b73b-2e68ba45adf4\") " pod="openstack/cinder-db-create-m5zs5"
Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.064688 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b88h2\" (UniqueName: \"kubernetes.io/projected/d64503e5-6fd1-49fa-b025-7c00f8b245c3-kube-api-access-b88h2\") pod \"barbican-9c79-account-create-update-n7z54\" (UID: \"d64503e5-6fd1-49fa-b025-7c00f8b245c3\") " pod="openstack/barbican-9c79-account-create-update-n7z54"
Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.064799 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndj6r\" (UniqueName: \"kubernetes.io/projected/74c2f44d-7371-42a1-b73b-2e68ba45adf4-kube-api-access-ndj6r\") pod \"cinder-db-create-m5zs5\" (UID: \"74c2f44d-7371-42a1-b73b-2e68ba45adf4\") " pod="openstack/cinder-db-create-m5zs5"
Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.065299 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d64503e5-6fd1-49fa-b025-7c00f8b245c3-operator-scripts\") pod \"barbican-9c79-account-create-update-n7z54\" (UID: \"d64503e5-6fd1-49fa-b025-7c00f8b245c3\") " pod="openstack/barbican-9c79-account-create-update-n7z54"
Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.106295 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-d8c46"]
Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.109610 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-d8c46"
Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.112986 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.113325 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-gw888"
Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.113487 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.114515 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.115265 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b88h2\" (UniqueName: \"kubernetes.io/projected/d64503e5-6fd1-49fa-b025-7c00f8b245c3-kube-api-access-b88h2\") pod \"barbican-9c79-account-create-update-n7z54\" (UID: \"d64503e5-6fd1-49fa-b025-7c00f8b245c3\") " pod="openstack/barbican-9c79-account-create-update-n7z54"
Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.135375 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-d8c46"]
Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.159457 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-wcz88"
Need to start a new one" pod="openstack/watcher-db-sync-wcz88" Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.165908 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5cdb2955-6d03-4169-8765-22f61729881f-operator-scripts\") pod \"cinder-d754-account-create-update-n6glx\" (UID: \"5cdb2955-6d03-4169-8765-22f61729881f\") " pod="openstack/cinder-d754-account-create-update-n6glx" Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.165968 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7b68be3-b684-41b3-9cb0-6ae8f6f998f3-operator-scripts\") pod \"neutron-db-create-s2dcm\" (UID: \"a7b68be3-b684-41b3-9cb0-6ae8f6f998f3\") " pod="openstack/neutron-db-create-s2dcm" Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.165994 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7mpv\" (UniqueName: \"kubernetes.io/projected/5cdb2955-6d03-4169-8765-22f61729881f-kube-api-access-v7mpv\") pod \"cinder-d754-account-create-update-n6glx\" (UID: \"5cdb2955-6d03-4169-8765-22f61729881f\") " pod="openstack/cinder-d754-account-create-update-n6glx" Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.166051 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74c2f44d-7371-42a1-b73b-2e68ba45adf4-operator-scripts\") pod \"cinder-db-create-m5zs5\" (UID: \"74c2f44d-7371-42a1-b73b-2e68ba45adf4\") " pod="openstack/cinder-db-create-m5zs5" Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.166118 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndj6r\" (UniqueName: \"kubernetes.io/projected/74c2f44d-7371-42a1-b73b-2e68ba45adf4-kube-api-access-ndj6r\") pod \"cinder-db-create-m5zs5\" (UID: \"74c2f44d-7371-42a1-b73b-2e68ba45adf4\") " pod="openstack/cinder-db-create-m5zs5" Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.166192 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv58h\" (UniqueName: \"kubernetes.io/projected/a7b68be3-b684-41b3-9cb0-6ae8f6f998f3-kube-api-access-xv58h\") pod \"neutron-db-create-s2dcm\" (UID: \"a7b68be3-b684-41b3-9cb0-6ae8f6f998f3\") " pod="openstack/neutron-db-create-s2dcm" Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.167123 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5cdb2955-6d03-4169-8765-22f61729881f-operator-scripts\") pod \"cinder-d754-account-create-update-n6glx\" (UID: \"5cdb2955-6d03-4169-8765-22f61729881f\") " pod="openstack/cinder-d754-account-create-update-n6glx" Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.167992 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74c2f44d-7371-42a1-b73b-2e68ba45adf4-operator-scripts\") pod \"cinder-db-create-m5zs5\" (UID: \"74c2f44d-7371-42a1-b73b-2e68ba45adf4\") " pod="openstack/cinder-db-create-m5zs5" Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.173262 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7548-account-create-update-2cptw"] Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.174589 4763 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/neutron-7548-account-create-update-2cptw" Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.179283 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.184065 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7548-account-create-update-2cptw"] Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.204268 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7mpv\" (UniqueName: \"kubernetes.io/projected/5cdb2955-6d03-4169-8765-22f61729881f-kube-api-access-v7mpv\") pod \"cinder-d754-account-create-update-n6glx\" (UID: \"5cdb2955-6d03-4169-8765-22f61729881f\") " pod="openstack/cinder-d754-account-create-update-n6glx" Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.219811 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndj6r\" (UniqueName: \"kubernetes.io/projected/74c2f44d-7371-42a1-b73b-2e68ba45adf4-kube-api-access-ndj6r\") pod \"cinder-db-create-m5zs5\" (UID: \"74c2f44d-7371-42a1-b73b-2e68ba45adf4\") " pod="openstack/cinder-db-create-m5zs5" Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.271457 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7b68be3-b684-41b3-9cb0-6ae8f6f998f3-operator-scripts\") pod \"neutron-db-create-s2dcm\" (UID: \"a7b68be3-b684-41b3-9cb0-6ae8f6f998f3\") " pod="openstack/neutron-db-create-s2dcm" Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.271515 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7464b1d7-23f8-4450-a41e-1208f89c1fe4-config-data\") pod \"keystone-db-sync-d8c46\" (UID: \"7464b1d7-23f8-4450-a41e-1208f89c1fe4\") " pod="openstack/keystone-db-sync-d8c46" Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.271557 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7464b1d7-23f8-4450-a41e-1208f89c1fe4-combined-ca-bundle\") pod \"keystone-db-sync-d8c46\" (UID: \"7464b1d7-23f8-4450-a41e-1208f89c1fe4\") " pod="openstack/keystone-db-sync-d8c46" Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.271628 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt9fj\" (UniqueName: \"kubernetes.io/projected/7464b1d7-23f8-4450-a41e-1208f89c1fe4-kube-api-access-jt9fj\") pod \"keystone-db-sync-d8c46\" (UID: \"7464b1d7-23f8-4450-a41e-1208f89c1fe4\") " pod="openstack/keystone-db-sync-d8c46" Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.271659 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fec4184-203d-48c4-bf8a-39529d6d08ce-operator-scripts\") pod \"neutron-7548-account-create-update-2cptw\" (UID: \"5fec4184-203d-48c4-bf8a-39529d6d08ce\") " pod="openstack/neutron-7548-account-create-update-2cptw" Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.271718 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcrk9\" (UniqueName: \"kubernetes.io/projected/5fec4184-203d-48c4-bf8a-39529d6d08ce-kube-api-access-jcrk9\") pod 
\"neutron-7548-account-create-update-2cptw\" (UID: \"5fec4184-203d-48c4-bf8a-39529d6d08ce\") " pod="openstack/neutron-7548-account-create-update-2cptw" Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.271845 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv58h\" (UniqueName: \"kubernetes.io/projected/a7b68be3-b684-41b3-9cb0-6ae8f6f998f3-kube-api-access-xv58h\") pod \"neutron-db-create-s2dcm\" (UID: \"a7b68be3-b684-41b3-9cb0-6ae8f6f998f3\") " pod="openstack/neutron-db-create-s2dcm" Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.274048 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7b68be3-b684-41b3-9cb0-6ae8f6f998f3-operator-scripts\") pod \"neutron-db-create-s2dcm\" (UID: \"a7b68be3-b684-41b3-9cb0-6ae8f6f998f3\") " pod="openstack/neutron-db-create-s2dcm" Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.295172 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv58h\" (UniqueName: \"kubernetes.io/projected/a7b68be3-b684-41b3-9cb0-6ae8f6f998f3-kube-api-access-xv58h\") pod \"neutron-db-create-s2dcm\" (UID: \"a7b68be3-b684-41b3-9cb0-6ae8f6f998f3\") " pod="openstack/neutron-db-create-s2dcm" Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.347230 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-d754-account-create-update-n6glx" Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.360023 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-s2dcm" Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.375676 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt9fj\" (UniqueName: \"kubernetes.io/projected/7464b1d7-23f8-4450-a41e-1208f89c1fe4-kube-api-access-jt9fj\") pod \"keystone-db-sync-d8c46\" (UID: \"7464b1d7-23f8-4450-a41e-1208f89c1fe4\") " pod="openstack/keystone-db-sync-d8c46" Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.375723 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fec4184-203d-48c4-bf8a-39529d6d08ce-operator-scripts\") pod \"neutron-7548-account-create-update-2cptw\" (UID: \"5fec4184-203d-48c4-bf8a-39529d6d08ce\") " pod="openstack/neutron-7548-account-create-update-2cptw" Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.375864 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcrk9\" (UniqueName: \"kubernetes.io/projected/5fec4184-203d-48c4-bf8a-39529d6d08ce-kube-api-access-jcrk9\") pod \"neutron-7548-account-create-update-2cptw\" (UID: \"5fec4184-203d-48c4-bf8a-39529d6d08ce\") " pod="openstack/neutron-7548-account-create-update-2cptw" Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.376013 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7464b1d7-23f8-4450-a41e-1208f89c1fe4-config-data\") pod \"keystone-db-sync-d8c46\" (UID: \"7464b1d7-23f8-4450-a41e-1208f89c1fe4\") " pod="openstack/keystone-db-sync-d8c46" Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.376049 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7464b1d7-23f8-4450-a41e-1208f89c1fe4-combined-ca-bundle\") pod 
\"keystone-db-sync-d8c46\" (UID: \"7464b1d7-23f8-4450-a41e-1208f89c1fe4\") " pod="openstack/keystone-db-sync-d8c46" Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.376400 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fec4184-203d-48c4-bf8a-39529d6d08ce-operator-scripts\") pod \"neutron-7548-account-create-update-2cptw\" (UID: \"5fec4184-203d-48c4-bf8a-39529d6d08ce\") " pod="openstack/neutron-7548-account-create-update-2cptw" Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.379500 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7464b1d7-23f8-4450-a41e-1208f89c1fe4-config-data\") pod \"keystone-db-sync-d8c46\" (UID: \"7464b1d7-23f8-4450-a41e-1208f89c1fe4\") " pod="openstack/keystone-db-sync-d8c46" Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.380385 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7464b1d7-23f8-4450-a41e-1208f89c1fe4-combined-ca-bundle\") pod \"keystone-db-sync-d8c46\" (UID: \"7464b1d7-23f8-4450-a41e-1208f89c1fe4\") " pod="openstack/keystone-db-sync-d8c46" Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.394001 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt9fj\" (UniqueName: \"kubernetes.io/projected/7464b1d7-23f8-4450-a41e-1208f89c1fe4-kube-api-access-jt9fj\") pod \"keystone-db-sync-d8c46\" (UID: \"7464b1d7-23f8-4450-a41e-1208f89c1fe4\") " pod="openstack/keystone-db-sync-d8c46" Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.394377 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcrk9\" (UniqueName: \"kubernetes.io/projected/5fec4184-203d-48c4-bf8a-39529d6d08ce-kube-api-access-jcrk9\") pod \"neutron-7548-account-create-update-2cptw\" (UID: \"5fec4184-203d-48c4-bf8a-39529d6d08ce\") " pod="openstack/neutron-7548-account-create-update-2cptw" Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.395984 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-9c79-account-create-update-n7z54" Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.452146 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-d8c46" Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.506211 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-m5zs5" Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.588370 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7548-account-create-update-2cptw" Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.599695 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tpkg\" (UniqueName: \"kubernetes.io/projected/e051d182-dc55-4454-95aa-558c2e183c88-kube-api-access-8tpkg\") pod \"barbican-db-create-fdrsd\" (UID: \"e051d182-dc55-4454-95aa-558c2e183c88\") " pod="openstack/barbican-db-create-fdrsd" Dec 05 12:09:10 crc kubenswrapper[4763]: I1205 12:09:10.879552 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-fdrsd" Dec 05 12:09:13 crc kubenswrapper[4763]: E1205 12:09:13.912872 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Dec 05 12:09:13 crc kubenswrapper[4763]: E1205 12:09:13.914228 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7dbp8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-vh57r_openstack(0af45fe0-0c3c-4394-82fb-334e1f6e7cb1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 12:09:13 crc kubenswrapper[4763]: E1205 12:09:13.915515 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-vh57r" podUID="0af45fe0-0c3c-4394-82fb-334e1f6e7cb1" Dec 05 12:09:14 crc kubenswrapper[4763]: I1205 12:09:14.357710 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-9c79-account-create-update-n7z54"] Dec 05 12:09:14 crc kubenswrapper[4763]: W1205 12:09:14.405545 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd64503e5_6fd1_49fa_b025_7c00f8b245c3.slice/crio-4ee1c45e6cb8e5cb23619421051dc0651c3fe1648365c5745035aaa41befe135 WatchSource:0}: Error finding container 
4ee1c45e6cb8e5cb23619421051dc0651c3fe1648365c5745035aaa41befe135: Status 404 returned error can't find the container with id 4ee1c45e6cb8e5cb23619421051dc0651c3fe1648365c5745035aaa41befe135 Dec 05 12:09:14 crc kubenswrapper[4763]: I1205 12:09:14.590206 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-s2dcm"] Dec 05 12:09:14 crc kubenswrapper[4763]: I1205 12:09:14.610230 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-d8c46"] Dec 05 12:09:14 crc kubenswrapper[4763]: I1205 12:09:14.733948 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-d754-account-create-update-n6glx"] Dec 05 12:09:14 crc kubenswrapper[4763]: W1205 12:09:14.734899 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5cdb2955_6d03_4169_8765_22f61729881f.slice/crio-87063959f729e8a81d0400da22d095096059f4e1ef98bb25fe6f089b351771b5 WatchSource:0}: Error finding container 87063959f729e8a81d0400da22d095096059f4e1ef98bb25fe6f089b351771b5: Status 404 returned error can't find the container with id 87063959f729e8a81d0400da22d095096059f4e1ef98bb25fe6f089b351771b5 Dec 05 12:09:14 crc kubenswrapper[4763]: W1205 12:09:14.739902 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74c2f44d_7371_42a1_b73b_2e68ba45adf4.slice/crio-6a5aaf476218e735259032b0c3ed95d649218c7b85de12845cddbee4a1f41f9c WatchSource:0}: Error finding container 6a5aaf476218e735259032b0c3ed95d649218c7b85de12845cddbee4a1f41f9c: Status 404 returned error can't find the container with id 6a5aaf476218e735259032b0c3ed95d649218c7b85de12845cddbee4a1f41f9c Dec 05 12:09:14 crc kubenswrapper[4763]: I1205 12:09:14.742585 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-m5zs5"] Dec 05 12:09:14 crc kubenswrapper[4763]: I1205 12:09:14.818351 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7548-account-create-update-2cptw"] Dec 05 12:09:14 crc kubenswrapper[4763]: I1205 12:09:14.827374 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-wcz88"] Dec 05 12:09:14 crc kubenswrapper[4763]: I1205 12:09:14.852558 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-fdrsd"] Dec 05 12:09:14 crc kubenswrapper[4763]: W1205 12:09:14.858501 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode051d182_dc55_4454_95aa_558c2e183c88.slice/crio-6500bbe8488d2af6e85523c3ad39610abc590f32e097d6df514e8a2646ee985a WatchSource:0}: Error finding container 6500bbe8488d2af6e85523c3ad39610abc590f32e097d6df514e8a2646ee985a: Status 404 returned error can't find the container with id 6500bbe8488d2af6e85523c3ad39610abc590f32e097d6df514e8a2646ee985a Dec 05 12:09:14 crc kubenswrapper[4763]: I1205 12:09:14.938210 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc","Type":"ContainerStarted","Data":"affa3a6edfeb78ee66b2218a7183c68c42d5b1779813fd8e7775533eb6cb891e"} Dec 05 12:09:14 crc kubenswrapper[4763]: I1205 12:09:14.940889 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-wcz88" 
event={"ID":"4046982d-ad27-468f-897a-167692d9ae49","Type":"ContainerStarted","Data":"ed304d3164302c91cfe0a3ead8a3e42a7ed78a473c843ff56127a61d35194f9d"} Dec 05 12:09:14 crc kubenswrapper[4763]: I1205 12:09:14.944251 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d754-account-create-update-n6glx" event={"ID":"5cdb2955-6d03-4169-8765-22f61729881f","Type":"ContainerStarted","Data":"96ded869b827b21f0d156220c2684fb377363684aa42c30ccd5a432c948389f9"} Dec 05 12:09:14 crc kubenswrapper[4763]: I1205 12:09:14.944293 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d754-account-create-update-n6glx" event={"ID":"5cdb2955-6d03-4169-8765-22f61729881f","Type":"ContainerStarted","Data":"87063959f729e8a81d0400da22d095096059f4e1ef98bb25fe6f089b351771b5"} Dec 05 12:09:14 crc kubenswrapper[4763]: I1205 12:09:14.945841 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-d8c46" event={"ID":"7464b1d7-23f8-4450-a41e-1208f89c1fe4","Type":"ContainerStarted","Data":"f5e2a39b790d557b3c61a5323578081385fbc53fa737de5f80d7c116ec51a30f"} Dec 05 12:09:14 crc kubenswrapper[4763]: I1205 12:09:14.946593 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-fdrsd" event={"ID":"e051d182-dc55-4454-95aa-558c2e183c88","Type":"ContainerStarted","Data":"6500bbe8488d2af6e85523c3ad39610abc590f32e097d6df514e8a2646ee985a"} Dec 05 12:09:14 crc kubenswrapper[4763]: I1205 12:09:14.949149 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c","Type":"ContainerStarted","Data":"acdb9230fc857b4f52a8b711ca9bce058b034280f3c2ebc20d3afd5cf6242f86"} Dec 05 12:09:14 crc kubenswrapper[4763]: I1205 12:09:14.949696 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 05 12:09:14 crc kubenswrapper[4763]: I1205 12:09:14.964483 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=21.584476417 podStartE2EDuration="1m20.964466528s" podCreationTimestamp="2025-12-05 12:07:54 +0000 UTC" firstStartedPulling="2025-12-05 12:08:14.780120142 +0000 UTC m=+1179.272834865" lastFinishedPulling="2025-12-05 12:09:14.160110233 +0000 UTC m=+1238.652824976" observedRunningTime="2025-12-05 12:09:14.960439934 +0000 UTC m=+1239.453154657" watchObservedRunningTime="2025-12-05 12:09:14.964466528 +0000 UTC m=+1239.457181251" Dec 05 12:09:14 crc kubenswrapper[4763]: I1205 12:09:14.970889 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7548-account-create-update-2cptw" event={"ID":"5fec4184-203d-48c4-bf8a-39529d6d08ce","Type":"ContainerStarted","Data":"943b7d724a029f1f0ed741afb01bb985b8a4593ee458c691f4b34608a7f410ed"} Dec 05 12:09:14 crc kubenswrapper[4763]: I1205 12:09:14.975914 4763 generic.go:334] "Generic (PLEG): container finished" podID="d64503e5-6fd1-49fa-b025-7c00f8b245c3" containerID="ba1988e2a85e403e1c48f58e086de951a0e1e20cecfbd24326a7eb4ccb0a4a9c" exitCode=0 Dec 05 12:09:14 crc kubenswrapper[4763]: I1205 12:09:14.975987 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-9c79-account-create-update-n7z54" event={"ID":"d64503e5-6fd1-49fa-b025-7c00f8b245c3","Type":"ContainerDied","Data":"ba1988e2a85e403e1c48f58e086de951a0e1e20cecfbd24326a7eb4ccb0a4a9c"} Dec 05 12:09:14 crc kubenswrapper[4763]: I1205 12:09:14.976014 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-9c79-account-create-update-n7z54" event={"ID":"d64503e5-6fd1-49fa-b025-7c00f8b245c3","Type":"ContainerStarted","Data":"4ee1c45e6cb8e5cb23619421051dc0651c3fe1648365c5745035aaa41befe135"} Dec 05 12:09:14 crc kubenswrapper[4763]: I1205 12:09:14.979790 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-m5zs5" event={"ID":"74c2f44d-7371-42a1-b73b-2e68ba45adf4","Type":"ContainerStarted","Data":"a3e9f709cbd920a47d757c5a4440af53da3bb50269b92d536c578329fb7a8ee1"} Dec 05 12:09:14 crc kubenswrapper[4763]: I1205 12:09:14.979889 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-m5zs5" event={"ID":"74c2f44d-7371-42a1-b73b-2e68ba45adf4","Type":"ContainerStarted","Data":"6a5aaf476218e735259032b0c3ed95d649218c7b85de12845cddbee4a1f41f9c"} Dec 05 12:09:14 crc kubenswrapper[4763]: I1205 12:09:14.982310 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-s2dcm" event={"ID":"a7b68be3-b684-41b3-9cb0-6ae8f6f998f3","Type":"ContainerStarted","Data":"f289628ba70dec2b7550756b7acf56cedc1caced9552e8da4e7759a2bf2ab5cf"} Dec 05 12:09:14 crc kubenswrapper[4763]: I1205 12:09:14.982361 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-s2dcm" event={"ID":"a7b68be3-b684-41b3-9cb0-6ae8f6f998f3","Type":"ContainerStarted","Data":"5726da327c438eb966e5a0b8d21a5e6a825b7213d8646ad2985c0f301737ca72"} Dec 05 12:09:14 crc kubenswrapper[4763]: I1205 12:09:14.988530 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-pzxb5" event={"ID":"a38e41f6-6247-4c91-abba-0bc65d1c2127","Type":"ContainerStarted","Data":"f7fc18bb75418f544d3c156a9b5dd0f680b5ad05b98bf5d0340bd13e3a9e1003"} Dec 05 12:09:14 crc kubenswrapper[4763]: E1205 12:09:14.989554 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-vh57r" podUID="0af45fe0-0c3c-4394-82fb-334e1f6e7cb1" Dec 05 12:09:15 crc kubenswrapper[4763]: I1205 12:09:15.005608 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.823783145 podStartE2EDuration="39.00559066s" podCreationTimestamp="2025-12-05 12:08:36 +0000 UTC" firstStartedPulling="2025-12-05 12:08:37.983672804 +0000 UTC m=+1202.476387517" lastFinishedPulling="2025-12-05 12:09:14.165480309 +0000 UTC m=+1238.658195032" observedRunningTime="2025-12-05 12:09:14.987798836 +0000 UTC m=+1239.480513559" watchObservedRunningTime="2025-12-05 12:09:15.00559066 +0000 UTC m=+1239.498305383" Dec 05 12:09:15 crc kubenswrapper[4763]: I1205 12:09:15.015802 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-d754-account-create-update-n6glx" podStartSLOduration=6.015778562 podStartE2EDuration="6.015778562s" podCreationTimestamp="2025-12-05 12:09:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:09:15.009590114 +0000 UTC m=+1239.502304857" watchObservedRunningTime="2025-12-05 12:09:15.015778562 +0000 UTC m=+1239.508493295" Dec 05 12:09:15 crc kubenswrapper[4763]: I1205 12:09:15.050251 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-m5zs5" podStartSLOduration=6.050236437 
podStartE2EDuration="6.050236437s" podCreationTimestamp="2025-12-05 12:09:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:09:15.045296207 +0000 UTC m=+1239.538010930" watchObservedRunningTime="2025-12-05 12:09:15.050236437 +0000 UTC m=+1239.542951160" Dec 05 12:09:15 crc kubenswrapper[4763]: I1205 12:09:15.060263 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-s2dcm" podStartSLOduration=5.060244882 podStartE2EDuration="5.060244882s" podCreationTimestamp="2025-12-05 12:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:09:15.059350288 +0000 UTC m=+1239.552065011" watchObservedRunningTime="2025-12-05 12:09:15.060244882 +0000 UTC m=+1239.552959615" Dec 05 12:09:15 crc kubenswrapper[4763]: I1205 12:09:15.103804 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-pzxb5" podStartSLOduration=2.466369668 podStartE2EDuration="39.103784907s" podCreationTimestamp="2025-12-05 12:08:36 +0000 UTC" firstStartedPulling="2025-12-05 12:08:37.552569473 +0000 UTC m=+1202.045284196" lastFinishedPulling="2025-12-05 12:09:14.189984712 +0000 UTC m=+1238.682699435" observedRunningTime="2025-12-05 12:09:15.08750116 +0000 UTC m=+1239.580215883" watchObservedRunningTime="2025-12-05 12:09:15.103784907 +0000 UTC m=+1239.596499630" Dec 05 12:09:16 crc kubenswrapper[4763]: I1205 12:09:16.000031 4763 generic.go:334] "Generic (PLEG): container finished" podID="e051d182-dc55-4454-95aa-558c2e183c88" containerID="a89ad19746161315407ad87fb235afc0f76c7ea32f73853346ad2dd4d21985ae" exitCode=0 Dec 05 12:09:16 crc kubenswrapper[4763]: I1205 12:09:16.000115 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-fdrsd" event={"ID":"e051d182-dc55-4454-95aa-558c2e183c88","Type":"ContainerDied","Data":"a89ad19746161315407ad87fb235afc0f76c7ea32f73853346ad2dd4d21985ae"} Dec 05 12:09:16 crc kubenswrapper[4763]: I1205 12:09:16.004040 4763 generic.go:334] "Generic (PLEG): container finished" podID="5fec4184-203d-48c4-bf8a-39529d6d08ce" containerID="27e555347270a2b412989123148265e3d228bff992c120098ef9c73377f0e826" exitCode=0 Dec 05 12:09:16 crc kubenswrapper[4763]: I1205 12:09:16.004139 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7548-account-create-update-2cptw" event={"ID":"5fec4184-203d-48c4-bf8a-39529d6d08ce","Type":"ContainerDied","Data":"27e555347270a2b412989123148265e3d228bff992c120098ef9c73377f0e826"} Dec 05 12:09:16 crc kubenswrapper[4763]: I1205 12:09:16.006238 4763 generic.go:334] "Generic (PLEG): container finished" podID="5cdb2955-6d03-4169-8765-22f61729881f" containerID="96ded869b827b21f0d156220c2684fb377363684aa42c30ccd5a432c948389f9" exitCode=0 Dec 05 12:09:16 crc kubenswrapper[4763]: I1205 12:09:16.006289 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d754-account-create-update-n6glx" event={"ID":"5cdb2955-6d03-4169-8765-22f61729881f","Type":"ContainerDied","Data":"96ded869b827b21f0d156220c2684fb377363684aa42c30ccd5a432c948389f9"} Dec 05 12:09:16 crc kubenswrapper[4763]: I1205 12:09:16.014791 4763 generic.go:334] "Generic (PLEG): container finished" podID="74c2f44d-7371-42a1-b73b-2e68ba45adf4" containerID="a3e9f709cbd920a47d757c5a4440af53da3bb50269b92d536c578329fb7a8ee1" exitCode=0 Dec 05 12:09:16 crc 
kubenswrapper[4763]: I1205 12:09:16.014850 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-m5zs5" event={"ID":"74c2f44d-7371-42a1-b73b-2e68ba45adf4","Type":"ContainerDied","Data":"a3e9f709cbd920a47d757c5a4440af53da3bb50269b92d536c578329fb7a8ee1"} Dec 05 12:09:16 crc kubenswrapper[4763]: I1205 12:09:16.016424 4763 generic.go:334] "Generic (PLEG): container finished" podID="a7b68be3-b684-41b3-9cb0-6ae8f6f998f3" containerID="f289628ba70dec2b7550756b7acf56cedc1caced9552e8da4e7759a2bf2ab5cf" exitCode=0 Dec 05 12:09:16 crc kubenswrapper[4763]: I1205 12:09:16.016521 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-s2dcm" event={"ID":"a7b68be3-b684-41b3-9cb0-6ae8f6f998f3","Type":"ContainerDied","Data":"f289628ba70dec2b7550756b7acf56cedc1caced9552e8da4e7759a2bf2ab5cf"} Dec 05 12:09:16 crc kubenswrapper[4763]: I1205 12:09:16.204941 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:16 crc kubenswrapper[4763]: I1205 12:09:16.369799 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-9c79-account-create-update-n7z54" Dec 05 12:09:16 crc kubenswrapper[4763]: I1205 12:09:16.491782 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b88h2\" (UniqueName: \"kubernetes.io/projected/d64503e5-6fd1-49fa-b025-7c00f8b245c3-kube-api-access-b88h2\") pod \"d64503e5-6fd1-49fa-b025-7c00f8b245c3\" (UID: \"d64503e5-6fd1-49fa-b025-7c00f8b245c3\") " Dec 05 12:09:16 crc kubenswrapper[4763]: I1205 12:09:16.491990 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d64503e5-6fd1-49fa-b025-7c00f8b245c3-operator-scripts\") pod \"d64503e5-6fd1-49fa-b025-7c00f8b245c3\" (UID: \"d64503e5-6fd1-49fa-b025-7c00f8b245c3\") " Dec 05 12:09:16 crc kubenswrapper[4763]: I1205 12:09:16.492622 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d64503e5-6fd1-49fa-b025-7c00f8b245c3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d64503e5-6fd1-49fa-b025-7c00f8b245c3" (UID: "d64503e5-6fd1-49fa-b025-7c00f8b245c3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:09:16 crc kubenswrapper[4763]: I1205 12:09:16.498406 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d64503e5-6fd1-49fa-b025-7c00f8b245c3-kube-api-access-b88h2" (OuterVolumeSpecName: "kube-api-access-b88h2") pod "d64503e5-6fd1-49fa-b025-7c00f8b245c3" (UID: "d64503e5-6fd1-49fa-b025-7c00f8b245c3"). InnerVolumeSpecName "kube-api-access-b88h2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:09:16 crc kubenswrapper[4763]: I1205 12:09:16.594446 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d64503e5-6fd1-49fa-b025-7c00f8b245c3-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:16 crc kubenswrapper[4763]: I1205 12:09:16.594495 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b88h2\" (UniqueName: \"kubernetes.io/projected/d64503e5-6fd1-49fa-b025-7c00f8b245c3-kube-api-access-b88h2\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:17 crc kubenswrapper[4763]: I1205 12:09:17.025963 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-9c79-account-create-update-n7z54" Dec 05 12:09:17 crc kubenswrapper[4763]: I1205 12:09:17.025961 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-9c79-account-create-update-n7z54" event={"ID":"d64503e5-6fd1-49fa-b025-7c00f8b245c3","Type":"ContainerDied","Data":"4ee1c45e6cb8e5cb23619421051dc0651c3fe1648365c5745035aaa41befe135"} Dec 05 12:09:17 crc kubenswrapper[4763]: I1205 12:09:17.026012 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ee1c45e6cb8e5cb23619421051dc0651c3fe1648365c5745035aaa41befe135" Dec 05 12:09:20 crc kubenswrapper[4763]: I1205 12:09:20.493223 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7548-account-create-update-2cptw" Dec 05 12:09:20 crc kubenswrapper[4763]: I1205 12:09:20.503378 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-fdrsd" Dec 05 12:09:20 crc kubenswrapper[4763]: I1205 12:09:20.518623 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-m5zs5" Dec 05 12:09:20 crc kubenswrapper[4763]: I1205 12:09:20.532679 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-d754-account-create-update-n6glx" Dec 05 12:09:20 crc kubenswrapper[4763]: I1205 12:09:20.571358 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcrk9\" (UniqueName: \"kubernetes.io/projected/5fec4184-203d-48c4-bf8a-39529d6d08ce-kube-api-access-jcrk9\") pod \"5fec4184-203d-48c4-bf8a-39529d6d08ce\" (UID: \"5fec4184-203d-48c4-bf8a-39529d6d08ce\") " Dec 05 12:09:20 crc kubenswrapper[4763]: I1205 12:09:20.571407 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e051d182-dc55-4454-95aa-558c2e183c88-operator-scripts\") pod \"e051d182-dc55-4454-95aa-558c2e183c88\" (UID: \"e051d182-dc55-4454-95aa-558c2e183c88\") " Dec 05 12:09:20 crc kubenswrapper[4763]: I1205 12:09:20.571460 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndj6r\" (UniqueName: \"kubernetes.io/projected/74c2f44d-7371-42a1-b73b-2e68ba45adf4-kube-api-access-ndj6r\") pod \"74c2f44d-7371-42a1-b73b-2e68ba45adf4\" (UID: \"74c2f44d-7371-42a1-b73b-2e68ba45adf4\") " Dec 05 12:09:20 crc kubenswrapper[4763]: I1205 12:09:20.571502 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fec4184-203d-48c4-bf8a-39529d6d08ce-operator-scripts\") pod \"5fec4184-203d-48c4-bf8a-39529d6d08ce\" (UID: \"5fec4184-203d-48c4-bf8a-39529d6d08ce\") " Dec 05 12:09:20 crc kubenswrapper[4763]: I1205 12:09:20.571610 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tpkg\" (UniqueName: \"kubernetes.io/projected/e051d182-dc55-4454-95aa-558c2e183c88-kube-api-access-8tpkg\") pod \"e051d182-dc55-4454-95aa-558c2e183c88\" (UID: \"e051d182-dc55-4454-95aa-558c2e183c88\") " Dec 05 12:09:20 crc kubenswrapper[4763]: I1205 12:09:20.571656 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74c2f44d-7371-42a1-b73b-2e68ba45adf4-operator-scripts\") pod \"74c2f44d-7371-42a1-b73b-2e68ba45adf4\" (UID: 
\"74c2f44d-7371-42a1-b73b-2e68ba45adf4\") " Dec 05 12:09:20 crc kubenswrapper[4763]: I1205 12:09:20.573022 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74c2f44d-7371-42a1-b73b-2e68ba45adf4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "74c2f44d-7371-42a1-b73b-2e68ba45adf4" (UID: "74c2f44d-7371-42a1-b73b-2e68ba45adf4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:09:20 crc kubenswrapper[4763]: I1205 12:09:20.573250 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fec4184-203d-48c4-bf8a-39529d6d08ce-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5fec4184-203d-48c4-bf8a-39529d6d08ce" (UID: "5fec4184-203d-48c4-bf8a-39529d6d08ce"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:09:20 crc kubenswrapper[4763]: I1205 12:09:20.573584 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e051d182-dc55-4454-95aa-558c2e183c88-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e051d182-dc55-4454-95aa-558c2e183c88" (UID: "e051d182-dc55-4454-95aa-558c2e183c88"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:09:20 crc kubenswrapper[4763]: I1205 12:09:20.584796 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74c2f44d-7371-42a1-b73b-2e68ba45adf4-kube-api-access-ndj6r" (OuterVolumeSpecName: "kube-api-access-ndj6r") pod "74c2f44d-7371-42a1-b73b-2e68ba45adf4" (UID: "74c2f44d-7371-42a1-b73b-2e68ba45adf4"). InnerVolumeSpecName "kube-api-access-ndj6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:09:20 crc kubenswrapper[4763]: I1205 12:09:20.585289 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e051d182-dc55-4454-95aa-558c2e183c88-kube-api-access-8tpkg" (OuterVolumeSpecName: "kube-api-access-8tpkg") pod "e051d182-dc55-4454-95aa-558c2e183c88" (UID: "e051d182-dc55-4454-95aa-558c2e183c88"). InnerVolumeSpecName "kube-api-access-8tpkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:09:20 crc kubenswrapper[4763]: I1205 12:09:20.593676 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fec4184-203d-48c4-bf8a-39529d6d08ce-kube-api-access-jcrk9" (OuterVolumeSpecName: "kube-api-access-jcrk9") pod "5fec4184-203d-48c4-bf8a-39529d6d08ce" (UID: "5fec4184-203d-48c4-bf8a-39529d6d08ce"). InnerVolumeSpecName "kube-api-access-jcrk9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:09:20 crc kubenswrapper[4763]: I1205 12:09:20.673462 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5cdb2955-6d03-4169-8765-22f61729881f-operator-scripts\") pod \"5cdb2955-6d03-4169-8765-22f61729881f\" (UID: \"5cdb2955-6d03-4169-8765-22f61729881f\") " Dec 05 12:09:20 crc kubenswrapper[4763]: I1205 12:09:20.673545 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7mpv\" (UniqueName: \"kubernetes.io/projected/5cdb2955-6d03-4169-8765-22f61729881f-kube-api-access-v7mpv\") pod \"5cdb2955-6d03-4169-8765-22f61729881f\" (UID: \"5cdb2955-6d03-4169-8765-22f61729881f\") " Dec 05 12:09:20 crc kubenswrapper[4763]: I1205 12:09:20.674074 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tpkg\" (UniqueName: \"kubernetes.io/projected/e051d182-dc55-4454-95aa-558c2e183c88-kube-api-access-8tpkg\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:20 crc kubenswrapper[4763]: I1205 12:09:20.674098 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74c2f44d-7371-42a1-b73b-2e68ba45adf4-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:20 crc kubenswrapper[4763]: I1205 12:09:20.674111 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcrk9\" (UniqueName: \"kubernetes.io/projected/5fec4184-203d-48c4-bf8a-39529d6d08ce-kube-api-access-jcrk9\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:20 crc kubenswrapper[4763]: I1205 12:09:20.674122 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e051d182-dc55-4454-95aa-558c2e183c88-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:20 crc kubenswrapper[4763]: I1205 12:09:20.674136 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndj6r\" (UniqueName: \"kubernetes.io/projected/74c2f44d-7371-42a1-b73b-2e68ba45adf4-kube-api-access-ndj6r\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:20 crc kubenswrapper[4763]: I1205 12:09:20.674147 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fec4184-203d-48c4-bf8a-39529d6d08ce-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:20 crc kubenswrapper[4763]: I1205 12:09:20.675187 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cdb2955-6d03-4169-8765-22f61729881f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5cdb2955-6d03-4169-8765-22f61729881f" (UID: "5cdb2955-6d03-4169-8765-22f61729881f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:09:20 crc kubenswrapper[4763]: I1205 12:09:20.678038 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cdb2955-6d03-4169-8765-22f61729881f-kube-api-access-v7mpv" (OuterVolumeSpecName: "kube-api-access-v7mpv") pod "5cdb2955-6d03-4169-8765-22f61729881f" (UID: "5cdb2955-6d03-4169-8765-22f61729881f"). InnerVolumeSpecName "kube-api-access-v7mpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:09:20 crc kubenswrapper[4763]: I1205 12:09:20.776703 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5cdb2955-6d03-4169-8765-22f61729881f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:20 crc kubenswrapper[4763]: I1205 12:09:20.776741 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7mpv\" (UniqueName: \"kubernetes.io/projected/5cdb2955-6d03-4169-8765-22f61729881f-kube-api-access-v7mpv\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:21 crc kubenswrapper[4763]: I1205 12:09:21.077365 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-fdrsd" event={"ID":"e051d182-dc55-4454-95aa-558c2e183c88","Type":"ContainerDied","Data":"6500bbe8488d2af6e85523c3ad39610abc590f32e097d6df514e8a2646ee985a"} Dec 05 12:09:21 crc kubenswrapper[4763]: I1205 12:09:21.077806 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6500bbe8488d2af6e85523c3ad39610abc590f32e097d6df514e8a2646ee985a" Dec 05 12:09:21 crc kubenswrapper[4763]: I1205 12:09:21.077375 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-fdrsd" Dec 05 12:09:21 crc kubenswrapper[4763]: I1205 12:09:21.080571 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7548-account-create-update-2cptw" Dec 05 12:09:21 crc kubenswrapper[4763]: I1205 12:09:21.080670 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7548-account-create-update-2cptw" event={"ID":"5fec4184-203d-48c4-bf8a-39529d6d08ce","Type":"ContainerDied","Data":"943b7d724a029f1f0ed741afb01bb985b8a4593ee458c691f4b34608a7f410ed"} Dec 05 12:09:21 crc kubenswrapper[4763]: I1205 12:09:21.080695 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="943b7d724a029f1f0ed741afb01bb985b8a4593ee458c691f4b34608a7f410ed" Dec 05 12:09:21 crc kubenswrapper[4763]: I1205 12:09:21.082963 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d754-account-create-update-n6glx" event={"ID":"5cdb2955-6d03-4169-8765-22f61729881f","Type":"ContainerDied","Data":"87063959f729e8a81d0400da22d095096059f4e1ef98bb25fe6f089b351771b5"} Dec 05 12:09:21 crc kubenswrapper[4763]: I1205 12:09:21.083002 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87063959f729e8a81d0400da22d095096059f4e1ef98bb25fe6f089b351771b5" Dec 05 12:09:21 crc kubenswrapper[4763]: I1205 12:09:21.083048 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-d754-account-create-update-n6glx" Dec 05 12:09:21 crc kubenswrapper[4763]: I1205 12:09:21.086357 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-m5zs5" event={"ID":"74c2f44d-7371-42a1-b73b-2e68ba45adf4","Type":"ContainerDied","Data":"6a5aaf476218e735259032b0c3ed95d649218c7b85de12845cddbee4a1f41f9c"} Dec 05 12:09:21 crc kubenswrapper[4763]: I1205 12:09:21.086380 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a5aaf476218e735259032b0c3ed95d649218c7b85de12845cddbee4a1f41f9c" Dec 05 12:09:21 crc kubenswrapper[4763]: I1205 12:09:21.086414 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-m5zs5" Dec 05 12:09:25 crc kubenswrapper[4763]: I1205 12:09:25.028373 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-s2dcm" Dec 05 12:09:25 crc kubenswrapper[4763]: I1205 12:09:25.121984 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-s2dcm" event={"ID":"a7b68be3-b684-41b3-9cb0-6ae8f6f998f3","Type":"ContainerDied","Data":"5726da327c438eb966e5a0b8d21a5e6a825b7213d8646ad2985c0f301737ca72"} Dec 05 12:09:25 crc kubenswrapper[4763]: I1205 12:09:25.122022 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5726da327c438eb966e5a0b8d21a5e6a825b7213d8646ad2985c0f301737ca72" Dec 05 12:09:25 crc kubenswrapper[4763]: I1205 12:09:25.122000 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-s2dcm" Dec 05 12:09:25 crc kubenswrapper[4763]: I1205 12:09:25.123575 4763 generic.go:334] "Generic (PLEG): container finished" podID="a38e41f6-6247-4c91-abba-0bc65d1c2127" containerID="f7fc18bb75418f544d3c156a9b5dd0f680b5ad05b98bf5d0340bd13e3a9e1003" exitCode=0 Dec 05 12:09:25 crc kubenswrapper[4763]: I1205 12:09:25.123616 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-pzxb5" event={"ID":"a38e41f6-6247-4c91-abba-0bc65d1c2127","Type":"ContainerDied","Data":"f7fc18bb75418f544d3c156a9b5dd0f680b5ad05b98bf5d0340bd13e3a9e1003"} Dec 05 12:09:25 crc kubenswrapper[4763]: I1205 12:09:25.156835 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7b68be3-b684-41b3-9cb0-6ae8f6f998f3-operator-scripts\") pod \"a7b68be3-b684-41b3-9cb0-6ae8f6f998f3\" (UID: \"a7b68be3-b684-41b3-9cb0-6ae8f6f998f3\") " Dec 05 12:09:25 crc kubenswrapper[4763]: I1205 12:09:25.156949 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xv58h\" (UniqueName: \"kubernetes.io/projected/a7b68be3-b684-41b3-9cb0-6ae8f6f998f3-kube-api-access-xv58h\") pod \"a7b68be3-b684-41b3-9cb0-6ae8f6f998f3\" (UID: \"a7b68be3-b684-41b3-9cb0-6ae8f6f998f3\") " Dec 05 12:09:25 crc kubenswrapper[4763]: I1205 12:09:25.157820 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7b68be3-b684-41b3-9cb0-6ae8f6f998f3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a7b68be3-b684-41b3-9cb0-6ae8f6f998f3" (UID: "a7b68be3-b684-41b3-9cb0-6ae8f6f998f3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:09:25 crc kubenswrapper[4763]: I1205 12:09:25.162788 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7b68be3-b684-41b3-9cb0-6ae8f6f998f3-kube-api-access-xv58h" (OuterVolumeSpecName: "kube-api-access-xv58h") pod "a7b68be3-b684-41b3-9cb0-6ae8f6f998f3" (UID: "a7b68be3-b684-41b3-9cb0-6ae8f6f998f3"). InnerVolumeSpecName "kube-api-access-xv58h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:09:25 crc kubenswrapper[4763]: I1205 12:09:25.259065 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xv58h\" (UniqueName: \"kubernetes.io/projected/a7b68be3-b684-41b3-9cb0-6ae8f6f998f3-kube-api-access-xv58h\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:25 crc kubenswrapper[4763]: I1205 12:09:25.259108 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7b68be3-b684-41b3-9cb0-6ae8f6f998f3-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:26 crc kubenswrapper[4763]: I1205 12:09:26.204647 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:26 crc kubenswrapper[4763]: I1205 12:09:26.209341 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:27 crc kubenswrapper[4763]: I1205 12:09:27.138614 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:27 crc kubenswrapper[4763]: I1205 12:09:27.309186 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 05 12:09:27 crc kubenswrapper[4763]: I1205 12:09:27.336559 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-pzxb5" Dec 05 12:09:27 crc kubenswrapper[4763]: I1205 12:09:27.412736 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8fzs\" (UniqueName: \"kubernetes.io/projected/a38e41f6-6247-4c91-abba-0bc65d1c2127-kube-api-access-n8fzs\") pod \"a38e41f6-6247-4c91-abba-0bc65d1c2127\" (UID: \"a38e41f6-6247-4c91-abba-0bc65d1c2127\") " Dec 05 12:09:27 crc kubenswrapper[4763]: I1205 12:09:27.412872 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a38e41f6-6247-4c91-abba-0bc65d1c2127-etc-swift\") pod \"a38e41f6-6247-4c91-abba-0bc65d1c2127\" (UID: \"a38e41f6-6247-4c91-abba-0bc65d1c2127\") " Dec 05 12:09:27 crc kubenswrapper[4763]: I1205 12:09:27.412920 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a38e41f6-6247-4c91-abba-0bc65d1c2127-combined-ca-bundle\") pod \"a38e41f6-6247-4c91-abba-0bc65d1c2127\" (UID: \"a38e41f6-6247-4c91-abba-0bc65d1c2127\") " Dec 05 12:09:27 crc kubenswrapper[4763]: I1205 12:09:27.412961 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a38e41f6-6247-4c91-abba-0bc65d1c2127-ring-data-devices\") pod \"a38e41f6-6247-4c91-abba-0bc65d1c2127\" (UID: \"a38e41f6-6247-4c91-abba-0bc65d1c2127\") " Dec 05 12:09:27 crc kubenswrapper[4763]: I1205 12:09:27.412998 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a38e41f6-6247-4c91-abba-0bc65d1c2127-swiftconf\") pod \"a38e41f6-6247-4c91-abba-0bc65d1c2127\" (UID: \"a38e41f6-6247-4c91-abba-0bc65d1c2127\") " Dec 05 12:09:27 crc kubenswrapper[4763]: I1205 12:09:27.413083 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a38e41f6-6247-4c91-abba-0bc65d1c2127-dispersionconf\") pod 
\"a38e41f6-6247-4c91-abba-0bc65d1c2127\" (UID: \"a38e41f6-6247-4c91-abba-0bc65d1c2127\") " Dec 05 12:09:27 crc kubenswrapper[4763]: I1205 12:09:27.413149 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a38e41f6-6247-4c91-abba-0bc65d1c2127-scripts\") pod \"a38e41f6-6247-4c91-abba-0bc65d1c2127\" (UID: \"a38e41f6-6247-4c91-abba-0bc65d1c2127\") " Dec 05 12:09:27 crc kubenswrapper[4763]: I1205 12:09:27.413468 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a38e41f6-6247-4c91-abba-0bc65d1c2127-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "a38e41f6-6247-4c91-abba-0bc65d1c2127" (UID: "a38e41f6-6247-4c91-abba-0bc65d1c2127"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:09:27 crc kubenswrapper[4763]: I1205 12:09:27.413663 4763 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a38e41f6-6247-4c91-abba-0bc65d1c2127-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:27 crc kubenswrapper[4763]: I1205 12:09:27.414098 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a38e41f6-6247-4c91-abba-0bc65d1c2127-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a38e41f6-6247-4c91-abba-0bc65d1c2127" (UID: "a38e41f6-6247-4c91-abba-0bc65d1c2127"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:09:27 crc kubenswrapper[4763]: I1205 12:09:27.418678 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a38e41f6-6247-4c91-abba-0bc65d1c2127-kube-api-access-n8fzs" (OuterVolumeSpecName: "kube-api-access-n8fzs") pod "a38e41f6-6247-4c91-abba-0bc65d1c2127" (UID: "a38e41f6-6247-4c91-abba-0bc65d1c2127"). InnerVolumeSpecName "kube-api-access-n8fzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:09:27 crc kubenswrapper[4763]: I1205 12:09:27.424727 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a38e41f6-6247-4c91-abba-0bc65d1c2127-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "a38e41f6-6247-4c91-abba-0bc65d1c2127" (UID: "a38e41f6-6247-4c91-abba-0bc65d1c2127"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:09:27 crc kubenswrapper[4763]: I1205 12:09:27.437640 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a38e41f6-6247-4c91-abba-0bc65d1c2127-scripts" (OuterVolumeSpecName: "scripts") pod "a38e41f6-6247-4c91-abba-0bc65d1c2127" (UID: "a38e41f6-6247-4c91-abba-0bc65d1c2127"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:09:27 crc kubenswrapper[4763]: I1205 12:09:27.442087 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a38e41f6-6247-4c91-abba-0bc65d1c2127-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a38e41f6-6247-4c91-abba-0bc65d1c2127" (UID: "a38e41f6-6247-4c91-abba-0bc65d1c2127"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:09:27 crc kubenswrapper[4763]: I1205 12:09:27.444737 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a38e41f6-6247-4c91-abba-0bc65d1c2127-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "a38e41f6-6247-4c91-abba-0bc65d1c2127" (UID: "a38e41f6-6247-4c91-abba-0bc65d1c2127"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:09:27 crc kubenswrapper[4763]: I1205 12:09:27.514874 4763 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a38e41f6-6247-4c91-abba-0bc65d1c2127-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:27 crc kubenswrapper[4763]: I1205 12:09:27.514919 4763 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a38e41f6-6247-4c91-abba-0bc65d1c2127-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:27 crc kubenswrapper[4763]: I1205 12:09:27.514930 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a38e41f6-6247-4c91-abba-0bc65d1c2127-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:27 crc kubenswrapper[4763]: I1205 12:09:27.514939 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8fzs\" (UniqueName: \"kubernetes.io/projected/a38e41f6-6247-4c91-abba-0bc65d1c2127-kube-api-access-n8fzs\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:27 crc kubenswrapper[4763]: I1205 12:09:27.514949 4763 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a38e41f6-6247-4c91-abba-0bc65d1c2127-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:27 crc kubenswrapper[4763]: I1205 12:09:27.514957 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a38e41f6-6247-4c91-abba-0bc65d1c2127-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:28 crc kubenswrapper[4763]: I1205 12:09:28.151938 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-pzxb5" event={"ID":"a38e41f6-6247-4c91-abba-0bc65d1c2127","Type":"ContainerDied","Data":"e638ef0763633cea8fea65cd7e393b2e6f4be9cc45a1510f55079f9db18fb1a5"} Dec 05 12:09:28 crc kubenswrapper[4763]: I1205 12:09:28.151990 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e638ef0763633cea8fea65cd7e393b2e6f4be9cc45a1510f55079f9db18fb1a5" Dec 05 12:09:28 crc kubenswrapper[4763]: I1205 12:09:28.151991 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-pzxb5" Dec 05 12:09:29 crc kubenswrapper[4763]: I1205 12:09:29.320616 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 05 12:09:29 crc kubenswrapper[4763]: I1205 12:09:29.321196 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="9d77e8a1-26ef-4525-b427-0a29a9b7a0fc" containerName="config-reloader" containerID="cri-o://4ca686e20e5eb1bf421130ea665b4cf9f3e9ae722261772a65c247dd39e26c24" gracePeriod=600 Dec 05 12:09:29 crc kubenswrapper[4763]: I1205 12:09:29.321352 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="9d77e8a1-26ef-4525-b427-0a29a9b7a0fc" containerName="prometheus" containerID="cri-o://affa3a6edfeb78ee66b2218a7183c68c42d5b1779813fd8e7775533eb6cb891e" gracePeriod=600 Dec 05 12:09:29 crc kubenswrapper[4763]: I1205 12:09:29.321434 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="9d77e8a1-26ef-4525-b427-0a29a9b7a0fc" containerName="thanos-sidecar" containerID="cri-o://eb621e9277be1610b485a0015c8f75be9dd554027d30b32d46abff3571cc4c29" gracePeriod=600 Dec 05 12:09:30 crc kubenswrapper[4763]: I1205 12:09:30.172659 4763 generic.go:334] "Generic (PLEG): container finished" podID="9d77e8a1-26ef-4525-b427-0a29a9b7a0fc" containerID="eb621e9277be1610b485a0015c8f75be9dd554027d30b32d46abff3571cc4c29" exitCode=0 Dec 05 12:09:30 crc kubenswrapper[4763]: I1205 12:09:30.172740 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc","Type":"ContainerDied","Data":"eb621e9277be1610b485a0015c8f75be9dd554027d30b32d46abff3571cc4c29"} Dec 05 12:09:31 crc kubenswrapper[4763]: I1205 12:09:31.184591 4763 generic.go:334] "Generic (PLEG): container finished" podID="9d77e8a1-26ef-4525-b427-0a29a9b7a0fc" containerID="affa3a6edfeb78ee66b2218a7183c68c42d5b1779813fd8e7775533eb6cb891e" exitCode=0 Dec 05 12:09:31 crc kubenswrapper[4763]: I1205 12:09:31.184624 4763 generic.go:334] "Generic (PLEG): container finished" podID="9d77e8a1-26ef-4525-b427-0a29a9b7a0fc" containerID="4ca686e20e5eb1bf421130ea665b4cf9f3e9ae722261772a65c247dd39e26c24" exitCode=0 Dec 05 12:09:31 crc kubenswrapper[4763]: I1205 12:09:31.184647 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc","Type":"ContainerDied","Data":"affa3a6edfeb78ee66b2218a7183c68c42d5b1779813fd8e7775533eb6cb891e"} Dec 05 12:09:31 crc kubenswrapper[4763]: I1205 12:09:31.184676 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc","Type":"ContainerDied","Data":"4ca686e20e5eb1bf421130ea665b4cf9f3e9ae722261772a65c247dd39e26c24"} Dec 05 12:09:31 crc kubenswrapper[4763]: I1205 12:09:31.204889 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="9d77e8a1-26ef-4525-b427-0a29a9b7a0fc" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.110:9090/-/ready\": dial tcp 10.217.0.110:9090: connect: connection refused" Dec 05 12:09:32 crc kubenswrapper[4763]: E1205 12:09:32.108818 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: 
context canceled" image="38.102.83.70:5001/podified-epoxy-centos9/openstack-watcher-api:watcher_latest" Dec 05 12:09:32 crc kubenswrapper[4763]: E1205 12:09:32.109124 4763 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.70:5001/podified-epoxy-centos9/openstack-watcher-api:watcher_latest" Dec 05 12:09:32 crc kubenswrapper[4763]: E1205 12:09:32.109246 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:watcher-db-sync,Image:38.102.83.70:5001/podified-epoxy-centos9/openstack-watcher-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/watcher/watcher.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:watcher-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5jhbq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-db-sync-wcz88_openstack(4046982d-ad27-468f-897a-167692d9ae49): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 12:09:32 crc kubenswrapper[4763]: E1205 12:09:32.110548 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/watcher-db-sync-wcz88" podUID="4046982d-ad27-468f-897a-167692d9ae49" Dec 05 12:09:32 crc kubenswrapper[4763]: E1205 12:09:32.236444 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.70:5001/podified-epoxy-centos9/openstack-watcher-api:watcher_latest\\\"\"" pod="openstack/watcher-db-sync-wcz88" 
podUID="4046982d-ad27-468f-897a-167692d9ae49" Dec 05 12:09:32 crc kubenswrapper[4763]: I1205 12:09:32.413222 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:32 crc kubenswrapper[4763]: I1205 12:09:32.531385 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ff5f488f-885a-43ee-9a04-43ba44d78789\") pod \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\" (UID: \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\") " Dec 05 12:09:32 crc kubenswrapper[4763]: I1205 12:09:32.531683 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlbjb\" (UniqueName: \"kubernetes.io/projected/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-kube-api-access-hlbjb\") pod \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\" (UID: \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\") " Dec 05 12:09:32 crc kubenswrapper[4763]: I1205 12:09:32.531834 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-config\") pod \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\" (UID: \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\") " Dec 05 12:09:32 crc kubenswrapper[4763]: I1205 12:09:32.531911 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-tls-assets\") pod \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\" (UID: \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\") " Dec 05 12:09:32 crc kubenswrapper[4763]: I1205 12:09:32.531936 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-config-out\") pod \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\" (UID: \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\") " Dec 05 12:09:32 crc kubenswrapper[4763]: I1205 12:09:32.531976 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-prometheus-metric-storage-rulefiles-0\") pod \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\" (UID: \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\") " Dec 05 12:09:32 crc kubenswrapper[4763]: I1205 12:09:32.532009 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-web-config\") pod \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\" (UID: \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\") " Dec 05 12:09:32 crc kubenswrapper[4763]: I1205 12:09:32.532033 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-thanos-prometheus-http-client-file\") pod \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\" (UID: \"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc\") " Dec 05 12:09:32 crc kubenswrapper[4763]: I1205 12:09:32.536482 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "9d77e8a1-26ef-4525-b427-0a29a9b7a0fc" (UID: "9d77e8a1-26ef-4525-b427-0a29a9b7a0fc"). 
InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:09:32 crc kubenswrapper[4763]: I1205 12:09:32.540087 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-config" (OuterVolumeSpecName: "config") pod "9d77e8a1-26ef-4525-b427-0a29a9b7a0fc" (UID: "9d77e8a1-26ef-4525-b427-0a29a9b7a0fc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:09:32 crc kubenswrapper[4763]: I1205 12:09:32.544247 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "9d77e8a1-26ef-4525-b427-0a29a9b7a0fc" (UID: "9d77e8a1-26ef-4525-b427-0a29a9b7a0fc"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:09:32 crc kubenswrapper[4763]: I1205 12:09:32.545903 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "9d77e8a1-26ef-4525-b427-0a29a9b7a0fc" (UID: "9d77e8a1-26ef-4525-b427-0a29a9b7a0fc"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:09:32 crc kubenswrapper[4763]: I1205 12:09:32.546323 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-kube-api-access-hlbjb" (OuterVolumeSpecName: "kube-api-access-hlbjb") pod "9d77e8a1-26ef-4525-b427-0a29a9b7a0fc" (UID: "9d77e8a1-26ef-4525-b427-0a29a9b7a0fc"). InnerVolumeSpecName "kube-api-access-hlbjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:09:32 crc kubenswrapper[4763]: I1205 12:09:32.560459 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ff5f488f-885a-43ee-9a04-43ba44d78789" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "9d77e8a1-26ef-4525-b427-0a29a9b7a0fc" (UID: "9d77e8a1-26ef-4525-b427-0a29a9b7a0fc"). InnerVolumeSpecName "pvc-ff5f488f-885a-43ee-9a04-43ba44d78789". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 05 12:09:32 crc kubenswrapper[4763]: I1205 12:09:32.575721 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-web-config" (OuterVolumeSpecName: "web-config") pod "9d77e8a1-26ef-4525-b427-0a29a9b7a0fc" (UID: "9d77e8a1-26ef-4525-b427-0a29a9b7a0fc"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:09:32 crc kubenswrapper[4763]: I1205 12:09:32.579627 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-config-out" (OuterVolumeSpecName: "config-out") pod "9d77e8a1-26ef-4525-b427-0a29a9b7a0fc" (UID: "9d77e8a1-26ef-4525-b427-0a29a9b7a0fc"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:09:32 crc kubenswrapper[4763]: I1205 12:09:32.633585 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-ff5f488f-885a-43ee-9a04-43ba44d78789\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ff5f488f-885a-43ee-9a04-43ba44d78789\") on node \"crc\" " Dec 05 12:09:32 crc kubenswrapper[4763]: I1205 12:09:32.633626 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlbjb\" (UniqueName: \"kubernetes.io/projected/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-kube-api-access-hlbjb\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:32 crc kubenswrapper[4763]: I1205 12:09:32.633636 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-config\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:32 crc kubenswrapper[4763]: I1205 12:09:32.633645 4763 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-tls-assets\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:32 crc kubenswrapper[4763]: I1205 12:09:32.633654 4763 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-config-out\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:32 crc kubenswrapper[4763]: I1205 12:09:32.633663 4763 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:32 crc kubenswrapper[4763]: I1205 12:09:32.633674 4763 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-web-config\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:32 crc kubenswrapper[4763]: I1205 12:09:32.633685 4763 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:32 crc kubenswrapper[4763]: I1205 12:09:32.663213 4763 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 05 12:09:32 crc kubenswrapper[4763]: I1205 12:09:32.663411 4763 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-ff5f488f-885a-43ee-9a04-43ba44d78789" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ff5f488f-885a-43ee-9a04-43ba44d78789") on node "crc"
Dec 05 12:09:32 crc kubenswrapper[4763]: I1205 12:09:32.735265 4763 reconciler_common.go:293] "Volume detached for volume \"pvc-ff5f488f-885a-43ee-9a04-43ba44d78789\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ff5f488f-885a-43ee-9a04-43ba44d78789\") on node \"crc\" DevicePath \"\""
Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.253370 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-d8c46" event={"ID":"7464b1d7-23f8-4450-a41e-1208f89c1fe4","Type":"ContainerStarted","Data":"cdaa9c5da9e74fcc5fd493a87c06dd6f8a834444310999188ed4381c90c0ae4f"}
Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.260680 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9d77e8a1-26ef-4525-b427-0a29a9b7a0fc","Type":"ContainerDied","Data":"3e1a697547d2f63fbb07348855dbb19c70d0c506b9124734a4e4b79ee1de20bd"}
Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.260715 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.260734 4763 scope.go:117] "RemoveContainer" containerID="affa3a6edfeb78ee66b2218a7183c68c42d5b1779813fd8e7775533eb6cb891e"
Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.262494 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vh57r" event={"ID":"0af45fe0-0c3c-4394-82fb-334e1f6e7cb1","Type":"ContainerStarted","Data":"36ce7c10d382b350c9957a9b4b25836f63146f81b92abf70b2d1054582c8f727"}
Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.279052 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-d8c46" podStartSLOduration=6.430868586 podStartE2EDuration="23.27903303s" podCreationTimestamp="2025-12-05 12:09:10 +0000 UTC" firstStartedPulling="2025-12-05 12:09:14.622655552 +0000 UTC m=+1239.115370275" lastFinishedPulling="2025-12-05 12:09:31.470820006 +0000 UTC m=+1255.963534719" observedRunningTime="2025-12-05 12:09:33.273201236 +0000 UTC m=+1257.765915959" watchObservedRunningTime="2025-12-05 12:09:33.27903303 +0000 UTC m=+1257.771747753"
Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.291572 4763 scope.go:117] "RemoveContainer" containerID="eb621e9277be1610b485a0015c8f75be9dd554027d30b32d46abff3571cc4c29"
Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.302641 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-vh57r" podStartSLOduration=3.049516247 podStartE2EDuration="36.302619637s" podCreationTimestamp="2025-12-05 12:08:57 +0000 UTC" firstStartedPulling="2025-12-05 12:08:58.880462635 +0000 UTC m=+1223.373177358" lastFinishedPulling="2025-12-05 12:09:32.133566025 +0000 UTC m=+1256.626280748" observedRunningTime="2025-12-05 12:09:33.293420113 +0000 UTC m=+1257.786134836" watchObservedRunningTime="2025-12-05 12:09:33.302619637 +0000 UTC m=+1257.795334360"
Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.328617 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.339942 4763 scope.go:117] "RemoveContainer" containerID="4ca686e20e5eb1bf421130ea665b4cf9f3e9ae722261772a65c247dd39e26c24"
Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.342275 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.372543 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 05 12:09:33 crc kubenswrapper[4763]: E1205 12:09:33.372908 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d77e8a1-26ef-4525-b427-0a29a9b7a0fc" containerName="init-config-reloader"
Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.372925 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d77e8a1-26ef-4525-b427-0a29a9b7a0fc" containerName="init-config-reloader"
Dec 05 12:09:33 crc kubenswrapper[4763]: E1205 12:09:33.372938 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e051d182-dc55-4454-95aa-558c2e183c88" containerName="mariadb-database-create"
Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.372944 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e051d182-dc55-4454-95aa-558c2e183c88" containerName="mariadb-database-create"
Dec 05 12:09:33 crc kubenswrapper[4763]: E1205 12:09:33.372954 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d77e8a1-26ef-4525-b427-0a29a9b7a0fc" containerName="prometheus"
Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.372961 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d77e8a1-26ef-4525-b427-0a29a9b7a0fc" containerName="prometheus"
Dec 05 12:09:33 crc kubenswrapper[4763]: E1205 12:09:33.372976 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cdb2955-6d03-4169-8765-22f61729881f" containerName="mariadb-account-create-update"
Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.372982 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cdb2955-6d03-4169-8765-22f61729881f" containerName="mariadb-account-create-update"
Dec 05 12:09:33 crc kubenswrapper[4763]: E1205 12:09:33.372992 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d77e8a1-26ef-4525-b427-0a29a9b7a0fc" containerName="config-reloader"
Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.372997 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d77e8a1-26ef-4525-b427-0a29a9b7a0fc" containerName="config-reloader"
Dec 05 12:09:33 crc kubenswrapper[4763]: E1205 12:09:33.373006 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fec4184-203d-48c4-bf8a-39529d6d08ce" containerName="mariadb-account-create-update"
Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.373012 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fec4184-203d-48c4-bf8a-39529d6d08ce" containerName="mariadb-account-create-update"
Dec 05 12:09:33 crc kubenswrapper[4763]: E1205 12:09:33.373023 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74c2f44d-7371-42a1-b73b-2e68ba45adf4" containerName="mariadb-database-create"
Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.373029 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="74c2f44d-7371-42a1-b73b-2e68ba45adf4" containerName="mariadb-database-create"
Dec 05 12:09:33 crc kubenswrapper[4763]: E1205 12:09:33.373035 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a38e41f6-6247-4c91-abba-0bc65d1c2127" containerName="swift-ring-rebalance"
Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.373041 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a38e41f6-6247-4c91-abba-0bc65d1c2127" containerName="swift-ring-rebalance"
Dec 05 12:09:33 crc kubenswrapper[4763]: E1205 12:09:33.388084 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7b68be3-b684-41b3-9cb0-6ae8f6f998f3" containerName="mariadb-database-create"
Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.388120 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7b68be3-b684-41b3-9cb0-6ae8f6f998f3" containerName="mariadb-database-create"
Dec 05 12:09:33 crc kubenswrapper[4763]: E1205 12:09:33.388134 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d77e8a1-26ef-4525-b427-0a29a9b7a0fc" containerName="thanos-sidecar"
Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.388141 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d77e8a1-26ef-4525-b427-0a29a9b7a0fc" containerName="thanos-sidecar"
Dec 05 12:09:33 crc kubenswrapper[4763]: E1205 12:09:33.388153 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d64503e5-6fd1-49fa-b025-7c00f8b245c3" containerName="mariadb-account-create-update"
Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.388161 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d64503e5-6fd1-49fa-b025-7c00f8b245c3" containerName="mariadb-account-create-update"
Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.388462 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a38e41f6-6247-4c91-abba-0bc65d1c2127" containerName="swift-ring-rebalance"
Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.388484 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cdb2955-6d03-4169-8765-22f61729881f" containerName="mariadb-account-create-update"
Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.388494 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7b68be3-b684-41b3-9cb0-6ae8f6f998f3" containerName="mariadb-database-create"
Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.388512 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="74c2f44d-7371-42a1-b73b-2e68ba45adf4" containerName="mariadb-database-create"
Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.388518 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e051d182-dc55-4454-95aa-558c2e183c88" containerName="mariadb-database-create"
Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.388530 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fec4184-203d-48c4-bf8a-39529d6d08ce" containerName="mariadb-account-create-update"
Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.388540 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d77e8a1-26ef-4525-b427-0a29a9b7a0fc" containerName="thanos-sidecar"
Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.388550 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d77e8a1-26ef-4525-b427-0a29a9b7a0fc" containerName="config-reloader"
Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.388557 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d77e8a1-26ef-4525-b427-0a29a9b7a0fc" containerName="prometheus"
Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.388565 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d64503e5-6fd1-49fa-b025-7c00f8b245c3" containerName="mariadb-account-create-update"
Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.390089 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
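The pod_startup_latency_tracker entries above report two durations. From the logged timestamps, podStartE2EDuration is the watch-observed running time minus the pod creation time, and podStartSLOduration appears to be that interval minus the image-pull window. The sketch below recomputes both for glance-db-sync-vh57r; it is an illustration of the arithmetic, not kubelet code.

package main

import (
	"fmt"
	"time"
)

func mustParse(layout, s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	// Layout matching the log's "2025-12-05 12:08:57 +0000 UTC" form; Go
	// parses the fractional seconds even though the layout omits them.
	const layout = "2006-01-02 15:04:05 -0700 MST"
	created := mustParse(layout, "2025-12-05 12:08:57 +0000 UTC")
	firstPull := mustParse(layout, "2025-12-05 12:08:58.880462635 +0000 UTC")
	lastPull := mustParse(layout, "2025-12-05 12:09:32.133566025 +0000 UTC")
	running := mustParse(layout, "2025-12-05 12:09:33.302619637 +0000 UTC")

	e2e := running.Sub(created)
	slo := e2e - lastPull.Sub(firstPull)
	fmt.Println("podStartE2EDuration:", e2e) // 36.302619637s, as logged
	fmt.Println("podStartSLOduration:", slo) // 3.049516247s, as logged
}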
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.398362 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.398806 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.399020 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.399112 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.399863 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.399932 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-htrxx" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.404546 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.404849 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.410352 4763 scope.go:117] "RemoveContainer" containerID="1c3a4d300fe3ae6cea13fc898b997f20c96630fc0223ec660a4bf7ef219e1008" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.452146 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/654b7fdf-0324-40f6-8681-3ba17e042d60-config\") pod \"prometheus-metric-storage-0\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.452185 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/654b7fdf-0324-40f6-8681-3ba17e042d60-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.452207 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/654b7fdf-0324-40f6-8681-3ba17e042d60-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.452257 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/654b7fdf-0324-40f6-8681-3ba17e042d60-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:33 crc 
kubenswrapper[4763]: I1205 12:09:33.452278 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/654b7fdf-0324-40f6-8681-3ba17e042d60-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.452322 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmxpt\" (UniqueName: \"kubernetes.io/projected/654b7fdf-0324-40f6-8681-3ba17e042d60-kube-api-access-zmxpt\") pod \"prometheus-metric-storage-0\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.452341 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/654b7fdf-0324-40f6-8681-3ba17e042d60-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.452370 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/654b7fdf-0324-40f6-8681-3ba17e042d60-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.452394 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/654b7fdf-0324-40f6-8681-3ba17e042d60-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.452410 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/654b7fdf-0324-40f6-8681-3ba17e042d60-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.452449 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ff5f488f-885a-43ee-9a04-43ba44d78789\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ff5f488f-885a-43ee-9a04-43ba44d78789\") pod \"prometheus-metric-storage-0\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.554002 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/654b7fdf-0324-40f6-8681-3ba17e042d60-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.554056 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" 
(UniqueName: \"kubernetes.io/secret/654b7fdf-0324-40f6-8681-3ba17e042d60-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.554127 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ff5f488f-885a-43ee-9a04-43ba44d78789\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ff5f488f-885a-43ee-9a04-43ba44d78789\") pod \"prometheus-metric-storage-0\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.554165 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/654b7fdf-0324-40f6-8681-3ba17e042d60-config\") pod \"prometheus-metric-storage-0\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.554190 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/654b7fdf-0324-40f6-8681-3ba17e042d60-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.554220 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/654b7fdf-0324-40f6-8681-3ba17e042d60-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.554280 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/654b7fdf-0324-40f6-8681-3ba17e042d60-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.554309 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/654b7fdf-0324-40f6-8681-3ba17e042d60-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.554368 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmxpt\" (UniqueName: \"kubernetes.io/projected/654b7fdf-0324-40f6-8681-3ba17e042d60-kube-api-access-zmxpt\") pod \"prometheus-metric-storage-0\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.554396 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/654b7fdf-0324-40f6-8681-3ba17e042d60-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " 
pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.554439 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/654b7fdf-0324-40f6-8681-3ba17e042d60-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.555831 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/654b7fdf-0324-40f6-8681-3ba17e042d60-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.559916 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/654b7fdf-0324-40f6-8681-3ba17e042d60-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.560774 4763 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.560812 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ff5f488f-885a-43ee-9a04-43ba44d78789\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ff5f488f-885a-43ee-9a04-43ba44d78789\") pod \"prometheus-metric-storage-0\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f78e37f2b579c8ad937827f0ee3e8c91bfbfeaf465b3046f4f7d0e3c34229d24/globalmount\"" pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.561449 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/654b7fdf-0324-40f6-8681-3ba17e042d60-config\") pod \"prometheus-metric-storage-0\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.563977 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/654b7fdf-0324-40f6-8681-3ba17e042d60-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.564420 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/654b7fdf-0324-40f6-8681-3ba17e042d60-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.564589 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/654b7fdf-0324-40f6-8681-3ba17e042d60-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.566417 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/654b7fdf-0324-40f6-8681-3ba17e042d60-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.566570 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/654b7fdf-0324-40f6-8681-3ba17e042d60-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.576224 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/654b7fdf-0324-40f6-8681-3ba17e042d60-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.576783 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmxpt\" (UniqueName: \"kubernetes.io/projected/654b7fdf-0324-40f6-8681-3ba17e042d60-kube-api-access-zmxpt\") pod \"prometheus-metric-storage-0\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.647471 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ff5f488f-885a-43ee-9a04-43ba44d78789\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ff5f488f-885a-43ee-9a04-43ba44d78789\") pod \"prometheus-metric-storage-0\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.787788 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 05 12:09:33 crc kubenswrapper[4763]: I1205 12:09:33.797806 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d77e8a1-26ef-4525-b427-0a29a9b7a0fc" path="/var/lib/kubelet/pods/9d77e8a1-26ef-4525-b427-0a29a9b7a0fc/volumes" Dec 05 12:09:34 crc kubenswrapper[4763]: I1205 12:09:34.275756 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 05 12:09:34 crc kubenswrapper[4763]: W1205 12:09:34.281400 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod654b7fdf_0324_40f6_8681_3ba17e042d60.slice/crio-005dd38f7ddeeb35b9da5b02caeccd38aae1a6d352b5db4e57ce832a3343fce8 WatchSource:0}: Error finding container 005dd38f7ddeeb35b9da5b02caeccd38aae1a6d352b5db4e57ce832a3343fce8: Status 404 returned error can't find the container with id 005dd38f7ddeeb35b9da5b02caeccd38aae1a6d352b5db4e57ce832a3343fce8 Dec 05 12:09:35 crc kubenswrapper[4763]: I1205 12:09:35.280804 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"654b7fdf-0324-40f6-8681-3ba17e042d60","Type":"ContainerStarted","Data":"005dd38f7ddeeb35b9da5b02caeccd38aae1a6d352b5db4e57ce832a3343fce8"} Dec 05 12:09:36 crc kubenswrapper[4763]: I1205 12:09:36.290558 4763 generic.go:334] "Generic (PLEG): container finished" podID="7464b1d7-23f8-4450-a41e-1208f89c1fe4" containerID="cdaa9c5da9e74fcc5fd493a87c06dd6f8a834444310999188ed4381c90c0ae4f" exitCode=0 Dec 05 12:09:36 crc kubenswrapper[4763]: I1205 12:09:36.290639 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-d8c46" event={"ID":"7464b1d7-23f8-4450-a41e-1208f89c1fe4","Type":"ContainerDied","Data":"cdaa9c5da9e74fcc5fd493a87c06dd6f8a834444310999188ed4381c90c0ae4f"} Dec 05 12:09:37 crc kubenswrapper[4763]: I1205 12:09:37.302809 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"654b7fdf-0324-40f6-8681-3ba17e042d60","Type":"ContainerStarted","Data":"0b832275b7e0da18c9463bf76cf4ceae0e1a65ceb2d77cae3dc089a42420ee21"} Dec 05 12:09:37 crc kubenswrapper[4763]: I1205 12:09:37.657285 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-d8c46" Dec 05 12:09:37 crc kubenswrapper[4763]: I1205 12:09:37.730164 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jt9fj\" (UniqueName: \"kubernetes.io/projected/7464b1d7-23f8-4450-a41e-1208f89c1fe4-kube-api-access-jt9fj\") pod \"7464b1d7-23f8-4450-a41e-1208f89c1fe4\" (UID: \"7464b1d7-23f8-4450-a41e-1208f89c1fe4\") " Dec 05 12:09:37 crc kubenswrapper[4763]: I1205 12:09:37.730348 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7464b1d7-23f8-4450-a41e-1208f89c1fe4-config-data\") pod \"7464b1d7-23f8-4450-a41e-1208f89c1fe4\" (UID: \"7464b1d7-23f8-4450-a41e-1208f89c1fe4\") " Dec 05 12:09:37 crc kubenswrapper[4763]: I1205 12:09:37.730413 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7464b1d7-23f8-4450-a41e-1208f89c1fe4-combined-ca-bundle\") pod \"7464b1d7-23f8-4450-a41e-1208f89c1fe4\" (UID: \"7464b1d7-23f8-4450-a41e-1208f89c1fe4\") " Dec 05 12:09:37 crc kubenswrapper[4763]: I1205 12:09:37.742094 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7464b1d7-23f8-4450-a41e-1208f89c1fe4-kube-api-access-jt9fj" (OuterVolumeSpecName: "kube-api-access-jt9fj") pod "7464b1d7-23f8-4450-a41e-1208f89c1fe4" (UID: "7464b1d7-23f8-4450-a41e-1208f89c1fe4"). InnerVolumeSpecName "kube-api-access-jt9fj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:09:37 crc kubenswrapper[4763]: I1205 12:09:37.780059 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7464b1d7-23f8-4450-a41e-1208f89c1fe4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7464b1d7-23f8-4450-a41e-1208f89c1fe4" (UID: "7464b1d7-23f8-4450-a41e-1208f89c1fe4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:09:37 crc kubenswrapper[4763]: I1205 12:09:37.788495 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7464b1d7-23f8-4450-a41e-1208f89c1fe4-config-data" (OuterVolumeSpecName: "config-data") pod "7464b1d7-23f8-4450-a41e-1208f89c1fe4" (UID: "7464b1d7-23f8-4450-a41e-1208f89c1fe4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:09:37 crc kubenswrapper[4763]: I1205 12:09:37.832125 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7464b1d7-23f8-4450-a41e-1208f89c1fe4-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:37 crc kubenswrapper[4763]: I1205 12:09:37.832170 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7464b1d7-23f8-4450-a41e-1208f89c1fe4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:37 crc kubenswrapper[4763]: I1205 12:09:37.832187 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jt9fj\" (UniqueName: \"kubernetes.io/projected/7464b1d7-23f8-4450-a41e-1208f89c1fe4-kube-api-access-jt9fj\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.316631 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-d8c46" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.316720 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-d8c46" event={"ID":"7464b1d7-23f8-4450-a41e-1208f89c1fe4","Type":"ContainerDied","Data":"f5e2a39b790d557b3c61a5323578081385fbc53fa737de5f80d7c116ec51a30f"} Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.317893 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5e2a39b790d557b3c61a5323578081385fbc53fa737de5f80d7c116ec51a30f" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.589913 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-nm8gv"] Dec 05 12:09:38 crc kubenswrapper[4763]: E1205 12:09:38.590224 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7464b1d7-23f8-4450-a41e-1208f89c1fe4" containerName="keystone-db-sync" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.590237 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7464b1d7-23f8-4450-a41e-1208f89c1fe4" containerName="keystone-db-sync" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.590418 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="7464b1d7-23f8-4450-a41e-1208f89c1fe4" containerName="keystone-db-sync" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.593736 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9d85d47c-nm8gv" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.627719 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-nm8gv"] Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.665111 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3aa52597-de66-4326-bf14-25fe93642499-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9d85d47c-nm8gv\" (UID: \"3aa52597-de66-4326-bf14-25fe93642499\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nm8gv" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.665173 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3aa52597-de66-4326-bf14-25fe93642499-dns-svc\") pod \"dnsmasq-dns-5c9d85d47c-nm8gv\" (UID: \"3aa52597-de66-4326-bf14-25fe93642499\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nm8gv" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.665221 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdngk\" (UniqueName: \"kubernetes.io/projected/3aa52597-de66-4326-bf14-25fe93642499-kube-api-access-tdngk\") pod \"dnsmasq-dns-5c9d85d47c-nm8gv\" (UID: \"3aa52597-de66-4326-bf14-25fe93642499\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nm8gv" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.665273 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3aa52597-de66-4326-bf14-25fe93642499-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9d85d47c-nm8gv\" (UID: \"3aa52597-de66-4326-bf14-25fe93642499\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nm8gv" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.665389 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3aa52597-de66-4326-bf14-25fe93642499-config\") pod \"dnsmasq-dns-5c9d85d47c-nm8gv\" (UID: \"3aa52597-de66-4326-bf14-25fe93642499\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nm8gv" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.696835 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-4b6qz"] Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.698351 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-4b6qz" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.702003 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.702215 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.706893 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-gw888" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.707106 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.707143 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.720482 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-4b6qz"] Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.769190 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3aa52597-de66-4326-bf14-25fe93642499-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9d85d47c-nm8gv\" (UID: \"3aa52597-de66-4326-bf14-25fe93642499\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nm8gv" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.769235 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3aa52597-de66-4326-bf14-25fe93642499-dns-svc\") pod \"dnsmasq-dns-5c9d85d47c-nm8gv\" (UID: \"3aa52597-de66-4326-bf14-25fe93642499\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nm8gv" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.769282 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdngk\" (UniqueName: \"kubernetes.io/projected/3aa52597-de66-4326-bf14-25fe93642499-kube-api-access-tdngk\") pod \"dnsmasq-dns-5c9d85d47c-nm8gv\" (UID: \"3aa52597-de66-4326-bf14-25fe93642499\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nm8gv" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.769308 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/165a37f9-fedd-4880-b696-819f0af56cda-fernet-keys\") pod \"keystone-bootstrap-4b6qz\" (UID: \"165a37f9-fedd-4880-b696-819f0af56cda\") " pod="openstack/keystone-bootstrap-4b6qz" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.769331 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3aa52597-de66-4326-bf14-25fe93642499-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9d85d47c-nm8gv\" (UID: \"3aa52597-de66-4326-bf14-25fe93642499\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nm8gv" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.769367 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/165a37f9-fedd-4880-b696-819f0af56cda-credential-keys\") pod \"keystone-bootstrap-4b6qz\" (UID: \"165a37f9-fedd-4880-b696-819f0af56cda\") " pod="openstack/keystone-bootstrap-4b6qz" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.769404 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3aa52597-de66-4326-bf14-25fe93642499-config\") pod \"dnsmasq-dns-5c9d85d47c-nm8gv\" (UID: \"3aa52597-de66-4326-bf14-25fe93642499\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nm8gv" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.769431 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/165a37f9-fedd-4880-b696-819f0af56cda-scripts\") pod \"keystone-bootstrap-4b6qz\" (UID: \"165a37f9-fedd-4880-b696-819f0af56cda\") " pod="openstack/keystone-bootstrap-4b6qz" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.769448 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/165a37f9-fedd-4880-b696-819f0af56cda-config-data\") pod \"keystone-bootstrap-4b6qz\" (UID: \"165a37f9-fedd-4880-b696-819f0af56cda\") " pod="openstack/keystone-bootstrap-4b6qz" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.769484 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt4x9\" (UniqueName: \"kubernetes.io/projected/165a37f9-fedd-4880-b696-819f0af56cda-kube-api-access-pt4x9\") pod \"keystone-bootstrap-4b6qz\" (UID: \"165a37f9-fedd-4880-b696-819f0af56cda\") " pod="openstack/keystone-bootstrap-4b6qz" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.769527 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/165a37f9-fedd-4880-b696-819f0af56cda-combined-ca-bundle\") pod \"keystone-bootstrap-4b6qz\" (UID: \"165a37f9-fedd-4880-b696-819f0af56cda\") " pod="openstack/keystone-bootstrap-4b6qz" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.770437 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3aa52597-de66-4326-bf14-25fe93642499-config\") pod \"dnsmasq-dns-5c9d85d47c-nm8gv\" (UID: \"3aa52597-de66-4326-bf14-25fe93642499\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nm8gv" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.770732 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3aa52597-de66-4326-bf14-25fe93642499-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9d85d47c-nm8gv\" (UID: \"3aa52597-de66-4326-bf14-25fe93642499\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nm8gv" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.777972 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3aa52597-de66-4326-bf14-25fe93642499-dns-svc\") pod \"dnsmasq-dns-5c9d85d47c-nm8gv\" (UID: \"3aa52597-de66-4326-bf14-25fe93642499\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nm8gv" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.778108 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/3aa52597-de66-4326-bf14-25fe93642499-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9d85d47c-nm8gv\" (UID: \"3aa52597-de66-4326-bf14-25fe93642499\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nm8gv" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.814417 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdngk\" (UniqueName: \"kubernetes.io/projected/3aa52597-de66-4326-bf14-25fe93642499-kube-api-access-tdngk\") pod \"dnsmasq-dns-5c9d85d47c-nm8gv\" (UID: \"3aa52597-de66-4326-bf14-25fe93642499\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nm8gv" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.869381 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5d6ccfb9d9-bqng6"] Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.870799 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5d6ccfb9d9-bqng6" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.871213 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/165a37f9-fedd-4880-b696-819f0af56cda-fernet-keys\") pod \"keystone-bootstrap-4b6qz\" (UID: \"165a37f9-fedd-4880-b696-819f0af56cda\") " pod="openstack/keystone-bootstrap-4b6qz" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.871307 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/165a37f9-fedd-4880-b696-819f0af56cda-credential-keys\") pod \"keystone-bootstrap-4b6qz\" (UID: \"165a37f9-fedd-4880-b696-819f0af56cda\") " pod="openstack/keystone-bootstrap-4b6qz" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.871364 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/165a37f9-fedd-4880-b696-819f0af56cda-scripts\") pod \"keystone-bootstrap-4b6qz\" (UID: \"165a37f9-fedd-4880-b696-819f0af56cda\") " pod="openstack/keystone-bootstrap-4b6qz" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.871387 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/165a37f9-fedd-4880-b696-819f0af56cda-config-data\") pod \"keystone-bootstrap-4b6qz\" (UID: \"165a37f9-fedd-4880-b696-819f0af56cda\") " pod="openstack/keystone-bootstrap-4b6qz" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.871444 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt4x9\" (UniqueName: \"kubernetes.io/projected/165a37f9-fedd-4880-b696-819f0af56cda-kube-api-access-pt4x9\") pod \"keystone-bootstrap-4b6qz\" (UID: \"165a37f9-fedd-4880-b696-819f0af56cda\") " pod="openstack/keystone-bootstrap-4b6qz" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.871489 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/165a37f9-fedd-4880-b696-819f0af56cda-combined-ca-bundle\") pod \"keystone-bootstrap-4b6qz\" (UID: \"165a37f9-fedd-4880-b696-819f0af56cda\") " pod="openstack/keystone-bootstrap-4b6qz" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.875351 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-nn549" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.878305 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" 
Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.878481 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.882298 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-cprq9"] Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.890322 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.892360 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/165a37f9-fedd-4880-b696-819f0af56cda-scripts\") pod \"keystone-bootstrap-4b6qz\" (UID: \"165a37f9-fedd-4880-b696-819f0af56cda\") " pod="openstack/keystone-bootstrap-4b6qz" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.894258 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/165a37f9-fedd-4880-b696-819f0af56cda-config-data\") pod \"keystone-bootstrap-4b6qz\" (UID: \"165a37f9-fedd-4880-b696-819f0af56cda\") " pod="openstack/keystone-bootstrap-4b6qz" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.896127 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-cprq9" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.897167 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/165a37f9-fedd-4880-b696-819f0af56cda-fernet-keys\") pod \"keystone-bootstrap-4b6qz\" (UID: \"165a37f9-fedd-4880-b696-819f0af56cda\") " pod="openstack/keystone-bootstrap-4b6qz" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.904236 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/165a37f9-fedd-4880-b696-819f0af56cda-credential-keys\") pod \"keystone-bootstrap-4b6qz\" (UID: \"165a37f9-fedd-4880-b696-819f0af56cda\") " pod="openstack/keystone-bootstrap-4b6qz" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.904514 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.904772 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.905817 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/165a37f9-fedd-4880-b696-819f0af56cda-combined-ca-bundle\") pod \"keystone-bootstrap-4b6qz\" (UID: \"165a37f9-fedd-4880-b696-819f0af56cda\") " pod="openstack/keystone-bootstrap-4b6qz" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.908971 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-64ztw" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.911088 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-cprq9"] Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.922307 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9d85d47c-nm8gv" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.955859 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5d6ccfb9d9-bqng6"] Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.962407 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt4x9\" (UniqueName: \"kubernetes.io/projected/165a37f9-fedd-4880-b696-819f0af56cda-kube-api-access-pt4x9\") pod \"keystone-bootstrap-4b6qz\" (UID: \"165a37f9-fedd-4880-b696-819f0af56cda\") " pod="openstack/keystone-bootstrap-4b6qz" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.981994 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a5583b19-1a4c-4b4f-9147-fecc8cd733a1-config-data\") pod \"horizon-5d6ccfb9d9-bqng6\" (UID: \"a5583b19-1a4c-4b4f-9147-fecc8cd733a1\") " pod="openstack/horizon-5d6ccfb9d9-bqng6" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.982038 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10d49525-ec1b-4c52-8221-f3f0bb57e574-config-data\") pod \"cinder-db-sync-cprq9\" (UID: \"10d49525-ec1b-4c52-8221-f3f0bb57e574\") " pod="openstack/cinder-db-sync-cprq9" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.982072 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/10d49525-ec1b-4c52-8221-f3f0bb57e574-etc-machine-id\") pod \"cinder-db-sync-cprq9\" (UID: \"10d49525-ec1b-4c52-8221-f3f0bb57e574\") " pod="openstack/cinder-db-sync-cprq9" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.982102 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10d49525-ec1b-4c52-8221-f3f0bb57e574-combined-ca-bundle\") pod \"cinder-db-sync-cprq9\" (UID: \"10d49525-ec1b-4c52-8221-f3f0bb57e574\") " pod="openstack/cinder-db-sync-cprq9" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.982131 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5583b19-1a4c-4b4f-9147-fecc8cd733a1-logs\") pod \"horizon-5d6ccfb9d9-bqng6\" (UID: \"a5583b19-1a4c-4b4f-9147-fecc8cd733a1\") " pod="openstack/horizon-5d6ccfb9d9-bqng6" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.982153 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlvgz\" (UniqueName: \"kubernetes.io/projected/a5583b19-1a4c-4b4f-9147-fecc8cd733a1-kube-api-access-qlvgz\") pod \"horizon-5d6ccfb9d9-bqng6\" (UID: \"a5583b19-1a4c-4b4f-9147-fecc8cd733a1\") " pod="openstack/horizon-5d6ccfb9d9-bqng6" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.982182 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a5583b19-1a4c-4b4f-9147-fecc8cd733a1-scripts\") pod \"horizon-5d6ccfb9d9-bqng6\" (UID: \"a5583b19-1a4c-4b4f-9147-fecc8cd733a1\") " pod="openstack/horizon-5d6ccfb9d9-bqng6" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.982215 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" 
(UniqueName: \"kubernetes.io/secret/10d49525-ec1b-4c52-8221-f3f0bb57e574-db-sync-config-data\") pod \"cinder-db-sync-cprq9\" (UID: \"10d49525-ec1b-4c52-8221-f3f0bb57e574\") " pod="openstack/cinder-db-sync-cprq9" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.982243 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-492vf\" (UniqueName: \"kubernetes.io/projected/10d49525-ec1b-4c52-8221-f3f0bb57e574-kube-api-access-492vf\") pod \"cinder-db-sync-cprq9\" (UID: \"10d49525-ec1b-4c52-8221-f3f0bb57e574\") " pod="openstack/cinder-db-sync-cprq9" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.982292 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10d49525-ec1b-4c52-8221-f3f0bb57e574-scripts\") pod \"cinder-db-sync-cprq9\" (UID: \"10d49525-ec1b-4c52-8221-f3f0bb57e574\") " pod="openstack/cinder-db-sync-cprq9" Dec 05 12:09:38 crc kubenswrapper[4763]: I1205 12:09:38.982322 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a5583b19-1a4c-4b4f-9147-fecc8cd733a1-horizon-secret-key\") pod \"horizon-5d6ccfb9d9-bqng6\" (UID: \"a5583b19-1a4c-4b4f-9147-fecc8cd733a1\") " pod="openstack/horizon-5d6ccfb9d9-bqng6" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.033212 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.035569 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.036313 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-4b6qz" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.047205 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.047493 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.073149 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.084174 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8j52\" (UniqueName: \"kubernetes.io/projected/bb8f26ac-3463-4d42-936d-420cfdbd81eb-kube-api-access-p8j52\") pod \"ceilometer-0\" (UID: \"bb8f26ac-3463-4d42-936d-420cfdbd81eb\") " pod="openstack/ceilometer-0" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.084219 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a5583b19-1a4c-4b4f-9147-fecc8cd733a1-config-data\") pod \"horizon-5d6ccfb9d9-bqng6\" (UID: \"a5583b19-1a4c-4b4f-9147-fecc8cd733a1\") " pod="openstack/horizon-5d6ccfb9d9-bqng6" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.084237 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb8f26ac-3463-4d42-936d-420cfdbd81eb-run-httpd\") pod \"ceilometer-0\" (UID: \"bb8f26ac-3463-4d42-936d-420cfdbd81eb\") " pod="openstack/ceilometer-0" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.084257 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10d49525-ec1b-4c52-8221-f3f0bb57e574-config-data\") pod \"cinder-db-sync-cprq9\" (UID: \"10d49525-ec1b-4c52-8221-f3f0bb57e574\") " pod="openstack/cinder-db-sync-cprq9" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.084279 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/10d49525-ec1b-4c52-8221-f3f0bb57e574-etc-machine-id\") pod \"cinder-db-sync-cprq9\" (UID: \"10d49525-ec1b-4c52-8221-f3f0bb57e574\") " pod="openstack/cinder-db-sync-cprq9" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.084305 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10d49525-ec1b-4c52-8221-f3f0bb57e574-combined-ca-bundle\") pod \"cinder-db-sync-cprq9\" (UID: \"10d49525-ec1b-4c52-8221-f3f0bb57e574\") " pod="openstack/cinder-db-sync-cprq9" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.084334 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5583b19-1a4c-4b4f-9147-fecc8cd733a1-logs\") pod \"horizon-5d6ccfb9d9-bqng6\" (UID: \"a5583b19-1a4c-4b4f-9147-fecc8cd733a1\") " pod="openstack/horizon-5d6ccfb9d9-bqng6" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.084354 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlvgz\" (UniqueName: \"kubernetes.io/projected/a5583b19-1a4c-4b4f-9147-fecc8cd733a1-kube-api-access-qlvgz\") pod \"horizon-5d6ccfb9d9-bqng6\" (UID: \"a5583b19-1a4c-4b4f-9147-fecc8cd733a1\") " pod="openstack/horizon-5d6ccfb9d9-bqng6" Dec 05 
12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.084396 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb8f26ac-3463-4d42-936d-420cfdbd81eb-log-httpd\") pod \"ceilometer-0\" (UID: \"bb8f26ac-3463-4d42-936d-420cfdbd81eb\") " pod="openstack/ceilometer-0" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.084412 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb8f26ac-3463-4d42-936d-420cfdbd81eb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bb8f26ac-3463-4d42-936d-420cfdbd81eb\") " pod="openstack/ceilometer-0" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.084432 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a5583b19-1a4c-4b4f-9147-fecc8cd733a1-scripts\") pod \"horizon-5d6ccfb9d9-bqng6\" (UID: \"a5583b19-1a4c-4b4f-9147-fecc8cd733a1\") " pod="openstack/horizon-5d6ccfb9d9-bqng6" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.084465 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/10d49525-ec1b-4c52-8221-f3f0bb57e574-db-sync-config-data\") pod \"cinder-db-sync-cprq9\" (UID: \"10d49525-ec1b-4c52-8221-f3f0bb57e574\") " pod="openstack/cinder-db-sync-cprq9" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.084485 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-492vf\" (UniqueName: \"kubernetes.io/projected/10d49525-ec1b-4c52-8221-f3f0bb57e574-kube-api-access-492vf\") pod \"cinder-db-sync-cprq9\" (UID: \"10d49525-ec1b-4c52-8221-f3f0bb57e574\") " pod="openstack/cinder-db-sync-cprq9" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.084512 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb8f26ac-3463-4d42-936d-420cfdbd81eb-scripts\") pod \"ceilometer-0\" (UID: \"bb8f26ac-3463-4d42-936d-420cfdbd81eb\") " pod="openstack/ceilometer-0" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.084535 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb8f26ac-3463-4d42-936d-420cfdbd81eb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bb8f26ac-3463-4d42-936d-420cfdbd81eb\") " pod="openstack/ceilometer-0" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.084554 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb8f26ac-3463-4d42-936d-420cfdbd81eb-config-data\") pod \"ceilometer-0\" (UID: \"bb8f26ac-3463-4d42-936d-420cfdbd81eb\") " pod="openstack/ceilometer-0" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.084572 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10d49525-ec1b-4c52-8221-f3f0bb57e574-scripts\") pod \"cinder-db-sync-cprq9\" (UID: \"10d49525-ec1b-4c52-8221-f3f0bb57e574\") " pod="openstack/cinder-db-sync-cprq9" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.084591 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/a5583b19-1a4c-4b4f-9147-fecc8cd733a1-horizon-secret-key\") pod \"horizon-5d6ccfb9d9-bqng6\" (UID: \"a5583b19-1a4c-4b4f-9147-fecc8cd733a1\") " pod="openstack/horizon-5d6ccfb9d9-bqng6" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.098378 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a5583b19-1a4c-4b4f-9147-fecc8cd733a1-scripts\") pod \"horizon-5d6ccfb9d9-bqng6\" (UID: \"a5583b19-1a4c-4b4f-9147-fecc8cd733a1\") " pod="openstack/horizon-5d6ccfb9d9-bqng6" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.098677 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a5583b19-1a4c-4b4f-9147-fecc8cd733a1-config-data\") pod \"horizon-5d6ccfb9d9-bqng6\" (UID: \"a5583b19-1a4c-4b4f-9147-fecc8cd733a1\") " pod="openstack/horizon-5d6ccfb9d9-bqng6" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.099472 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5583b19-1a4c-4b4f-9147-fecc8cd733a1-logs\") pod \"horizon-5d6ccfb9d9-bqng6\" (UID: \"a5583b19-1a4c-4b4f-9147-fecc8cd733a1\") " pod="openstack/horizon-5d6ccfb9d9-bqng6" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.099505 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a5583b19-1a4c-4b4f-9147-fecc8cd733a1-horizon-secret-key\") pod \"horizon-5d6ccfb9d9-bqng6\" (UID: \"a5583b19-1a4c-4b4f-9147-fecc8cd733a1\") " pod="openstack/horizon-5d6ccfb9d9-bqng6" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.099523 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/10d49525-ec1b-4c52-8221-f3f0bb57e574-etc-machine-id\") pod \"cinder-db-sync-cprq9\" (UID: \"10d49525-ec1b-4c52-8221-f3f0bb57e574\") " pod="openstack/cinder-db-sync-cprq9" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.120518 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10d49525-ec1b-4c52-8221-f3f0bb57e574-scripts\") pod \"cinder-db-sync-cprq9\" (UID: \"10d49525-ec1b-4c52-8221-f3f0bb57e574\") " pod="openstack/cinder-db-sync-cprq9" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.136706 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10d49525-ec1b-4c52-8221-f3f0bb57e574-combined-ca-bundle\") pod \"cinder-db-sync-cprq9\" (UID: \"10d49525-ec1b-4c52-8221-f3f0bb57e574\") " pod="openstack/cinder-db-sync-cprq9" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.142753 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-492vf\" (UniqueName: \"kubernetes.io/projected/10d49525-ec1b-4c52-8221-f3f0bb57e574-kube-api-access-492vf\") pod \"cinder-db-sync-cprq9\" (UID: \"10d49525-ec1b-4c52-8221-f3f0bb57e574\") " pod="openstack/cinder-db-sync-cprq9" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.144637 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/10d49525-ec1b-4c52-8221-f3f0bb57e574-db-sync-config-data\") pod \"cinder-db-sync-cprq9\" (UID: \"10d49525-ec1b-4c52-8221-f3f0bb57e574\") " pod="openstack/cinder-db-sync-cprq9" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 
12:09:39.152103 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlvgz\" (UniqueName: \"kubernetes.io/projected/a5583b19-1a4c-4b4f-9147-fecc8cd733a1-kube-api-access-qlvgz\") pod \"horizon-5d6ccfb9d9-bqng6\" (UID: \"a5583b19-1a4c-4b4f-9147-fecc8cd733a1\") " pod="openstack/horizon-5d6ccfb9d9-bqng6" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.178823 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10d49525-ec1b-4c52-8221-f3f0bb57e574-config-data\") pod \"cinder-db-sync-cprq9\" (UID: \"10d49525-ec1b-4c52-8221-f3f0bb57e574\") " pod="openstack/cinder-db-sync-cprq9" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.182979 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-wgf85"] Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.184940 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wgf85" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.186175 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8j52\" (UniqueName: \"kubernetes.io/projected/bb8f26ac-3463-4d42-936d-420cfdbd81eb-kube-api-access-p8j52\") pod \"ceilometer-0\" (UID: \"bb8f26ac-3463-4d42-936d-420cfdbd81eb\") " pod="openstack/ceilometer-0" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.186297 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb8f26ac-3463-4d42-936d-420cfdbd81eb-run-httpd\") pod \"ceilometer-0\" (UID: \"bb8f26ac-3463-4d42-936d-420cfdbd81eb\") " pod="openstack/ceilometer-0" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.186432 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb8f26ac-3463-4d42-936d-420cfdbd81eb-log-httpd\") pod \"ceilometer-0\" (UID: \"bb8f26ac-3463-4d42-936d-420cfdbd81eb\") " pod="openstack/ceilometer-0" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.186460 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb8f26ac-3463-4d42-936d-420cfdbd81eb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bb8f26ac-3463-4d42-936d-420cfdbd81eb\") " pod="openstack/ceilometer-0" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.186581 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb8f26ac-3463-4d42-936d-420cfdbd81eb-scripts\") pod \"ceilometer-0\" (UID: \"bb8f26ac-3463-4d42-936d-420cfdbd81eb\") " pod="openstack/ceilometer-0" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.186622 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb8f26ac-3463-4d42-936d-420cfdbd81eb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bb8f26ac-3463-4d42-936d-420cfdbd81eb\") " pod="openstack/ceilometer-0" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.188711 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.193209 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bb8f26ac-3463-4d42-936d-420cfdbd81eb-config-data\") pod \"ceilometer-0\" (UID: \"bb8f26ac-3463-4d42-936d-420cfdbd81eb\") " pod="openstack/ceilometer-0" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.188976 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb8f26ac-3463-4d42-936d-420cfdbd81eb-run-httpd\") pod \"ceilometer-0\" (UID: \"bb8f26ac-3463-4d42-936d-420cfdbd81eb\") " pod="openstack/ceilometer-0" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.191722 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-zrxzg" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.199778 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb8f26ac-3463-4d42-936d-420cfdbd81eb-log-httpd\") pod \"ceilometer-0\" (UID: \"bb8f26ac-3463-4d42-936d-420cfdbd81eb\") " pod="openstack/ceilometer-0" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.219031 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-646fbb8979-kfn2p"] Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.221928 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-646fbb8979-kfn2p" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.228603 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb8f26ac-3463-4d42-936d-420cfdbd81eb-scripts\") pod \"ceilometer-0\" (UID: \"bb8f26ac-3463-4d42-936d-420cfdbd81eb\") " pod="openstack/ceilometer-0" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.228904 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb8f26ac-3463-4d42-936d-420cfdbd81eb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bb8f26ac-3463-4d42-936d-420cfdbd81eb\") " pod="openstack/ceilometer-0" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.229310 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8j52\" (UniqueName: \"kubernetes.io/projected/bb8f26ac-3463-4d42-936d-420cfdbd81eb-kube-api-access-p8j52\") pod \"ceilometer-0\" (UID: \"bb8f26ac-3463-4d42-936d-420cfdbd81eb\") " pod="openstack/ceilometer-0" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.230821 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb8f26ac-3463-4d42-936d-420cfdbd81eb-config-data\") pod \"ceilometer-0\" (UID: \"bb8f26ac-3463-4d42-936d-420cfdbd81eb\") " pod="openstack/ceilometer-0" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.236574 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb8f26ac-3463-4d42-936d-420cfdbd81eb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bb8f26ac-3463-4d42-936d-420cfdbd81eb\") " pod="openstack/ceilometer-0" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.248362 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-wgf85"] Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.273389 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-nm8gv"] Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.347815 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f3d2c51-7840-4854-af9e-e0da6c484074-combined-ca-bundle\") pod \"barbican-db-sync-wgf85\" (UID: \"1f3d2c51-7840-4854-af9e-e0da6c484074\") " pod="openstack/barbican-db-sync-wgf85" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.348018 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgsw7\" (UniqueName: \"kubernetes.io/projected/1f3d2c51-7840-4854-af9e-e0da6c484074-kube-api-access-zgsw7\") pod \"barbican-db-sync-wgf85\" (UID: \"1f3d2c51-7840-4854-af9e-e0da6c484074\") " pod="openstack/barbican-db-sync-wgf85" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.348051 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/52fb7404-f381-4887-aad1-d761e1997b6a-horizon-secret-key\") pod \"horizon-646fbb8979-kfn2p\" (UID: \"52fb7404-f381-4887-aad1-d761e1997b6a\") " pod="openstack/horizon-646fbb8979-kfn2p" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.348161 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrqwx\" (UniqueName: \"kubernetes.io/projected/52fb7404-f381-4887-aad1-d761e1997b6a-kube-api-access-qrqwx\") pod \"horizon-646fbb8979-kfn2p\" (UID: \"52fb7404-f381-4887-aad1-d761e1997b6a\") " pod="openstack/horizon-646fbb8979-kfn2p" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.349232 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52fb7404-f381-4887-aad1-d761e1997b6a-scripts\") pod \"horizon-646fbb8979-kfn2p\" (UID: \"52fb7404-f381-4887-aad1-d761e1997b6a\") " pod="openstack/horizon-646fbb8979-kfn2p" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.349442 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1f3d2c51-7840-4854-af9e-e0da6c484074-db-sync-config-data\") pod \"barbican-db-sync-wgf85\" (UID: \"1f3d2c51-7840-4854-af9e-e0da6c484074\") " pod="openstack/barbican-db-sync-wgf85" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.349540 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52fb7404-f381-4887-aad1-d761e1997b6a-logs\") pod \"horizon-646fbb8979-kfn2p\" (UID: \"52fb7404-f381-4887-aad1-d761e1997b6a\") " pod="openstack/horizon-646fbb8979-kfn2p" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.349578 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/52fb7404-f381-4887-aad1-d761e1997b6a-config-data\") pod \"horizon-646fbb8979-kfn2p\" (UID: \"52fb7404-f381-4887-aad1-d761e1997b6a\") " pod="openstack/horizon-646fbb8979-kfn2p" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.389541 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5d6ccfb9d9-bqng6" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.429908 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-nvqn2"] Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.457463 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-nvqn2" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.458661 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-cprq9" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.463386 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.463694 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-55jfq" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.463845 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.465212 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2bf0108-5266-4f52-8803-39a842ddc777-logs\") pod \"placement-db-sync-nvqn2\" (UID: \"b2bf0108-5266-4f52-8803-39a842ddc777\") " pod="openstack/placement-db-sync-nvqn2" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.465247 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2bf0108-5266-4f52-8803-39a842ddc777-combined-ca-bundle\") pod \"placement-db-sync-nvqn2\" (UID: \"b2bf0108-5266-4f52-8803-39a842ddc777\") " pod="openstack/placement-db-sync-nvqn2" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.465279 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f3d2c51-7840-4854-af9e-e0da6c484074-combined-ca-bundle\") pod \"barbican-db-sync-wgf85\" (UID: \"1f3d2c51-7840-4854-af9e-e0da6c484074\") " pod="openstack/barbican-db-sync-wgf85" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.465327 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgsw7\" (UniqueName: \"kubernetes.io/projected/1f3d2c51-7840-4854-af9e-e0da6c484074-kube-api-access-zgsw7\") pod \"barbican-db-sync-wgf85\" (UID: \"1f3d2c51-7840-4854-af9e-e0da6c484074\") " pod="openstack/barbican-db-sync-wgf85" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.465346 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/52fb7404-f381-4887-aad1-d761e1997b6a-horizon-secret-key\") pod \"horizon-646fbb8979-kfn2p\" (UID: \"52fb7404-f381-4887-aad1-d761e1997b6a\") " pod="openstack/horizon-646fbb8979-kfn2p" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.465389 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2bf0108-5266-4f52-8803-39a842ddc777-scripts\") pod \"placement-db-sync-nvqn2\" (UID: \"b2bf0108-5266-4f52-8803-39a842ddc777\") " pod="openstack/placement-db-sync-nvqn2" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.465407 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrqwx\" (UniqueName: \"kubernetes.io/projected/52fb7404-f381-4887-aad1-d761e1997b6a-kube-api-access-qrqwx\") pod \"horizon-646fbb8979-kfn2p\" (UID: \"52fb7404-f381-4887-aad1-d761e1997b6a\") " pod="openstack/horizon-646fbb8979-kfn2p" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.465431 4763 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52fb7404-f381-4887-aad1-d761e1997b6a-scripts\") pod \"horizon-646fbb8979-kfn2p\" (UID: \"52fb7404-f381-4887-aad1-d761e1997b6a\") " pod="openstack/horizon-646fbb8979-kfn2p" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.465475 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2bf0108-5266-4f52-8803-39a842ddc777-config-data\") pod \"placement-db-sync-nvqn2\" (UID: \"b2bf0108-5266-4f52-8803-39a842ddc777\") " pod="openstack/placement-db-sync-nvqn2" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.465533 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1f3d2c51-7840-4854-af9e-e0da6c484074-db-sync-config-data\") pod \"barbican-db-sync-wgf85\" (UID: \"1f3d2c51-7840-4854-af9e-e0da6c484074\") " pod="openstack/barbican-db-sync-wgf85" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.465579 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52fb7404-f381-4887-aad1-d761e1997b6a-logs\") pod \"horizon-646fbb8979-kfn2p\" (UID: \"52fb7404-f381-4887-aad1-d761e1997b6a\") " pod="openstack/horizon-646fbb8979-kfn2p" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.465600 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/52fb7404-f381-4887-aad1-d761e1997b6a-config-data\") pod \"horizon-646fbb8979-kfn2p\" (UID: \"52fb7404-f381-4887-aad1-d761e1997b6a\") " pod="openstack/horizon-646fbb8979-kfn2p" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.465629 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7t8z\" (UniqueName: \"kubernetes.io/projected/b2bf0108-5266-4f52-8803-39a842ddc777-kube-api-access-v7t8z\") pod \"placement-db-sync-nvqn2\" (UID: \"b2bf0108-5266-4f52-8803-39a842ddc777\") " pod="openstack/placement-db-sync-nvqn2" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.467883 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52fb7404-f381-4887-aad1-d761e1997b6a-scripts\") pod \"horizon-646fbb8979-kfn2p\" (UID: \"52fb7404-f381-4887-aad1-d761e1997b6a\") " pod="openstack/horizon-646fbb8979-kfn2p" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.472254 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.474588 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1f3d2c51-7840-4854-af9e-e0da6c484074-db-sync-config-data\") pod \"barbican-db-sync-wgf85\" (UID: \"1f3d2c51-7840-4854-af9e-e0da6c484074\") " pod="openstack/barbican-db-sync-wgf85" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.480918 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-nvqn2"] Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.485183 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52fb7404-f381-4887-aad1-d761e1997b6a-logs\") pod \"horizon-646fbb8979-kfn2p\" (UID: \"52fb7404-f381-4887-aad1-d761e1997b6a\") " pod="openstack/horizon-646fbb8979-kfn2p" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.486492 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/52fb7404-f381-4887-aad1-d761e1997b6a-config-data\") pod \"horizon-646fbb8979-kfn2p\" (UID: \"52fb7404-f381-4887-aad1-d761e1997b6a\") " pod="openstack/horizon-646fbb8979-kfn2p" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.486959 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/52fb7404-f381-4887-aad1-d761e1997b6a-horizon-secret-key\") pod \"horizon-646fbb8979-kfn2p\" (UID: \"52fb7404-f381-4887-aad1-d761e1997b6a\") " pod="openstack/horizon-646fbb8979-kfn2p" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.490819 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f3d2c51-7840-4854-af9e-e0da6c484074-combined-ca-bundle\") pod \"barbican-db-sync-wgf85\" (UID: \"1f3d2c51-7840-4854-af9e-e0da6c484074\") " pod="openstack/barbican-db-sync-wgf85" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.495724 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6ffb94d8ff-ls64w"] Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.498097 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ffb94d8ff-ls64w" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.504036 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-646fbb8979-kfn2p"] Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.518751 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6ffb94d8ff-ls64w"] Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.531992 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-x6zwz"] Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.532330 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgsw7\" (UniqueName: \"kubernetes.io/projected/1f3d2c51-7840-4854-af9e-e0da6c484074-kube-api-access-zgsw7\") pod \"barbican-db-sync-wgf85\" (UID: \"1f3d2c51-7840-4854-af9e-e0da6c484074\") " pod="openstack/barbican-db-sync-wgf85" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.535588 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-x6zwz" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.537471 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrqwx\" (UniqueName: \"kubernetes.io/projected/52fb7404-f381-4887-aad1-d761e1997b6a-kube-api-access-qrqwx\") pod \"horizon-646fbb8979-kfn2p\" (UID: \"52fb7404-f381-4887-aad1-d761e1997b6a\") " pod="openstack/horizon-646fbb8979-kfn2p" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.544273 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-x6zwz"] Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.545541 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.545780 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-drpkw" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.546793 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.563864 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-646fbb8979-kfn2p" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.574303 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wgf85" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.577100 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7t8z\" (UniqueName: \"kubernetes.io/projected/b2bf0108-5266-4f52-8803-39a842ddc777-kube-api-access-v7t8z\") pod \"placement-db-sync-nvqn2\" (UID: \"b2bf0108-5266-4f52-8803-39a842ddc777\") " pod="openstack/placement-db-sync-nvqn2" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.577228 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2bf0108-5266-4f52-8803-39a842ddc777-logs\") pod \"placement-db-sync-nvqn2\" (UID: \"b2bf0108-5266-4f52-8803-39a842ddc777\") " pod="openstack/placement-db-sync-nvqn2" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.577258 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2bf0108-5266-4f52-8803-39a842ddc777-combined-ca-bundle\") pod \"placement-db-sync-nvqn2\" (UID: \"b2bf0108-5266-4f52-8803-39a842ddc777\") " pod="openstack/placement-db-sync-nvqn2" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.577340 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2bf0108-5266-4f52-8803-39a842ddc777-scripts\") pod \"placement-db-sync-nvqn2\" (UID: \"b2bf0108-5266-4f52-8803-39a842ddc777\") " pod="openstack/placement-db-sync-nvqn2" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.577391 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2bf0108-5266-4f52-8803-39a842ddc777-config-data\") pod \"placement-db-sync-nvqn2\" (UID: \"b2bf0108-5266-4f52-8803-39a842ddc777\") " pod="openstack/placement-db-sync-nvqn2" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.580725 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b2bf0108-5266-4f52-8803-39a842ddc777-logs\") pod \"placement-db-sync-nvqn2\" (UID: \"b2bf0108-5266-4f52-8803-39a842ddc777\") " pod="openstack/placement-db-sync-nvqn2" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.596384 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2bf0108-5266-4f52-8803-39a842ddc777-config-data\") pod \"placement-db-sync-nvqn2\" (UID: \"b2bf0108-5266-4f52-8803-39a842ddc777\") " pod="openstack/placement-db-sync-nvqn2" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.599410 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2bf0108-5266-4f52-8803-39a842ddc777-combined-ca-bundle\") pod \"placement-db-sync-nvqn2\" (UID: \"b2bf0108-5266-4f52-8803-39a842ddc777\") " pod="openstack/placement-db-sync-nvqn2" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.619048 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2bf0108-5266-4f52-8803-39a842ddc777-scripts\") pod \"placement-db-sync-nvqn2\" (UID: \"b2bf0108-5266-4f52-8803-39a842ddc777\") " pod="openstack/placement-db-sync-nvqn2" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.624510 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7t8z\" (UniqueName: \"kubernetes.io/projected/b2bf0108-5266-4f52-8803-39a842ddc777-kube-api-access-v7t8z\") pod \"placement-db-sync-nvqn2\" (UID: \"b2bf0108-5266-4f52-8803-39a842ddc777\") " pod="openstack/placement-db-sync-nvqn2" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.678869 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d2c0217-45ff-4c92-af09-ece49c97a9d4-config\") pod \"dnsmasq-dns-6ffb94d8ff-ls64w\" (UID: \"7d2c0217-45ff-4c92-af09-ece49c97a9d4\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-ls64w" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.679150 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d2c0217-45ff-4c92-af09-ece49c97a9d4-dns-svc\") pod \"dnsmasq-dns-6ffb94d8ff-ls64w\" (UID: \"7d2c0217-45ff-4c92-af09-ece49c97a9d4\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-ls64w" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.679242 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86jq4\" (UniqueName: \"kubernetes.io/projected/274fe292-e3f0-432c-9947-3bca5514f6d9-kube-api-access-86jq4\") pod \"neutron-db-sync-x6zwz\" (UID: \"274fe292-e3f0-432c-9947-3bca5514f6d9\") " pod="openstack/neutron-db-sync-x6zwz" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.679339 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d2c0217-45ff-4c92-af09-ece49c97a9d4-ovsdbserver-nb\") pod \"dnsmasq-dns-6ffb94d8ff-ls64w\" (UID: \"7d2c0217-45ff-4c92-af09-ece49c97a9d4\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-ls64w" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.679419 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d2c0217-45ff-4c92-af09-ece49c97a9d4-ovsdbserver-sb\") pod 
\"dnsmasq-dns-6ffb94d8ff-ls64w\" (UID: \"7d2c0217-45ff-4c92-af09-ece49c97a9d4\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-ls64w" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.679521 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/274fe292-e3f0-432c-9947-3bca5514f6d9-config\") pod \"neutron-db-sync-x6zwz\" (UID: \"274fe292-e3f0-432c-9947-3bca5514f6d9\") " pod="openstack/neutron-db-sync-x6zwz" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.679613 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/274fe292-e3f0-432c-9947-3bca5514f6d9-combined-ca-bundle\") pod \"neutron-db-sync-x6zwz\" (UID: \"274fe292-e3f0-432c-9947-3bca5514f6d9\") " pod="openstack/neutron-db-sync-x6zwz" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.679704 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xr6b\" (UniqueName: \"kubernetes.io/projected/7d2c0217-45ff-4c92-af09-ece49c97a9d4-kube-api-access-4xr6b\") pod \"dnsmasq-dns-6ffb94d8ff-ls64w\" (UID: \"7d2c0217-45ff-4c92-af09-ece49c97a9d4\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-ls64w" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.785239 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d2c0217-45ff-4c92-af09-ece49c97a9d4-config\") pod \"dnsmasq-dns-6ffb94d8ff-ls64w\" (UID: \"7d2c0217-45ff-4c92-af09-ece49c97a9d4\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-ls64w" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.785323 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d2c0217-45ff-4c92-af09-ece49c97a9d4-dns-svc\") pod \"dnsmasq-dns-6ffb94d8ff-ls64w\" (UID: \"7d2c0217-45ff-4c92-af09-ece49c97a9d4\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-ls64w" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.785345 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86jq4\" (UniqueName: \"kubernetes.io/projected/274fe292-e3f0-432c-9947-3bca5514f6d9-kube-api-access-86jq4\") pod \"neutron-db-sync-x6zwz\" (UID: \"274fe292-e3f0-432c-9947-3bca5514f6d9\") " pod="openstack/neutron-db-sync-x6zwz" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.785376 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d2c0217-45ff-4c92-af09-ece49c97a9d4-ovsdbserver-nb\") pod \"dnsmasq-dns-6ffb94d8ff-ls64w\" (UID: \"7d2c0217-45ff-4c92-af09-ece49c97a9d4\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-ls64w" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.785402 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d2c0217-45ff-4c92-af09-ece49c97a9d4-ovsdbserver-sb\") pod \"dnsmasq-dns-6ffb94d8ff-ls64w\" (UID: \"7d2c0217-45ff-4c92-af09-ece49c97a9d4\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-ls64w" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.785442 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/274fe292-e3f0-432c-9947-3bca5514f6d9-config\") pod \"neutron-db-sync-x6zwz\" (UID: \"274fe292-e3f0-432c-9947-3bca5514f6d9\") 
" pod="openstack/neutron-db-sync-x6zwz" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.785462 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/274fe292-e3f0-432c-9947-3bca5514f6d9-combined-ca-bundle\") pod \"neutron-db-sync-x6zwz\" (UID: \"274fe292-e3f0-432c-9947-3bca5514f6d9\") " pod="openstack/neutron-db-sync-x6zwz" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.785479 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xr6b\" (UniqueName: \"kubernetes.io/projected/7d2c0217-45ff-4c92-af09-ece49c97a9d4-kube-api-access-4xr6b\") pod \"dnsmasq-dns-6ffb94d8ff-ls64w\" (UID: \"7d2c0217-45ff-4c92-af09-ece49c97a9d4\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-ls64w" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.785915 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-nm8gv"] Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.786738 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d2c0217-45ff-4c92-af09-ece49c97a9d4-ovsdbserver-nb\") pod \"dnsmasq-dns-6ffb94d8ff-ls64w\" (UID: \"7d2c0217-45ff-4c92-af09-ece49c97a9d4\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-ls64w" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.787419 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d2c0217-45ff-4c92-af09-ece49c97a9d4-ovsdbserver-sb\") pod \"dnsmasq-dns-6ffb94d8ff-ls64w\" (UID: \"7d2c0217-45ff-4c92-af09-ece49c97a9d4\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-ls64w" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.790552 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d2c0217-45ff-4c92-af09-ece49c97a9d4-config\") pod \"dnsmasq-dns-6ffb94d8ff-ls64w\" (UID: \"7d2c0217-45ff-4c92-af09-ece49c97a9d4\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-ls64w" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.791112 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d2c0217-45ff-4c92-af09-ece49c97a9d4-dns-svc\") pod \"dnsmasq-dns-6ffb94d8ff-ls64w\" (UID: \"7d2c0217-45ff-4c92-af09-ece49c97a9d4\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-ls64w" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.801210 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-nvqn2" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.814629 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/274fe292-e3f0-432c-9947-3bca5514f6d9-combined-ca-bundle\") pod \"neutron-db-sync-x6zwz\" (UID: \"274fe292-e3f0-432c-9947-3bca5514f6d9\") " pod="openstack/neutron-db-sync-x6zwz" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.816522 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/274fe292-e3f0-432c-9947-3bca5514f6d9-config\") pod \"neutron-db-sync-x6zwz\" (UID: \"274fe292-e3f0-432c-9947-3bca5514f6d9\") " pod="openstack/neutron-db-sync-x6zwz" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.832436 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86jq4\" (UniqueName: \"kubernetes.io/projected/274fe292-e3f0-432c-9947-3bca5514f6d9-kube-api-access-86jq4\") pod \"neutron-db-sync-x6zwz\" (UID: \"274fe292-e3f0-432c-9947-3bca5514f6d9\") " pod="openstack/neutron-db-sync-x6zwz" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.832732 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xr6b\" (UniqueName: \"kubernetes.io/projected/7d2c0217-45ff-4c92-af09-ece49c97a9d4-kube-api-access-4xr6b\") pod \"dnsmasq-dns-6ffb94d8ff-ls64w\" (UID: \"7d2c0217-45ff-4c92-af09-ece49c97a9d4\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-ls64w" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.853833 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ffb94d8ff-ls64w" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.881523 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-x6zwz" Dec 05 12:09:39 crc kubenswrapper[4763]: I1205 12:09:39.956912 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-4b6qz"] Dec 05 12:09:40 crc kubenswrapper[4763]: W1205 12:09:40.067916 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod165a37f9_fedd_4880_b696_819f0af56cda.slice/crio-1a09dec17d4e46bb2fb8d566192d386f07a460895f8ecac7ee8680d1d73f324d WatchSource:0}: Error finding container 1a09dec17d4e46bb2fb8d566192d386f07a460895f8ecac7ee8680d1d73f324d: Status 404 returned error can't find the container with id 1a09dec17d4e46bb2fb8d566192d386f07a460895f8ecac7ee8680d1d73f324d Dec 05 12:09:40 crc kubenswrapper[4763]: I1205 12:09:40.207946 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1851124e-2722-4628-8e5b-63edb828d64a-etc-swift\") pod \"swift-storage-0\" (UID: \"1851124e-2722-4628-8e5b-63edb828d64a\") " pod="openstack/swift-storage-0" Dec 05 12:09:40 crc kubenswrapper[4763]: I1205 12:09:40.234081 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-cprq9"] Dec 05 12:09:40 crc kubenswrapper[4763]: I1205 12:09:40.252045 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1851124e-2722-4628-8e5b-63edb828d64a-etc-swift\") pod \"swift-storage-0\" (UID: \"1851124e-2722-4628-8e5b-63edb828d64a\") " pod="openstack/swift-storage-0" Dec 05 12:09:40 crc kubenswrapper[4763]: I1205 12:09:40.420152 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 05 12:09:40 crc kubenswrapper[4763]: I1205 12:09:40.504990 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-cprq9" event={"ID":"10d49525-ec1b-4c52-8221-f3f0bb57e574","Type":"ContainerStarted","Data":"ace80f83bd0bec90e206ccf07a9f4b0ac704e7dfebde227d2c080f8f560d9288"} Dec 05 12:09:40 crc kubenswrapper[4763]: I1205 12:09:40.514444 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4b6qz" event={"ID":"165a37f9-fedd-4880-b696-819f0af56cda","Type":"ContainerStarted","Data":"1a09dec17d4e46bb2fb8d566192d386f07a460895f8ecac7ee8680d1d73f324d"} Dec 05 12:09:40 crc kubenswrapper[4763]: I1205 12:09:40.519632 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9d85d47c-nm8gv" event={"ID":"3aa52597-de66-4326-bf14-25fe93642499","Type":"ContainerStarted","Data":"24e5190492e158e7704c1df0e7b01853d76379fd4da442ff122d3cb85d5a078c"} Dec 05 12:09:40 crc kubenswrapper[4763]: I1205 12:09:40.533996 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-wgf85"] Dec 05 12:09:40 crc kubenswrapper[4763]: I1205 12:09:40.541151 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5d6ccfb9d9-bqng6"] Dec 05 12:09:40 crc kubenswrapper[4763]: I1205 12:09:40.644927 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 12:09:40 crc kubenswrapper[4763]: W1205 12:09:40.660910 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb8f26ac_3463_4d42_936d_420cfdbd81eb.slice/crio-4a61f0da24f297ec4f874174ec449d5d0a457807bfcf25f4e852279f2dd871a4 WatchSource:0}: Error finding container 
4a61f0da24f297ec4f874174ec449d5d0a457807bfcf25f4e852279f2dd871a4: Status 404 returned error can't find the container with id 4a61f0da24f297ec4f874174ec449d5d0a457807bfcf25f4e852279f2dd871a4 Dec 05 12:09:40 crc kubenswrapper[4763]: I1205 12:09:40.815114 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-646fbb8979-kfn2p"] Dec 05 12:09:40 crc kubenswrapper[4763]: I1205 12:09:40.828379 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-nvqn2"] Dec 05 12:09:41 crc kubenswrapper[4763]: I1205 12:09:41.012857 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-x6zwz"] Dec 05 12:09:41 crc kubenswrapper[4763]: I1205 12:09:41.022190 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6ffb94d8ff-ls64w"] Dec 05 12:09:41 crc kubenswrapper[4763]: W1205 12:09:41.032005 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod274fe292_e3f0_432c_9947_3bca5514f6d9.slice/crio-ec890f219fd28c2eab4b66082eab976d4beead847cb4e84871fdef0a775fd020 WatchSource:0}: Error finding container ec890f219fd28c2eab4b66082eab976d4beead847cb4e84871fdef0a775fd020: Status 404 returned error can't find the container with id ec890f219fd28c2eab4b66082eab976d4beead847cb4e84871fdef0a775fd020 Dec 05 12:09:41 crc kubenswrapper[4763]: I1205 12:09:41.119965 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 05 12:09:41 crc kubenswrapper[4763]: I1205 12:09:41.412664 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5d6ccfb9d9-bqng6"] Dec 05 12:09:41 crc kubenswrapper[4763]: I1205 12:09:41.456339 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-774db4cc99-rvp4p"] Dec 05 12:09:41 crc kubenswrapper[4763]: I1205 12:09:41.458204 4763 util.go:30] "No sandbox for pod can be found. 
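[Editor's note] The W-level manager.go:1169 warnings above come from cAdvisor noticing a new crio-<id> cgroup before the container runtime has registered the container, so its status lookup returns 404; at pod startup this is a benign race, which is why each one is followed shortly by a ContainerStarted event for the same ID. The crio cgroup name embeds the 64-hex container ID; a small helper (mine, not cAdvisor's) to pull those IDs out of such warning lines:

```python
import re

# Extract container IDs from "Failed to process watch event ... Status 404" lines.
CRIO_ID = re.compile(r"crio-([0-9a-f]{64})")

def watch_event_ids(lines):
    ids = set()
    for line in lines:
        if "Failed to process watch event" in line:
            m = CRIO_ID.search(line)
            if m:
                ids.add(m.group(1))
    return ids
```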
Need to start a new one" pod="openstack/horizon-774db4cc99-rvp4p" Dec 05 12:09:41 crc kubenswrapper[4763]: I1205 12:09:41.479080 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-774db4cc99-rvp4p"] Dec 05 12:09:41 crc kubenswrapper[4763]: I1205 12:09:41.555951 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8979c4b-8aad-4db5-ae70-b695188b6079-scripts\") pod \"horizon-774db4cc99-rvp4p\" (UID: \"e8979c4b-8aad-4db5-ae70-b695188b6079\") " pod="openstack/horizon-774db4cc99-rvp4p" Dec 05 12:09:41 crc kubenswrapper[4763]: I1205 12:09:41.556008 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e8979c4b-8aad-4db5-ae70-b695188b6079-horizon-secret-key\") pod \"horizon-774db4cc99-rvp4p\" (UID: \"e8979c4b-8aad-4db5-ae70-b695188b6079\") " pod="openstack/horizon-774db4cc99-rvp4p" Dec 05 12:09:41 crc kubenswrapper[4763]: I1205 12:09:41.556032 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e8979c4b-8aad-4db5-ae70-b695188b6079-config-data\") pod \"horizon-774db4cc99-rvp4p\" (UID: \"e8979c4b-8aad-4db5-ae70-b695188b6079\") " pod="openstack/horizon-774db4cc99-rvp4p" Dec 05 12:09:41 crc kubenswrapper[4763]: I1205 12:09:41.556065 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8979c4b-8aad-4db5-ae70-b695188b6079-logs\") pod \"horizon-774db4cc99-rvp4p\" (UID: \"e8979c4b-8aad-4db5-ae70-b695188b6079\") " pod="openstack/horizon-774db4cc99-rvp4p" Dec 05 12:09:41 crc kubenswrapper[4763]: I1205 12:09:41.556106 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djgh6\" (UniqueName: \"kubernetes.io/projected/e8979c4b-8aad-4db5-ae70-b695188b6079-kube-api-access-djgh6\") pod \"horizon-774db4cc99-rvp4p\" (UID: \"e8979c4b-8aad-4db5-ae70-b695188b6079\") " pod="openstack/horizon-774db4cc99-rvp4p" Dec 05 12:09:41 crc kubenswrapper[4763]: I1205 12:09:41.569289 4763 generic.go:334] "Generic (PLEG): container finished" podID="3aa52597-de66-4326-bf14-25fe93642499" containerID="3aa995bb803acc474929d923482294b4a0475a7f2e7a76e45a364934cf6ad87d" exitCode=0 Dec 05 12:09:41 crc kubenswrapper[4763]: I1205 12:09:41.569373 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9d85d47c-nm8gv" event={"ID":"3aa52597-de66-4326-bf14-25fe93642499","Type":"ContainerDied","Data":"3aa995bb803acc474929d923482294b4a0475a7f2e7a76e45a364934cf6ad87d"} Dec 05 12:09:41 crc kubenswrapper[4763]: I1205 12:09:41.585946 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wgf85" event={"ID":"1f3d2c51-7840-4854-af9e-e0da6c484074","Type":"ContainerStarted","Data":"07047094bb3dee3d5aded49846bd8646907be1c814de1542fc99c916da925e27"} Dec 05 12:09:41 crc kubenswrapper[4763]: I1205 12:09:41.596144 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-nvqn2" event={"ID":"b2bf0108-5266-4f52-8803-39a842ddc777","Type":"ContainerStarted","Data":"cc76d5fdfc04a366476490a65c03203688e95857b0ead2447fa481241e7d429c"} Dec 05 12:09:41 crc kubenswrapper[4763]: I1205 12:09:41.637702 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4b6qz" 
event={"ID":"165a37f9-fedd-4880-b696-819f0af56cda","Type":"ContainerStarted","Data":"af09e34a4d4036881af595378a01e6869cd5e9c873a6faf16623dcd01918e2bb"} Dec 05 12:09:41 crc kubenswrapper[4763]: I1205 12:09:41.655648 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d6ccfb9d9-bqng6" event={"ID":"a5583b19-1a4c-4b4f-9147-fecc8cd733a1","Type":"ContainerStarted","Data":"f3020dcf02a9eab2791cdf74d40c301fb1a8d08e32c5621708f1238cbf7382d9"} Dec 05 12:09:41 crc kubenswrapper[4763]: I1205 12:09:41.658036 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e8979c4b-8aad-4db5-ae70-b695188b6079-horizon-secret-key\") pod \"horizon-774db4cc99-rvp4p\" (UID: \"e8979c4b-8aad-4db5-ae70-b695188b6079\") " pod="openstack/horizon-774db4cc99-rvp4p" Dec 05 12:09:41 crc kubenswrapper[4763]: I1205 12:09:41.658147 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e8979c4b-8aad-4db5-ae70-b695188b6079-config-data\") pod \"horizon-774db4cc99-rvp4p\" (UID: \"e8979c4b-8aad-4db5-ae70-b695188b6079\") " pod="openstack/horizon-774db4cc99-rvp4p" Dec 05 12:09:41 crc kubenswrapper[4763]: I1205 12:09:41.658190 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8979c4b-8aad-4db5-ae70-b695188b6079-logs\") pod \"horizon-774db4cc99-rvp4p\" (UID: \"e8979c4b-8aad-4db5-ae70-b695188b6079\") " pod="openstack/horizon-774db4cc99-rvp4p" Dec 05 12:09:41 crc kubenswrapper[4763]: I1205 12:09:41.658247 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djgh6\" (UniqueName: \"kubernetes.io/projected/e8979c4b-8aad-4db5-ae70-b695188b6079-kube-api-access-djgh6\") pod \"horizon-774db4cc99-rvp4p\" (UID: \"e8979c4b-8aad-4db5-ae70-b695188b6079\") " pod="openstack/horizon-774db4cc99-rvp4p" Dec 05 12:09:41 crc kubenswrapper[4763]: I1205 12:09:41.658368 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8979c4b-8aad-4db5-ae70-b695188b6079-scripts\") pod \"horizon-774db4cc99-rvp4p\" (UID: \"e8979c4b-8aad-4db5-ae70-b695188b6079\") " pod="openstack/horizon-774db4cc99-rvp4p" Dec 05 12:09:41 crc kubenswrapper[4763]: I1205 12:09:41.659799 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8979c4b-8aad-4db5-ae70-b695188b6079-scripts\") pod \"horizon-774db4cc99-rvp4p\" (UID: \"e8979c4b-8aad-4db5-ae70-b695188b6079\") " pod="openstack/horizon-774db4cc99-rvp4p" Dec 05 12:09:41 crc kubenswrapper[4763]: I1205 12:09:41.661034 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e8979c4b-8aad-4db5-ae70-b695188b6079-config-data\") pod \"horizon-774db4cc99-rvp4p\" (UID: \"e8979c4b-8aad-4db5-ae70-b695188b6079\") " pod="openstack/horizon-774db4cc99-rvp4p" Dec 05 12:09:41 crc kubenswrapper[4763]: I1205 12:09:41.661308 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8979c4b-8aad-4db5-ae70-b695188b6079-logs\") pod \"horizon-774db4cc99-rvp4p\" (UID: \"e8979c4b-8aad-4db5-ae70-b695188b6079\") " pod="openstack/horizon-774db4cc99-rvp4p" Dec 05 12:09:41 crc kubenswrapper[4763]: I1205 12:09:41.674445 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e8979c4b-8aad-4db5-ae70-b695188b6079-horizon-secret-key\") pod \"horizon-774db4cc99-rvp4p\" (UID: \"e8979c4b-8aad-4db5-ae70-b695188b6079\") " pod="openstack/horizon-774db4cc99-rvp4p" Dec 05 12:09:41 crc kubenswrapper[4763]: I1205 12:09:41.677050 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-4b6qz" podStartSLOduration=3.677030414 podStartE2EDuration="3.677030414s" podCreationTimestamp="2025-12-05 12:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:09:41.661340331 +0000 UTC m=+1266.154055054" watchObservedRunningTime="2025-12-05 12:09:41.677030414 +0000 UTC m=+1266.169745147" Dec 05 12:09:41 crc kubenswrapper[4763]: I1205 12:09:41.688049 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djgh6\" (UniqueName: \"kubernetes.io/projected/e8979c4b-8aad-4db5-ae70-b695188b6079-kube-api-access-djgh6\") pod \"horizon-774db4cc99-rvp4p\" (UID: \"e8979c4b-8aad-4db5-ae70-b695188b6079\") " pod="openstack/horizon-774db4cc99-rvp4p" Dec 05 12:09:41 crc kubenswrapper[4763]: I1205 12:09:41.688166 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffb94d8ff-ls64w" event={"ID":"7d2c0217-45ff-4c92-af09-ece49c97a9d4","Type":"ContainerStarted","Data":"cfe5c8df06903750fd05c4152f6d6bccd9da1375abf00329b72234dcbf0cac28"} Dec 05 12:09:41 crc kubenswrapper[4763]: I1205 12:09:41.688228 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffb94d8ff-ls64w" event={"ID":"7d2c0217-45ff-4c92-af09-ece49c97a9d4","Type":"ContainerStarted","Data":"ccc29b8081230c4b9c91f1aeb1da72b754b9f20ba80aae41cea0993c29c3f396"} Dec 05 12:09:41 crc kubenswrapper[4763]: I1205 12:09:41.692534 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-x6zwz" event={"ID":"274fe292-e3f0-432c-9947-3bca5514f6d9","Type":"ContainerStarted","Data":"7b00072b9f7e2f54d96cbe80dd6024fdf58ccdadb25eedc84fe24bc238c8501e"} Dec 05 12:09:41 crc kubenswrapper[4763]: I1205 12:09:41.692568 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-x6zwz" event={"ID":"274fe292-e3f0-432c-9947-3bca5514f6d9","Type":"ContainerStarted","Data":"ec890f219fd28c2eab4b66082eab976d4beead847cb4e84871fdef0a775fd020"} Dec 05 12:09:41 crc kubenswrapper[4763]: I1205 12:09:41.694596 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1851124e-2722-4628-8e5b-63edb828d64a","Type":"ContainerStarted","Data":"cc48489923ba2d2c10fb6ad0bdcdfbaabf61684a8214b5963e1bf27683ab23fd"} Dec 05 12:09:41 crc kubenswrapper[4763]: I1205 12:09:41.695810 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-646fbb8979-kfn2p" event={"ID":"52fb7404-f381-4887-aad1-d761e1997b6a","Type":"ContainerStarted","Data":"9708a8247ab8dc830783e1f66379cebb023642cbffe9b1ba249f755c7bbe7c65"} Dec 05 12:09:41 crc kubenswrapper[4763]: I1205 12:09:41.697968 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb8f26ac-3463-4d42-936d-420cfdbd81eb","Type":"ContainerStarted","Data":"4a61f0da24f297ec4f874174ec449d5d0a457807bfcf25f4e852279f2dd871a4"} Dec 05 12:09:41 crc kubenswrapper[4763]: I1205 12:09:41.764787 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-x6zwz" 
Dec 05 12:09:41 crc kubenswrapper[4763]: I1205 12:09:41.804463 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-774db4cc99-rvp4p"
Dec 05 12:09:42 crc kubenswrapper[4763]: I1205 12:09:42.123553 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 05 12:09:42 crc kubenswrapper[4763]: I1205 12:09:42.267882 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9d85d47c-nm8gv"
Dec 05 12:09:42 crc kubenswrapper[4763]: I1205 12:09:42.385385 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3aa52597-de66-4326-bf14-25fe93642499-config\") pod \"3aa52597-de66-4326-bf14-25fe93642499\" (UID: \"3aa52597-de66-4326-bf14-25fe93642499\") "
Dec 05 12:09:42 crc kubenswrapper[4763]: I1205 12:09:42.385513 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3aa52597-de66-4326-bf14-25fe93642499-ovsdbserver-sb\") pod \"3aa52597-de66-4326-bf14-25fe93642499\" (UID: \"3aa52597-de66-4326-bf14-25fe93642499\") "
Dec 05 12:09:42 crc kubenswrapper[4763]: I1205 12:09:42.385591 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3aa52597-de66-4326-bf14-25fe93642499-dns-svc\") pod \"3aa52597-de66-4326-bf14-25fe93642499\" (UID: \"3aa52597-de66-4326-bf14-25fe93642499\") "
Dec 05 12:09:42 crc kubenswrapper[4763]: I1205 12:09:42.385654 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdngk\" (UniqueName: \"kubernetes.io/projected/3aa52597-de66-4326-bf14-25fe93642499-kube-api-access-tdngk\") pod \"3aa52597-de66-4326-bf14-25fe93642499\" (UID: \"3aa52597-de66-4326-bf14-25fe93642499\") "
Dec 05 12:09:42 crc kubenswrapper[4763]: I1205 12:09:42.385689 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3aa52597-de66-4326-bf14-25fe93642499-ovsdbserver-nb\") pod \"3aa52597-de66-4326-bf14-25fe93642499\" (UID: \"3aa52597-de66-4326-bf14-25fe93642499\") "
Dec 05 12:09:42 crc kubenswrapper[4763]: I1205 12:09:42.438142 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3aa52597-de66-4326-bf14-25fe93642499-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3aa52597-de66-4326-bf14-25fe93642499" (UID: "3aa52597-de66-4326-bf14-25fe93642499"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 12:09:42 crc kubenswrapper[4763]: I1205 12:09:42.470498 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aa52597-de66-4326-bf14-25fe93642499-kube-api-access-tdngk" (OuterVolumeSpecName: "kube-api-access-tdngk") pod "3aa52597-de66-4326-bf14-25fe93642499" (UID: "3aa52597-de66-4326-bf14-25fe93642499"). InnerVolumeSpecName "kube-api-access-tdngk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 12:09:42 crc kubenswrapper[4763]: I1205 12:09:42.491237 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdngk\" (UniqueName: \"kubernetes.io/projected/3aa52597-de66-4326-bf14-25fe93642499-kube-api-access-tdngk\") on node \"crc\" DevicePath \"\""
Dec 05 12:09:42 crc kubenswrapper[4763]: I1205 12:09:42.491270 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3aa52597-de66-4326-bf14-25fe93642499-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 05 12:09:42 crc kubenswrapper[4763]: I1205 12:09:42.650049 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3aa52597-de66-4326-bf14-25fe93642499-config" (OuterVolumeSpecName: "config") pod "3aa52597-de66-4326-bf14-25fe93642499" (UID: "3aa52597-de66-4326-bf14-25fe93642499"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 12:09:42 crc kubenswrapper[4763]: I1205 12:09:42.651139 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3aa52597-de66-4326-bf14-25fe93642499-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3aa52597-de66-4326-bf14-25fe93642499" (UID: "3aa52597-de66-4326-bf14-25fe93642499"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 12:09:42 crc kubenswrapper[4763]: I1205 12:09:42.651341 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-774db4cc99-rvp4p"]
Dec 05 12:09:42 crc kubenswrapper[4763]: I1205 12:09:42.656668 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3aa52597-de66-4326-bf14-25fe93642499-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3aa52597-de66-4326-bf14-25fe93642499" (UID: "3aa52597-de66-4326-bf14-25fe93642499"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 12:09:42 crc kubenswrapper[4763]: I1205 12:09:42.705079 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3aa52597-de66-4326-bf14-25fe93642499-config\") on node \"crc\" DevicePath \"\""
Dec 05 12:09:42 crc kubenswrapper[4763]: I1205 12:09:42.705118 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3aa52597-de66-4326-bf14-25fe93642499-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 05 12:09:42 crc kubenswrapper[4763]: I1205 12:09:42.705127 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3aa52597-de66-4326-bf14-25fe93642499-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 05 12:09:42 crc kubenswrapper[4763]: I1205 12:09:42.716288 4763 generic.go:334] "Generic (PLEG): container finished" podID="7d2c0217-45ff-4c92-af09-ece49c97a9d4" containerID="cfe5c8df06903750fd05c4152f6d6bccd9da1375abf00329b72234dcbf0cac28" exitCode=0
Dec 05 12:09:42 crc kubenswrapper[4763]: I1205 12:09:42.716408 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffb94d8ff-ls64w" event={"ID":"7d2c0217-45ff-4c92-af09-ece49c97a9d4","Type":"ContainerDied","Data":"cfe5c8df06903750fd05c4152f6d6bccd9da1375abf00329b72234dcbf0cac28"}
Dec 05 12:09:42 crc kubenswrapper[4763]: I1205 12:09:42.716462 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffb94d8ff-ls64w" event={"ID":"7d2c0217-45ff-4c92-af09-ece49c97a9d4","Type":"ContainerStarted","Data":"2b66e805ebbceef3dbf72b749e8693b9cb4b5b5fa94f85b043a6663092ac559a"}
Dec 05 12:09:42 crc kubenswrapper[4763]: I1205 12:09:42.717627 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6ffb94d8ff-ls64w"
Dec 05 12:09:42 crc kubenswrapper[4763]: I1205 12:09:42.726046 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9d85d47c-nm8gv"
Need to start a new one" pod="openstack/dnsmasq-dns-5c9d85d47c-nm8gv" Dec 05 12:09:42 crc kubenswrapper[4763]: I1205 12:09:42.726199 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9d85d47c-nm8gv" event={"ID":"3aa52597-de66-4326-bf14-25fe93642499","Type":"ContainerDied","Data":"24e5190492e158e7704c1df0e7b01853d76379fd4da442ff122d3cb85d5a078c"} Dec 05 12:09:42 crc kubenswrapper[4763]: I1205 12:09:42.726268 4763 scope.go:117] "RemoveContainer" containerID="3aa995bb803acc474929d923482294b4a0475a7f2e7a76e45a364934cf6ad87d" Dec 05 12:09:42 crc kubenswrapper[4763]: I1205 12:09:42.746432 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6ffb94d8ff-ls64w" podStartSLOduration=3.746412043 podStartE2EDuration="3.746412043s" podCreationTimestamp="2025-12-05 12:09:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:09:42.737061973 +0000 UTC m=+1267.229776696" watchObservedRunningTime="2025-12-05 12:09:42.746412043 +0000 UTC m=+1267.239126776" Dec 05 12:09:42 crc kubenswrapper[4763]: I1205 12:09:42.795601 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-nm8gv"] Dec 05 12:09:42 crc kubenswrapper[4763]: I1205 12:09:42.817514 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-nm8gv"] Dec 05 12:09:43 crc kubenswrapper[4763]: I1205 12:09:43.741033 4763 generic.go:334] "Generic (PLEG): container finished" podID="654b7fdf-0324-40f6-8681-3ba17e042d60" containerID="0b832275b7e0da18c9463bf76cf4ceae0e1a65ceb2d77cae3dc089a42420ee21" exitCode=0 Dec 05 12:09:43 crc kubenswrapper[4763]: I1205 12:09:43.741941 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"654b7fdf-0324-40f6-8681-3ba17e042d60","Type":"ContainerDied","Data":"0b832275b7e0da18c9463bf76cf4ceae0e1a65ceb2d77cae3dc089a42420ee21"} Dec 05 12:09:43 crc kubenswrapper[4763]: I1205 12:09:43.755683 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-774db4cc99-rvp4p" event={"ID":"e8979c4b-8aad-4db5-ae70-b695188b6079","Type":"ContainerStarted","Data":"c61619766c1c417ca092bb3eba83f28db510c9277b08109b0eb8a202e65534f7"} Dec 05 12:09:43 crc kubenswrapper[4763]: I1205 12:09:43.857477 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3aa52597-de66-4326-bf14-25fe93642499" path="/var/lib/kubelet/pods/3aa52597-de66-4326-bf14-25fe93642499/volumes" Dec 05 12:09:45 crc kubenswrapper[4763]: I1205 12:09:45.781920 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1851124e-2722-4628-8e5b-63edb828d64a","Type":"ContainerStarted","Data":"e75fdc9c1fc995e821d68b1914900fea1c981457a61bbf946ddb30b1db25d3d7"} Dec 05 12:09:45 crc kubenswrapper[4763]: I1205 12:09:45.782427 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1851124e-2722-4628-8e5b-63edb828d64a","Type":"ContainerStarted","Data":"293f27bbb7c1443b25272fe2cc318e55f9ef2b77ddb5f42d5675fdbeceb8bda0"} Dec 05 12:09:45 crc kubenswrapper[4763]: I1205 12:09:45.783750 4763 generic.go:334] "Generic (PLEG): container finished" podID="0af45fe0-0c3c-4394-82fb-334e1f6e7cb1" containerID="36ce7c10d382b350c9957a9b4b25836f63146f81b92abf70b2d1054582c8f727" exitCode=0 Dec 05 12:09:45 crc kubenswrapper[4763]: I1205 12:09:45.783826 4763 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/glance-db-sync-vh57r" event={"ID":"0af45fe0-0c3c-4394-82fb-334e1f6e7cb1","Type":"ContainerDied","Data":"36ce7c10d382b350c9957a9b4b25836f63146f81b92abf70b2d1054582c8f727"} Dec 05 12:09:45 crc kubenswrapper[4763]: I1205 12:09:45.788116 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"654b7fdf-0324-40f6-8681-3ba17e042d60","Type":"ContainerStarted","Data":"dafb67c04e6ebac499400ff6e12b412be2b8a6c4f5e05694e2cbf6de669439e4"} Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.289001 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-646fbb8979-kfn2p"] Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.342144 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7dc58bc884-khdbv"] Dec 05 12:09:47 crc kubenswrapper[4763]: E1205 12:09:47.342544 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aa52597-de66-4326-bf14-25fe93642499" containerName="init" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.342563 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aa52597-de66-4326-bf14-25fe93642499" containerName="init" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.342750 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="3aa52597-de66-4326-bf14-25fe93642499" containerName="init" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.343941 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7dc58bc884-khdbv" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.349105 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.350826 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7dc58bc884-khdbv"] Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.402612 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-horizon-secret-key\") pod \"horizon-7dc58bc884-khdbv\" (UID: \"4ba0cbf0-3e4e-4cb0-82b0-179d11937330\") " pod="openstack/horizon-7dc58bc884-khdbv" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.402715 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-config-data\") pod \"horizon-7dc58bc884-khdbv\" (UID: \"4ba0cbf0-3e4e-4cb0-82b0-179d11937330\") " pod="openstack/horizon-7dc58bc884-khdbv" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.402741 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-scripts\") pod \"horizon-7dc58bc884-khdbv\" (UID: \"4ba0cbf0-3e4e-4cb0-82b0-179d11937330\") " pod="openstack/horizon-7dc58bc884-khdbv" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.402805 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-horizon-tls-certs\") pod \"horizon-7dc58bc884-khdbv\" (UID: \"4ba0cbf0-3e4e-4cb0-82b0-179d11937330\") " pod="openstack/horizon-7dc58bc884-khdbv" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.402838 
4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-logs\") pod \"horizon-7dc58bc884-khdbv\" (UID: \"4ba0cbf0-3e4e-4cb0-82b0-179d11937330\") " pod="openstack/horizon-7dc58bc884-khdbv" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.402870 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-combined-ca-bundle\") pod \"horizon-7dc58bc884-khdbv\" (UID: \"4ba0cbf0-3e4e-4cb0-82b0-179d11937330\") " pod="openstack/horizon-7dc58bc884-khdbv" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.402894 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z8l8\" (UniqueName: \"kubernetes.io/projected/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-kube-api-access-2z8l8\") pod \"horizon-7dc58bc884-khdbv\" (UID: \"4ba0cbf0-3e4e-4cb0-82b0-179d11937330\") " pod="openstack/horizon-7dc58bc884-khdbv" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.420479 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-774db4cc99-rvp4p"] Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.437435 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-77dcd5c496-hs7bj"] Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.440453 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-77dcd5c496-hs7bj" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.456977 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-77dcd5c496-hs7bj"] Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.504863 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-logs\") pod \"horizon-7dc58bc884-khdbv\" (UID: \"4ba0cbf0-3e4e-4cb0-82b0-179d11937330\") " pod="openstack/horizon-7dc58bc884-khdbv" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.504923 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b34428a2-5423-401a-b7d3-aebd1d070945-logs\") pod \"horizon-77dcd5c496-hs7bj\" (UID: \"b34428a2-5423-401a-b7d3-aebd1d070945\") " pod="openstack/horizon-77dcd5c496-hs7bj" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.504957 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-combined-ca-bundle\") pod \"horizon-7dc58bc884-khdbv\" (UID: \"4ba0cbf0-3e4e-4cb0-82b0-179d11937330\") " pod="openstack/horizon-7dc58bc884-khdbv" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.504985 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b34428a2-5423-401a-b7d3-aebd1d070945-horizon-secret-key\") pod \"horizon-77dcd5c496-hs7bj\" (UID: \"b34428a2-5423-401a-b7d3-aebd1d070945\") " pod="openstack/horizon-77dcd5c496-hs7bj" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.505019 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z8l8\" (UniqueName: 
\"kubernetes.io/projected/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-kube-api-access-2z8l8\") pod \"horizon-7dc58bc884-khdbv\" (UID: \"4ba0cbf0-3e4e-4cb0-82b0-179d11937330\") " pod="openstack/horizon-7dc58bc884-khdbv" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.505140 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6w54\" (UniqueName: \"kubernetes.io/projected/b34428a2-5423-401a-b7d3-aebd1d070945-kube-api-access-w6w54\") pod \"horizon-77dcd5c496-hs7bj\" (UID: \"b34428a2-5423-401a-b7d3-aebd1d070945\") " pod="openstack/horizon-77dcd5c496-hs7bj" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.505463 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-horizon-secret-key\") pod \"horizon-7dc58bc884-khdbv\" (UID: \"4ba0cbf0-3e4e-4cb0-82b0-179d11937330\") " pod="openstack/horizon-7dc58bc884-khdbv" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.505737 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b34428a2-5423-401a-b7d3-aebd1d070945-scripts\") pod \"horizon-77dcd5c496-hs7bj\" (UID: \"b34428a2-5423-401a-b7d3-aebd1d070945\") " pod="openstack/horizon-77dcd5c496-hs7bj" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.505770 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-logs\") pod \"horizon-7dc58bc884-khdbv\" (UID: \"4ba0cbf0-3e4e-4cb0-82b0-179d11937330\") " pod="openstack/horizon-7dc58bc884-khdbv" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.505888 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b34428a2-5423-401a-b7d3-aebd1d070945-horizon-tls-certs\") pod \"horizon-77dcd5c496-hs7bj\" (UID: \"b34428a2-5423-401a-b7d3-aebd1d070945\") " pod="openstack/horizon-77dcd5c496-hs7bj" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.505983 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-config-data\") pod \"horizon-7dc58bc884-khdbv\" (UID: \"4ba0cbf0-3e4e-4cb0-82b0-179d11937330\") " pod="openstack/horizon-7dc58bc884-khdbv" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.506023 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-scripts\") pod \"horizon-7dc58bc884-khdbv\" (UID: \"4ba0cbf0-3e4e-4cb0-82b0-179d11937330\") " pod="openstack/horizon-7dc58bc884-khdbv" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.506058 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b34428a2-5423-401a-b7d3-aebd1d070945-combined-ca-bundle\") pod \"horizon-77dcd5c496-hs7bj\" (UID: \"b34428a2-5423-401a-b7d3-aebd1d070945\") " pod="openstack/horizon-77dcd5c496-hs7bj" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.506169 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-horizon-tls-certs\") pod \"horizon-7dc58bc884-khdbv\" (UID: \"4ba0cbf0-3e4e-4cb0-82b0-179d11937330\") " pod="openstack/horizon-7dc58bc884-khdbv" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.506208 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b34428a2-5423-401a-b7d3-aebd1d070945-config-data\") pod \"horizon-77dcd5c496-hs7bj\" (UID: \"b34428a2-5423-401a-b7d3-aebd1d070945\") " pod="openstack/horizon-77dcd5c496-hs7bj" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.506991 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-scripts\") pod \"horizon-7dc58bc884-khdbv\" (UID: \"4ba0cbf0-3e4e-4cb0-82b0-179d11937330\") " pod="openstack/horizon-7dc58bc884-khdbv" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.507802 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-config-data\") pod \"horizon-7dc58bc884-khdbv\" (UID: \"4ba0cbf0-3e4e-4cb0-82b0-179d11937330\") " pod="openstack/horizon-7dc58bc884-khdbv" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.608263 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b34428a2-5423-401a-b7d3-aebd1d070945-scripts\") pod \"horizon-77dcd5c496-hs7bj\" (UID: \"b34428a2-5423-401a-b7d3-aebd1d070945\") " pod="openstack/horizon-77dcd5c496-hs7bj" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.608314 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b34428a2-5423-401a-b7d3-aebd1d070945-horizon-tls-certs\") pod \"horizon-77dcd5c496-hs7bj\" (UID: \"b34428a2-5423-401a-b7d3-aebd1d070945\") " pod="openstack/horizon-77dcd5c496-hs7bj" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.608369 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b34428a2-5423-401a-b7d3-aebd1d070945-combined-ca-bundle\") pod \"horizon-77dcd5c496-hs7bj\" (UID: \"b34428a2-5423-401a-b7d3-aebd1d070945\") " pod="openstack/horizon-77dcd5c496-hs7bj" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.608420 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b34428a2-5423-401a-b7d3-aebd1d070945-config-data\") pod \"horizon-77dcd5c496-hs7bj\" (UID: \"b34428a2-5423-401a-b7d3-aebd1d070945\") " pod="openstack/horizon-77dcd5c496-hs7bj" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.608444 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b34428a2-5423-401a-b7d3-aebd1d070945-logs\") pod \"horizon-77dcd5c496-hs7bj\" (UID: \"b34428a2-5423-401a-b7d3-aebd1d070945\") " pod="openstack/horizon-77dcd5c496-hs7bj" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.608470 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b34428a2-5423-401a-b7d3-aebd1d070945-horizon-secret-key\") pod \"horizon-77dcd5c496-hs7bj\" (UID: \"b34428a2-5423-401a-b7d3-aebd1d070945\") " 
pod="openstack/horizon-77dcd5c496-hs7bj" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.608498 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6w54\" (UniqueName: \"kubernetes.io/projected/b34428a2-5423-401a-b7d3-aebd1d070945-kube-api-access-w6w54\") pod \"horizon-77dcd5c496-hs7bj\" (UID: \"b34428a2-5423-401a-b7d3-aebd1d070945\") " pod="openstack/horizon-77dcd5c496-hs7bj" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.609915 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b34428a2-5423-401a-b7d3-aebd1d070945-scripts\") pod \"horizon-77dcd5c496-hs7bj\" (UID: \"b34428a2-5423-401a-b7d3-aebd1d070945\") " pod="openstack/horizon-77dcd5c496-hs7bj" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.610783 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b34428a2-5423-401a-b7d3-aebd1d070945-logs\") pod \"horizon-77dcd5c496-hs7bj\" (UID: \"b34428a2-5423-401a-b7d3-aebd1d070945\") " pod="openstack/horizon-77dcd5c496-hs7bj" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.611563 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b34428a2-5423-401a-b7d3-aebd1d070945-config-data\") pod \"horizon-77dcd5c496-hs7bj\" (UID: \"b34428a2-5423-401a-b7d3-aebd1d070945\") " pod="openstack/horizon-77dcd5c496-hs7bj" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.672615 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-combined-ca-bundle\") pod \"horizon-7dc58bc884-khdbv\" (UID: \"4ba0cbf0-3e4e-4cb0-82b0-179d11937330\") " pod="openstack/horizon-7dc58bc884-khdbv" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.672931 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-horizon-tls-certs\") pod \"horizon-7dc58bc884-khdbv\" (UID: \"4ba0cbf0-3e4e-4cb0-82b0-179d11937330\") " pod="openstack/horizon-7dc58bc884-khdbv" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.673121 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-horizon-secret-key\") pod \"horizon-7dc58bc884-khdbv\" (UID: \"4ba0cbf0-3e4e-4cb0-82b0-179d11937330\") " pod="openstack/horizon-7dc58bc884-khdbv" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.674234 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b34428a2-5423-401a-b7d3-aebd1d070945-horizon-tls-certs\") pod \"horizon-77dcd5c496-hs7bj\" (UID: \"b34428a2-5423-401a-b7d3-aebd1d070945\") " pod="openstack/horizon-77dcd5c496-hs7bj" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.674260 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z8l8\" (UniqueName: \"kubernetes.io/projected/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-kube-api-access-2z8l8\") pod \"horizon-7dc58bc884-khdbv\" (UID: \"4ba0cbf0-3e4e-4cb0-82b0-179d11937330\") " pod="openstack/horizon-7dc58bc884-khdbv" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.674627 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b34428a2-5423-401a-b7d3-aebd1d070945-horizon-secret-key\") pod \"horizon-77dcd5c496-hs7bj\" (UID: \"b34428a2-5423-401a-b7d3-aebd1d070945\") " pod="openstack/horizon-77dcd5c496-hs7bj" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.681971 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b34428a2-5423-401a-b7d3-aebd1d070945-combined-ca-bundle\") pod \"horizon-77dcd5c496-hs7bj\" (UID: \"b34428a2-5423-401a-b7d3-aebd1d070945\") " pod="openstack/horizon-77dcd5c496-hs7bj" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.682350 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6w54\" (UniqueName: \"kubernetes.io/projected/b34428a2-5423-401a-b7d3-aebd1d070945-kube-api-access-w6w54\") pod \"horizon-77dcd5c496-hs7bj\" (UID: \"b34428a2-5423-401a-b7d3-aebd1d070945\") " pod="openstack/horizon-77dcd5c496-hs7bj" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.693315 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7dc58bc884-khdbv" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.770730 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-77dcd5c496-hs7bj" Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.835607 4763 generic.go:334] "Generic (PLEG): container finished" podID="165a37f9-fedd-4880-b696-819f0af56cda" containerID="af09e34a4d4036881af595378a01e6869cd5e9c873a6faf16623dcd01918e2bb" exitCode=0 Dec 05 12:09:47 crc kubenswrapper[4763]: I1205 12:09:47.835677 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4b6qz" event={"ID":"165a37f9-fedd-4880-b696-819f0af56cda","Type":"ContainerDied","Data":"af09e34a4d4036881af595378a01e6869cd5e9c873a6faf16623dcd01918e2bb"} Dec 05 12:09:49 crc kubenswrapper[4763]: I1205 12:09:49.856385 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6ffb94d8ff-ls64w" Dec 05 12:09:49 crc kubenswrapper[4763]: I1205 12:09:49.918370 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-2sdtf"] Dec 05 12:09:49 crc kubenswrapper[4763]: I1205 12:09:49.918638 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-2sdtf" podUID="bb42f35f-9a55-470e-b238-98c4c0a5b455" containerName="dnsmasq-dns" containerID="cri-o://3ff5dba6344684bbb6b452f86ef96347b491be9212296ebebccae7232c3eabfe" gracePeriod=10 Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.534071 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-4b6qz" Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.552615 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-vh57r" Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.604280 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/165a37f9-fedd-4880-b696-819f0af56cda-scripts\") pod \"165a37f9-fedd-4880-b696-819f0af56cda\" (UID: \"165a37f9-fedd-4880-b696-819f0af56cda\") " Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.604435 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt4x9\" (UniqueName: \"kubernetes.io/projected/165a37f9-fedd-4880-b696-819f0af56cda-kube-api-access-pt4x9\") pod \"165a37f9-fedd-4880-b696-819f0af56cda\" (UID: \"165a37f9-fedd-4880-b696-819f0af56cda\") " Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.604473 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/165a37f9-fedd-4880-b696-819f0af56cda-credential-keys\") pod \"165a37f9-fedd-4880-b696-819f0af56cda\" (UID: \"165a37f9-fedd-4880-b696-819f0af56cda\") " Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.604519 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/165a37f9-fedd-4880-b696-819f0af56cda-config-data\") pod \"165a37f9-fedd-4880-b696-819f0af56cda\" (UID: \"165a37f9-fedd-4880-b696-819f0af56cda\") " Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.604607 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/165a37f9-fedd-4880-b696-819f0af56cda-fernet-keys\") pod \"165a37f9-fedd-4880-b696-819f0af56cda\" (UID: \"165a37f9-fedd-4880-b696-819f0af56cda\") " Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.604641 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/165a37f9-fedd-4880-b696-819f0af56cda-combined-ca-bundle\") pod \"165a37f9-fedd-4880-b696-819f0af56cda\" (UID: \"165a37f9-fedd-4880-b696-819f0af56cda\") " Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.628518 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/165a37f9-fedd-4880-b696-819f0af56cda-kube-api-access-pt4x9" (OuterVolumeSpecName: "kube-api-access-pt4x9") pod "165a37f9-fedd-4880-b696-819f0af56cda" (UID: "165a37f9-fedd-4880-b696-819f0af56cda"). InnerVolumeSpecName "kube-api-access-pt4x9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.635531 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/165a37f9-fedd-4880-b696-819f0af56cda-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "165a37f9-fedd-4880-b696-819f0af56cda" (UID: "165a37f9-fedd-4880-b696-819f0af56cda"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.659870 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/165a37f9-fedd-4880-b696-819f0af56cda-scripts" (OuterVolumeSpecName: "scripts") pod "165a37f9-fedd-4880-b696-819f0af56cda" (UID: "165a37f9-fedd-4880-b696-819f0af56cda"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.673878 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/165a37f9-fedd-4880-b696-819f0af56cda-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "165a37f9-fedd-4880-b696-819f0af56cda" (UID: "165a37f9-fedd-4880-b696-819f0af56cda"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.695056 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/165a37f9-fedd-4880-b696-819f0af56cda-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "165a37f9-fedd-4880-b696-819f0af56cda" (UID: "165a37f9-fedd-4880-b696-819f0af56cda"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.710260 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dbp8\" (UniqueName: \"kubernetes.io/projected/0af45fe0-0c3c-4394-82fb-334e1f6e7cb1-kube-api-access-7dbp8\") pod \"0af45fe0-0c3c-4394-82fb-334e1f6e7cb1\" (UID: \"0af45fe0-0c3c-4394-82fb-334e1f6e7cb1\") " Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.710869 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0af45fe0-0c3c-4394-82fb-334e1f6e7cb1-config-data\") pod \"0af45fe0-0c3c-4394-82fb-334e1f6e7cb1\" (UID: \"0af45fe0-0c3c-4394-82fb-334e1f6e7cb1\") " Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.712573 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0af45fe0-0c3c-4394-82fb-334e1f6e7cb1-db-sync-config-data\") pod \"0af45fe0-0c3c-4394-82fb-334e1f6e7cb1\" (UID: \"0af45fe0-0c3c-4394-82fb-334e1f6e7cb1\") " Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.714623 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0af45fe0-0c3c-4394-82fb-334e1f6e7cb1-combined-ca-bundle\") pod \"0af45fe0-0c3c-4394-82fb-334e1f6e7cb1\" (UID: \"0af45fe0-0c3c-4394-82fb-334e1f6e7cb1\") " Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.716356 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/165a37f9-fedd-4880-b696-819f0af56cda-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.717806 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/165a37f9-fedd-4880-b696-819f0af56cda-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.717960 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt4x9\" (UniqueName: \"kubernetes.io/projected/165a37f9-fedd-4880-b696-819f0af56cda-kube-api-access-pt4x9\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.718323 4763 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/165a37f9-fedd-4880-b696-819f0af56cda-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.718718 4763 reconciler_common.go:293] "Volume detached for 
volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/165a37f9-fedd-4880-b696-819f0af56cda-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.721085 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0af45fe0-0c3c-4394-82fb-334e1f6e7cb1-kube-api-access-7dbp8" (OuterVolumeSpecName: "kube-api-access-7dbp8") pod "0af45fe0-0c3c-4394-82fb-334e1f6e7cb1" (UID: "0af45fe0-0c3c-4394-82fb-334e1f6e7cb1"). InnerVolumeSpecName "kube-api-access-7dbp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.722013 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0af45fe0-0c3c-4394-82fb-334e1f6e7cb1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0af45fe0-0c3c-4394-82fb-334e1f6e7cb1" (UID: "0af45fe0-0c3c-4394-82fb-334e1f6e7cb1"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.725686 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/165a37f9-fedd-4880-b696-819f0af56cda-config-data" (OuterVolumeSpecName: "config-data") pod "165a37f9-fedd-4880-b696-819f0af56cda" (UID: "165a37f9-fedd-4880-b696-819f0af56cda"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.774701 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0af45fe0-0c3c-4394-82fb-334e1f6e7cb1-config-data" (OuterVolumeSpecName: "config-data") pod "0af45fe0-0c3c-4394-82fb-334e1f6e7cb1" (UID: "0af45fe0-0c3c-4394-82fb-334e1f6e7cb1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.775024 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0af45fe0-0c3c-4394-82fb-334e1f6e7cb1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0af45fe0-0c3c-4394-82fb-334e1f6e7cb1" (UID: "0af45fe0-0c3c-4394-82fb-334e1f6e7cb1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.820838 4763 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0af45fe0-0c3c-4394-82fb-334e1f6e7cb1-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.820867 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0af45fe0-0c3c-4394-82fb-334e1f6e7cb1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.820880 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/165a37f9-fedd-4880-b696-819f0af56cda-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.820892 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dbp8\" (UniqueName: \"kubernetes.io/projected/0af45fe0-0c3c-4394-82fb-334e1f6e7cb1-kube-api-access-7dbp8\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.820903 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0af45fe0-0c3c-4394-82fb-334e1f6e7cb1-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.866863 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vh57r" event={"ID":"0af45fe0-0c3c-4394-82fb-334e1f6e7cb1","Type":"ContainerDied","Data":"8824f4371f1c6e0fb9a1d46f14e7bdda180742d33b0d81e3ade5c033c716dc87"} Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.866906 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-vh57r" Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.866912 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8824f4371f1c6e0fb9a1d46f14e7bdda180742d33b0d81e3ade5c033c716dc87" Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.869399 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-4b6qz" Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.869390 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4b6qz" event={"ID":"165a37f9-fedd-4880-b696-819f0af56cda","Type":"ContainerDied","Data":"1a09dec17d4e46bb2fb8d566192d386f07a460895f8ecac7ee8680d1d73f324d"} Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.869598 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a09dec17d4e46bb2fb8d566192d386f07a460895f8ecac7ee8680d1d73f324d" Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.875773 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"654b7fdf-0324-40f6-8681-3ba17e042d60","Type":"ContainerStarted","Data":"672c3492edc17d9cf373e59122f1df173d5e7f05adb36366522e24aadaf70f9d"} Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.878681 4763 generic.go:334] "Generic (PLEG): container finished" podID="bb42f35f-9a55-470e-b238-98c4c0a5b455" containerID="3ff5dba6344684bbb6b452f86ef96347b491be9212296ebebccae7232c3eabfe" exitCode=0 Dec 05 12:09:50 crc kubenswrapper[4763]: I1205 12:09:50.878797 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-2sdtf" event={"ID":"bb42f35f-9a55-470e-b238-98c4c0a5b455","Type":"ContainerDied","Data":"3ff5dba6344684bbb6b452f86ef96347b491be9212296ebebccae7232c3eabfe"} Dec 05 12:09:51 crc kubenswrapper[4763]: I1205 12:09:51.663558 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-4b6qz"] Dec 05 12:09:51 crc kubenswrapper[4763]: I1205 12:09:51.673095 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-4b6qz"] Dec 05 12:09:51 crc kubenswrapper[4763]: I1205 12:09:51.774399 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-tg56v"] Dec 05 12:09:51 crc kubenswrapper[4763]: E1205 12:09:51.775123 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0af45fe0-0c3c-4394-82fb-334e1f6e7cb1" containerName="glance-db-sync" Dec 05 12:09:51 crc kubenswrapper[4763]: I1205 12:09:51.775359 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0af45fe0-0c3c-4394-82fb-334e1f6e7cb1" containerName="glance-db-sync" Dec 05 12:09:51 crc kubenswrapper[4763]: E1205 12:09:51.775489 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="165a37f9-fedd-4880-b696-819f0af56cda" containerName="keystone-bootstrap" Dec 05 12:09:51 crc kubenswrapper[4763]: I1205 12:09:51.775568 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="165a37f9-fedd-4880-b696-819f0af56cda" containerName="keystone-bootstrap" Dec 05 12:09:51 crc kubenswrapper[4763]: I1205 12:09:51.775870 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="0af45fe0-0c3c-4394-82fb-334e1f6e7cb1" containerName="glance-db-sync" Dec 05 12:09:51 crc kubenswrapper[4763]: I1205 12:09:51.775962 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="165a37f9-fedd-4880-b696-819f0af56cda" containerName="keystone-bootstrap" Dec 05 12:09:51 crc kubenswrapper[4763]: I1205 12:09:51.776814 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-tg56v" Dec 05 12:09:51 crc kubenswrapper[4763]: I1205 12:09:51.779857 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-gw888" Dec 05 12:09:51 crc kubenswrapper[4763]: I1205 12:09:51.780858 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 12:09:51 crc kubenswrapper[4763]: I1205 12:09:51.781220 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 12:09:51 crc kubenswrapper[4763]: I1205 12:09:51.781482 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 05 12:09:51 crc kubenswrapper[4763]: I1205 12:09:51.781815 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 12:09:51 crc kubenswrapper[4763]: I1205 12:09:51.803321 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="165a37f9-fedd-4880-b696-819f0af56cda" path="/var/lib/kubelet/pods/165a37f9-fedd-4880-b696-819f0af56cda/volumes" Dec 05 12:09:51 crc kubenswrapper[4763]: I1205 12:09:51.804418 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-tg56v"] Dec 05 12:09:51 crc kubenswrapper[4763]: I1205 12:09:51.948554 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6x4t\" (UniqueName: \"kubernetes.io/projected/5d08538d-45d6-4f05-81a6-60ecc26dc593-kube-api-access-s6x4t\") pod \"keystone-bootstrap-tg56v\" (UID: \"5d08538d-45d6-4f05-81a6-60ecc26dc593\") " pod="openstack/keystone-bootstrap-tg56v" Dec 05 12:09:51 crc kubenswrapper[4763]: I1205 12:09:51.948628 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d08538d-45d6-4f05-81a6-60ecc26dc593-combined-ca-bundle\") pod \"keystone-bootstrap-tg56v\" (UID: \"5d08538d-45d6-4f05-81a6-60ecc26dc593\") " pod="openstack/keystone-bootstrap-tg56v" Dec 05 12:09:51 crc kubenswrapper[4763]: I1205 12:09:51.948661 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d08538d-45d6-4f05-81a6-60ecc26dc593-config-data\") pod \"keystone-bootstrap-tg56v\" (UID: \"5d08538d-45d6-4f05-81a6-60ecc26dc593\") " pod="openstack/keystone-bootstrap-tg56v" Dec 05 12:09:51 crc kubenswrapper[4763]: I1205 12:09:51.948692 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d08538d-45d6-4f05-81a6-60ecc26dc593-scripts\") pod \"keystone-bootstrap-tg56v\" (UID: \"5d08538d-45d6-4f05-81a6-60ecc26dc593\") " pod="openstack/keystone-bootstrap-tg56v" Dec 05 12:09:51 crc kubenswrapper[4763]: I1205 12:09:51.948814 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5d08538d-45d6-4f05-81a6-60ecc26dc593-credential-keys\") pod \"keystone-bootstrap-tg56v\" (UID: \"5d08538d-45d6-4f05-81a6-60ecc26dc593\") " pod="openstack/keystone-bootstrap-tg56v" Dec 05 12:09:51 crc kubenswrapper[4763]: I1205 12:09:51.948910 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5d08538d-45d6-4f05-81a6-60ecc26dc593-fernet-keys\") pod 
\"keystone-bootstrap-tg56v\" (UID: \"5d08538d-45d6-4f05-81a6-60ecc26dc593\") " pod="openstack/keystone-bootstrap-tg56v" Dec 05 12:09:52 crc kubenswrapper[4763]: I1205 12:09:52.050593 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5d08538d-45d6-4f05-81a6-60ecc26dc593-fernet-keys\") pod \"keystone-bootstrap-tg56v\" (UID: \"5d08538d-45d6-4f05-81a6-60ecc26dc593\") " pod="openstack/keystone-bootstrap-tg56v" Dec 05 12:09:52 crc kubenswrapper[4763]: I1205 12:09:52.050737 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6x4t\" (UniqueName: \"kubernetes.io/projected/5d08538d-45d6-4f05-81a6-60ecc26dc593-kube-api-access-s6x4t\") pod \"keystone-bootstrap-tg56v\" (UID: \"5d08538d-45d6-4f05-81a6-60ecc26dc593\") " pod="openstack/keystone-bootstrap-tg56v" Dec 05 12:09:52 crc kubenswrapper[4763]: I1205 12:09:52.050805 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d08538d-45d6-4f05-81a6-60ecc26dc593-combined-ca-bundle\") pod \"keystone-bootstrap-tg56v\" (UID: \"5d08538d-45d6-4f05-81a6-60ecc26dc593\") " pod="openstack/keystone-bootstrap-tg56v" Dec 05 12:09:52 crc kubenswrapper[4763]: I1205 12:09:52.050844 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d08538d-45d6-4f05-81a6-60ecc26dc593-config-data\") pod \"keystone-bootstrap-tg56v\" (UID: \"5d08538d-45d6-4f05-81a6-60ecc26dc593\") " pod="openstack/keystone-bootstrap-tg56v" Dec 05 12:09:52 crc kubenswrapper[4763]: I1205 12:09:52.050876 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d08538d-45d6-4f05-81a6-60ecc26dc593-scripts\") pod \"keystone-bootstrap-tg56v\" (UID: \"5d08538d-45d6-4f05-81a6-60ecc26dc593\") " pod="openstack/keystone-bootstrap-tg56v" Dec 05 12:09:52 crc kubenswrapper[4763]: I1205 12:09:52.050944 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5d08538d-45d6-4f05-81a6-60ecc26dc593-credential-keys\") pod \"keystone-bootstrap-tg56v\" (UID: \"5d08538d-45d6-4f05-81a6-60ecc26dc593\") " pod="openstack/keystone-bootstrap-tg56v" Dec 05 12:09:52 crc kubenswrapper[4763]: I1205 12:09:52.062441 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d08538d-45d6-4f05-81a6-60ecc26dc593-scripts\") pod \"keystone-bootstrap-tg56v\" (UID: \"5d08538d-45d6-4f05-81a6-60ecc26dc593\") " pod="openstack/keystone-bootstrap-tg56v" Dec 05 12:09:52 crc kubenswrapper[4763]: I1205 12:09:52.062702 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5d08538d-45d6-4f05-81a6-60ecc26dc593-fernet-keys\") pod \"keystone-bootstrap-tg56v\" (UID: \"5d08538d-45d6-4f05-81a6-60ecc26dc593\") " pod="openstack/keystone-bootstrap-tg56v" Dec 05 12:09:52 crc kubenswrapper[4763]: I1205 12:09:52.068399 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5d08538d-45d6-4f05-81a6-60ecc26dc593-credential-keys\") pod \"keystone-bootstrap-tg56v\" (UID: \"5d08538d-45d6-4f05-81a6-60ecc26dc593\") " pod="openstack/keystone-bootstrap-tg56v" Dec 05 12:09:52 crc kubenswrapper[4763]: I1205 12:09:52.080708 4763 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d08538d-45d6-4f05-81a6-60ecc26dc593-combined-ca-bundle\") pod \"keystone-bootstrap-tg56v\" (UID: \"5d08538d-45d6-4f05-81a6-60ecc26dc593\") " pod="openstack/keystone-bootstrap-tg56v" Dec 05 12:09:52 crc kubenswrapper[4763]: I1205 12:09:52.091631 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56798b757f-v7nqq"] Dec 05 12:09:52 crc kubenswrapper[4763]: I1205 12:09:52.092549 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d08538d-45d6-4f05-81a6-60ecc26dc593-config-data\") pod \"keystone-bootstrap-tg56v\" (UID: \"5d08538d-45d6-4f05-81a6-60ecc26dc593\") " pod="openstack/keystone-bootstrap-tg56v" Dec 05 12:09:52 crc kubenswrapper[4763]: I1205 12:09:52.091682 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6x4t\" (UniqueName: \"kubernetes.io/projected/5d08538d-45d6-4f05-81a6-60ecc26dc593-kube-api-access-s6x4t\") pod \"keystone-bootstrap-tg56v\" (UID: \"5d08538d-45d6-4f05-81a6-60ecc26dc593\") " pod="openstack/keystone-bootstrap-tg56v" Dec 05 12:09:52 crc kubenswrapper[4763]: I1205 12:09:52.101720 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56798b757f-v7nqq" Dec 05 12:09:52 crc kubenswrapper[4763]: I1205 12:09:52.102393 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-tg56v" Dec 05 12:09:52 crc kubenswrapper[4763]: I1205 12:09:52.148341 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56798b757f-v7nqq"] Dec 05 12:09:52 crc kubenswrapper[4763]: I1205 12:09:52.262285 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e2b756b-c7db-44c5-97d8-b906e3d01e21-ovsdbserver-nb\") pod \"dnsmasq-dns-56798b757f-v7nqq\" (UID: \"7e2b756b-c7db-44c5-97d8-b906e3d01e21\") " pod="openstack/dnsmasq-dns-56798b757f-v7nqq" Dec 05 12:09:52 crc kubenswrapper[4763]: I1205 12:09:52.262348 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e2b756b-c7db-44c5-97d8-b906e3d01e21-ovsdbserver-sb\") pod \"dnsmasq-dns-56798b757f-v7nqq\" (UID: \"7e2b756b-c7db-44c5-97d8-b906e3d01e21\") " pod="openstack/dnsmasq-dns-56798b757f-v7nqq" Dec 05 12:09:52 crc kubenswrapper[4763]: I1205 12:09:52.262415 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e2b756b-c7db-44c5-97d8-b906e3d01e21-dns-svc\") pod \"dnsmasq-dns-56798b757f-v7nqq\" (UID: \"7e2b756b-c7db-44c5-97d8-b906e3d01e21\") " pod="openstack/dnsmasq-dns-56798b757f-v7nqq" Dec 05 12:09:52 crc kubenswrapper[4763]: I1205 12:09:52.262495 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e2b756b-c7db-44c5-97d8-b906e3d01e21-config\") pod \"dnsmasq-dns-56798b757f-v7nqq\" (UID: \"7e2b756b-c7db-44c5-97d8-b906e3d01e21\") " pod="openstack/dnsmasq-dns-56798b757f-v7nqq" Dec 05 12:09:52 crc kubenswrapper[4763]: I1205 12:09:52.262585 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pddpf\" (UniqueName: 
\"kubernetes.io/projected/7e2b756b-c7db-44c5-97d8-b906e3d01e21-kube-api-access-pddpf\") pod \"dnsmasq-dns-56798b757f-v7nqq\" (UID: \"7e2b756b-c7db-44c5-97d8-b906e3d01e21\") " pod="openstack/dnsmasq-dns-56798b757f-v7nqq" Dec 05 12:09:52 crc kubenswrapper[4763]: I1205 12:09:52.364242 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e2b756b-c7db-44c5-97d8-b906e3d01e21-ovsdbserver-sb\") pod \"dnsmasq-dns-56798b757f-v7nqq\" (UID: \"7e2b756b-c7db-44c5-97d8-b906e3d01e21\") " pod="openstack/dnsmasq-dns-56798b757f-v7nqq" Dec 05 12:09:52 crc kubenswrapper[4763]: I1205 12:09:52.364338 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e2b756b-c7db-44c5-97d8-b906e3d01e21-dns-svc\") pod \"dnsmasq-dns-56798b757f-v7nqq\" (UID: \"7e2b756b-c7db-44c5-97d8-b906e3d01e21\") " pod="openstack/dnsmasq-dns-56798b757f-v7nqq" Dec 05 12:09:52 crc kubenswrapper[4763]: I1205 12:09:52.364405 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e2b756b-c7db-44c5-97d8-b906e3d01e21-config\") pod \"dnsmasq-dns-56798b757f-v7nqq\" (UID: \"7e2b756b-c7db-44c5-97d8-b906e3d01e21\") " pod="openstack/dnsmasq-dns-56798b757f-v7nqq" Dec 05 12:09:52 crc kubenswrapper[4763]: I1205 12:09:52.364446 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pddpf\" (UniqueName: \"kubernetes.io/projected/7e2b756b-c7db-44c5-97d8-b906e3d01e21-kube-api-access-pddpf\") pod \"dnsmasq-dns-56798b757f-v7nqq\" (UID: \"7e2b756b-c7db-44c5-97d8-b906e3d01e21\") " pod="openstack/dnsmasq-dns-56798b757f-v7nqq" Dec 05 12:09:52 crc kubenswrapper[4763]: I1205 12:09:52.364505 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e2b756b-c7db-44c5-97d8-b906e3d01e21-ovsdbserver-nb\") pod \"dnsmasq-dns-56798b757f-v7nqq\" (UID: \"7e2b756b-c7db-44c5-97d8-b906e3d01e21\") " pod="openstack/dnsmasq-dns-56798b757f-v7nqq" Dec 05 12:09:52 crc kubenswrapper[4763]: I1205 12:09:52.365476 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e2b756b-c7db-44c5-97d8-b906e3d01e21-ovsdbserver-nb\") pod \"dnsmasq-dns-56798b757f-v7nqq\" (UID: \"7e2b756b-c7db-44c5-97d8-b906e3d01e21\") " pod="openstack/dnsmasq-dns-56798b757f-v7nqq" Dec 05 12:09:52 crc kubenswrapper[4763]: I1205 12:09:52.366484 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e2b756b-c7db-44c5-97d8-b906e3d01e21-dns-svc\") pod \"dnsmasq-dns-56798b757f-v7nqq\" (UID: \"7e2b756b-c7db-44c5-97d8-b906e3d01e21\") " pod="openstack/dnsmasq-dns-56798b757f-v7nqq" Dec 05 12:09:52 crc kubenswrapper[4763]: I1205 12:09:52.368117 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e2b756b-c7db-44c5-97d8-b906e3d01e21-config\") pod \"dnsmasq-dns-56798b757f-v7nqq\" (UID: \"7e2b756b-c7db-44c5-97d8-b906e3d01e21\") " pod="openstack/dnsmasq-dns-56798b757f-v7nqq" Dec 05 12:09:52 crc kubenswrapper[4763]: I1205 12:09:52.370219 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e2b756b-c7db-44c5-97d8-b906e3d01e21-ovsdbserver-sb\") pod \"dnsmasq-dns-56798b757f-v7nqq\" (UID: 
\"7e2b756b-c7db-44c5-97d8-b906e3d01e21\") " pod="openstack/dnsmasq-dns-56798b757f-v7nqq" Dec 05 12:09:52 crc kubenswrapper[4763]: I1205 12:09:52.388543 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pddpf\" (UniqueName: \"kubernetes.io/projected/7e2b756b-c7db-44c5-97d8-b906e3d01e21-kube-api-access-pddpf\") pod \"dnsmasq-dns-56798b757f-v7nqq\" (UID: \"7e2b756b-c7db-44c5-97d8-b906e3d01e21\") " pod="openstack/dnsmasq-dns-56798b757f-v7nqq" Dec 05 12:09:52 crc kubenswrapper[4763]: I1205 12:09:52.486791 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56798b757f-v7nqq" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.110118 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.112868 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.123339 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-5tbdp" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.123546 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.124563 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.124969 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.266204 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.267658 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.269791 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.283906 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.284350 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqlxv\" (UniqueName: \"kubernetes.io/projected/f57aaff6-3e62-49f1-8055-60f0507d95ba-kube-api-access-zqlxv\") pod \"glance-default-external-api-0\" (UID: \"f57aaff6-3e62-49f1-8055-60f0507d95ba\") " pod="openstack/glance-default-external-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.284465 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f57aaff6-3e62-49f1-8055-60f0507d95ba-scripts\") pod \"glance-default-external-api-0\" (UID: \"f57aaff6-3e62-49f1-8055-60f0507d95ba\") " pod="openstack/glance-default-external-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.284516 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"f57aaff6-3e62-49f1-8055-60f0507d95ba\") " pod="openstack/glance-default-external-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.284566 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f57aaff6-3e62-49f1-8055-60f0507d95ba-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f57aaff6-3e62-49f1-8055-60f0507d95ba\") " pod="openstack/glance-default-external-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.284615 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f57aaff6-3e62-49f1-8055-60f0507d95ba-config-data\") pod \"glance-default-external-api-0\" (UID: \"f57aaff6-3e62-49f1-8055-60f0507d95ba\") " pod="openstack/glance-default-external-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.284844 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f57aaff6-3e62-49f1-8055-60f0507d95ba-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f57aaff6-3e62-49f1-8055-60f0507d95ba\") " pod="openstack/glance-default-external-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.284928 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f57aaff6-3e62-49f1-8055-60f0507d95ba-logs\") pod \"glance-default-external-api-0\" (UID: \"f57aaff6-3e62-49f1-8055-60f0507d95ba\") " pod="openstack/glance-default-external-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.387326 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f57aaff6-3e62-49f1-8055-60f0507d95ba-scripts\") pod \"glance-default-external-api-0\" (UID: \"f57aaff6-3e62-49f1-8055-60f0507d95ba\") " 
pod="openstack/glance-default-external-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.387396 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/325fd5a5-7466-4539-9de3-8add5eb6996c-logs\") pod \"glance-default-internal-api-0\" (UID: \"325fd5a5-7466-4539-9de3-8add5eb6996c\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.387443 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"325fd5a5-7466-4539-9de3-8add5eb6996c\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.387500 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"f57aaff6-3e62-49f1-8055-60f0507d95ba\") " pod="openstack/glance-default-external-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.387546 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/325fd5a5-7466-4539-9de3-8add5eb6996c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"325fd5a5-7466-4539-9de3-8add5eb6996c\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.387582 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/325fd5a5-7466-4539-9de3-8add5eb6996c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"325fd5a5-7466-4539-9de3-8add5eb6996c\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.387639 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f57aaff6-3e62-49f1-8055-60f0507d95ba-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f57aaff6-3e62-49f1-8055-60f0507d95ba\") " pod="openstack/glance-default-external-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.387679 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq7zm\" (UniqueName: \"kubernetes.io/projected/325fd5a5-7466-4539-9de3-8add5eb6996c-kube-api-access-mq7zm\") pod \"glance-default-internal-api-0\" (UID: \"325fd5a5-7466-4539-9de3-8add5eb6996c\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.387722 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/325fd5a5-7466-4539-9de3-8add5eb6996c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"325fd5a5-7466-4539-9de3-8add5eb6996c\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.387795 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f57aaff6-3e62-49f1-8055-60f0507d95ba-config-data\") pod \"glance-default-external-api-0\" (UID: \"f57aaff6-3e62-49f1-8055-60f0507d95ba\") " 
pod="openstack/glance-default-external-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.387865 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f57aaff6-3e62-49f1-8055-60f0507d95ba-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f57aaff6-3e62-49f1-8055-60f0507d95ba\") " pod="openstack/glance-default-external-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.387919 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f57aaff6-3e62-49f1-8055-60f0507d95ba-logs\") pod \"glance-default-external-api-0\" (UID: \"f57aaff6-3e62-49f1-8055-60f0507d95ba\") " pod="openstack/glance-default-external-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.387954 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqlxv\" (UniqueName: \"kubernetes.io/projected/f57aaff6-3e62-49f1-8055-60f0507d95ba-kube-api-access-zqlxv\") pod \"glance-default-external-api-0\" (UID: \"f57aaff6-3e62-49f1-8055-60f0507d95ba\") " pod="openstack/glance-default-external-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.388032 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/325fd5a5-7466-4539-9de3-8add5eb6996c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"325fd5a5-7466-4539-9de3-8add5eb6996c\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.389335 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"f57aaff6-3e62-49f1-8055-60f0507d95ba\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.389490 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f57aaff6-3e62-49f1-8055-60f0507d95ba-logs\") pod \"glance-default-external-api-0\" (UID: \"f57aaff6-3e62-49f1-8055-60f0507d95ba\") " pod="openstack/glance-default-external-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.389602 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f57aaff6-3e62-49f1-8055-60f0507d95ba-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f57aaff6-3e62-49f1-8055-60f0507d95ba\") " pod="openstack/glance-default-external-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.408532 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f57aaff6-3e62-49f1-8055-60f0507d95ba-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f57aaff6-3e62-49f1-8055-60f0507d95ba\") " pod="openstack/glance-default-external-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.409182 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f57aaff6-3e62-49f1-8055-60f0507d95ba-config-data\") pod \"glance-default-external-api-0\" (UID: \"f57aaff6-3e62-49f1-8055-60f0507d95ba\") " pod="openstack/glance-default-external-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 
12:09:53.409462 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f57aaff6-3e62-49f1-8055-60f0507d95ba-scripts\") pod \"glance-default-external-api-0\" (UID: \"f57aaff6-3e62-49f1-8055-60f0507d95ba\") " pod="openstack/glance-default-external-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.413442 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqlxv\" (UniqueName: \"kubernetes.io/projected/f57aaff6-3e62-49f1-8055-60f0507d95ba-kube-api-access-zqlxv\") pod \"glance-default-external-api-0\" (UID: \"f57aaff6-3e62-49f1-8055-60f0507d95ba\") " pod="openstack/glance-default-external-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.447900 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"f57aaff6-3e62-49f1-8055-60f0507d95ba\") " pod="openstack/glance-default-external-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.489952 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/325fd5a5-7466-4539-9de3-8add5eb6996c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"325fd5a5-7466-4539-9de3-8add5eb6996c\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.490046 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/325fd5a5-7466-4539-9de3-8add5eb6996c-logs\") pod \"glance-default-internal-api-0\" (UID: \"325fd5a5-7466-4539-9de3-8add5eb6996c\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.490080 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"325fd5a5-7466-4539-9de3-8add5eb6996c\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.490120 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/325fd5a5-7466-4539-9de3-8add5eb6996c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"325fd5a5-7466-4539-9de3-8add5eb6996c\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.490150 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/325fd5a5-7466-4539-9de3-8add5eb6996c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"325fd5a5-7466-4539-9de3-8add5eb6996c\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.490197 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq7zm\" (UniqueName: \"kubernetes.io/projected/325fd5a5-7466-4539-9de3-8add5eb6996c-kube-api-access-mq7zm\") pod \"glance-default-internal-api-0\" (UID: \"325fd5a5-7466-4539-9de3-8add5eb6996c\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.490231 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/325fd5a5-7466-4539-9de3-8add5eb6996c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"325fd5a5-7466-4539-9de3-8add5eb6996c\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.490920 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/325fd5a5-7466-4539-9de3-8add5eb6996c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"325fd5a5-7466-4539-9de3-8add5eb6996c\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.492110 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/325fd5a5-7466-4539-9de3-8add5eb6996c-logs\") pod \"glance-default-internal-api-0\" (UID: \"325fd5a5-7466-4539-9de3-8add5eb6996c\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.492257 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"325fd5a5-7466-4539-9de3-8add5eb6996c\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.502621 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/325fd5a5-7466-4539-9de3-8add5eb6996c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"325fd5a5-7466-4539-9de3-8add5eb6996c\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.504366 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/325fd5a5-7466-4539-9de3-8add5eb6996c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"325fd5a5-7466-4539-9de3-8add5eb6996c\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.530266 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq7zm\" (UniqueName: \"kubernetes.io/projected/325fd5a5-7466-4539-9de3-8add5eb6996c-kube-api-access-mq7zm\") pod \"glance-default-internal-api-0\" (UID: \"325fd5a5-7466-4539-9de3-8add5eb6996c\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.534279 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/325fd5a5-7466-4539-9de3-8add5eb6996c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"325fd5a5-7466-4539-9de3-8add5eb6996c\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.582583 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"325fd5a5-7466-4539-9de3-8add5eb6996c\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.585973 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.662508 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-2sdtf" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.737130 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.794379 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb42f35f-9a55-470e-b238-98c4c0a5b455-ovsdbserver-sb\") pod \"bb42f35f-9a55-470e-b238-98c4c0a5b455\" (UID: \"bb42f35f-9a55-470e-b238-98c4c0a5b455\") " Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.794480 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb42f35f-9a55-470e-b238-98c4c0a5b455-dns-svc\") pod \"bb42f35f-9a55-470e-b238-98c4c0a5b455\" (UID: \"bb42f35f-9a55-470e-b238-98c4c0a5b455\") " Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.794527 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb42f35f-9a55-470e-b238-98c4c0a5b455-config\") pod \"bb42f35f-9a55-470e-b238-98c4c0a5b455\" (UID: \"bb42f35f-9a55-470e-b238-98c4c0a5b455\") " Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.794615 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb42f35f-9a55-470e-b238-98c4c0a5b455-ovsdbserver-nb\") pod \"bb42f35f-9a55-470e-b238-98c4c0a5b455\" (UID: \"bb42f35f-9a55-470e-b238-98c4c0a5b455\") " Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.794787 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgshz\" (UniqueName: \"kubernetes.io/projected/bb42f35f-9a55-470e-b238-98c4c0a5b455-kube-api-access-lgshz\") pod \"bb42f35f-9a55-470e-b238-98c4c0a5b455\" (UID: \"bb42f35f-9a55-470e-b238-98c4c0a5b455\") " Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.812008 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb42f35f-9a55-470e-b238-98c4c0a5b455-kube-api-access-lgshz" (OuterVolumeSpecName: "kube-api-access-lgshz") pod "bb42f35f-9a55-470e-b238-98c4c0a5b455" (UID: "bb42f35f-9a55-470e-b238-98c4c0a5b455"). InnerVolumeSpecName "kube-api-access-lgshz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.850687 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb42f35f-9a55-470e-b238-98c4c0a5b455-config" (OuterVolumeSpecName: "config") pod "bb42f35f-9a55-470e-b238-98c4c0a5b455" (UID: "bb42f35f-9a55-470e-b238-98c4c0a5b455"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.853800 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb42f35f-9a55-470e-b238-98c4c0a5b455-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bb42f35f-9a55-470e-b238-98c4c0a5b455" (UID: "bb42f35f-9a55-470e-b238-98c4c0a5b455"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.855324 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb42f35f-9a55-470e-b238-98c4c0a5b455-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bb42f35f-9a55-470e-b238-98c4c0a5b455" (UID: "bb42f35f-9a55-470e-b238-98c4c0a5b455"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.858441 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb42f35f-9a55-470e-b238-98c4c0a5b455-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bb42f35f-9a55-470e-b238-98c4c0a5b455" (UID: "bb42f35f-9a55-470e-b238-98c4c0a5b455"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.897074 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgshz\" (UniqueName: \"kubernetes.io/projected/bb42f35f-9a55-470e-b238-98c4c0a5b455-kube-api-access-lgshz\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.897107 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb42f35f-9a55-470e-b238-98c4c0a5b455-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.897120 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb42f35f-9a55-470e-b238-98c4c0a5b455-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.897718 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb42f35f-9a55-470e-b238-98c4c0a5b455-config\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.898655 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb42f35f-9a55-470e-b238-98c4c0a5b455-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.915936 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-2sdtf" event={"ID":"bb42f35f-9a55-470e-b238-98c4c0a5b455","Type":"ContainerDied","Data":"7b6f314b8912193f852e1b435f3b35b928ebf1a8a6ee2f7a14ea25288b957158"} Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.916024 4763 scope.go:117] "RemoveContainer" containerID="3ff5dba6344684bbb6b452f86ef96347b491be9212296ebebccae7232c3eabfe" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.916525 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-2sdtf" Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.955678 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-2sdtf"] Dec 05 12:09:53 crc kubenswrapper[4763]: I1205 12:09:53.966576 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-2sdtf"] Dec 05 12:09:55 crc kubenswrapper[4763]: I1205 12:09:55.397838 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 12:09:55 crc kubenswrapper[4763]: I1205 12:09:55.492539 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 12:09:55 crc kubenswrapper[4763]: I1205 12:09:55.795215 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb42f35f-9a55-470e-b238-98c4c0a5b455" path="/var/lib/kubelet/pods/bb42f35f-9a55-470e-b238-98c4c0a5b455/volumes" Dec 05 12:09:56 crc kubenswrapper[4763]: I1205 12:09:56.627399 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-2sdtf" podUID="bb42f35f-9a55-470e-b238-98c4c0a5b455" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.119:5353: i/o timeout" Dec 05 12:09:58 crc kubenswrapper[4763]: E1205 12:09:58.647556 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 05 12:09:58 crc kubenswrapper[4763]: E1205 12:09:58.647815 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffhb5h5bdh558h66fh584h5dh655h598h6dh555h7fh566h4h56dh99h6chb6h698h67hffh59h58dh596hd5h678h555h645hb5h8ch674h5f8q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-djgh6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-774db4cc99-rvp4p_openstack(e8979c4b-8aad-4db5-ae70-b695188b6079): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 12:09:58 crc kubenswrapper[4763]: E1205 12:09:58.650658 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-774db4cc99-rvp4p" podUID="e8979c4b-8aad-4db5-ae70-b695188b6079" Dec 05 12:09:58 crc kubenswrapper[4763]: E1205 12:09:58.663620 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 05 12:09:58 crc kubenswrapper[4763]: E1205 12:09:58.664011 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8ch649h548h664h5c4hb9hbfh595h9h646h578h5b4h567h5c5h574h99h57fhcch559h5fch56bh649h577h678h554hf6h644h559h674h96h675hdbq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qrqwx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-646fbb8979-kfn2p_openstack(52fb7404-f381-4887-aad1-d761e1997b6a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 12:09:58 crc kubenswrapper[4763]: E1205 12:09:58.666999 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-646fbb8979-kfn2p" podUID="52fb7404-f381-4887-aad1-d761e1997b6a" Dec 05 12:10:08 crc kubenswrapper[4763]: E1205 12:10:08.544782 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 05 12:10:08 crc kubenswrapper[4763]: E1205 12:10:08.545378 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n596h87h577h58h59bh6dh5dbh58fh56h598hd4hb7h59fh684h66bh678hd8hb7h557h665h98hb7h5dfh9fh5chb7hdch67dh5d9h6fh5hbdq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qlvgz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-5d6ccfb9d9-bqng6_openstack(a5583b19-1a4c-4b4f-9147-fecc8cd733a1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 12:10:08 crc kubenswrapper[4763]: E1205 12:10:08.549427 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-5d6ccfb9d9-bqng6" podUID="a5583b19-1a4c-4b4f-9147-fecc8cd733a1" Dec 05 12:10:08 crc kubenswrapper[4763]: I1205 12:10:08.954365 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7dc58bc884-khdbv"] Dec 05 12:10:09 crc kubenswrapper[4763]: E1205 12:10:09.008593 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Dec 05 12:10:09 crc kubenswrapper[4763]: E1205 12:10:09.008788 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n596h679h89hdch547h88h8dh64ch58h545h5b6h67fh58h5d4h5dfh585h94h688h5fdh9ch669h55fh5c8h5fch68ch674h648h6ch64ch577h59h547q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p8j52,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(bb8f26ac-3463-4d42-936d-420cfdbd81eb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 12:10:09 crc kubenswrapper[4763]: I1205 12:10:09.057274 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-646fbb8979-kfn2p" event={"ID":"52fb7404-f381-4887-aad1-d761e1997b6a","Type":"ContainerDied","Data":"9708a8247ab8dc830783e1f66379cebb023642cbffe9b1ba249f755c7bbe7c65"} Dec 05 12:10:09 crc kubenswrapper[4763]: I1205 12:10:09.057321 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9708a8247ab8dc830783e1f66379cebb023642cbffe9b1ba249f755c7bbe7c65" Dec 05 12:10:09 crc kubenswrapper[4763]: I1205 12:10:09.059975 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-774db4cc99-rvp4p" event={"ID":"e8979c4b-8aad-4db5-ae70-b695188b6079","Type":"ContainerDied","Data":"c61619766c1c417ca092bb3eba83f28db510c9277b08109b0eb8a202e65534f7"} Dec 05 12:10:09 crc kubenswrapper[4763]: I1205 12:10:09.060006 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c61619766c1c417ca092bb3eba83f28db510c9277b08109b0eb8a202e65534f7" Dec 05 12:10:09 crc kubenswrapper[4763]: I1205 12:10:09.123244 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-774db4cc99-rvp4p" Dec 05 12:10:09 crc kubenswrapper[4763]: I1205 12:10:09.133026 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-646fbb8979-kfn2p" Dec 05 12:10:09 crc kubenswrapper[4763]: I1205 12:10:09.254355 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52fb7404-f381-4887-aad1-d761e1997b6a-scripts\") pod \"52fb7404-f381-4887-aad1-d761e1997b6a\" (UID: \"52fb7404-f381-4887-aad1-d761e1997b6a\") " Dec 05 12:10:09 crc kubenswrapper[4763]: I1205 12:10:09.254485 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52fb7404-f381-4887-aad1-d761e1997b6a-logs\") pod \"52fb7404-f381-4887-aad1-d761e1997b6a\" (UID: \"52fb7404-f381-4887-aad1-d761e1997b6a\") " Dec 05 12:10:09 crc kubenswrapper[4763]: I1205 12:10:09.254580 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djgh6\" (UniqueName: \"kubernetes.io/projected/e8979c4b-8aad-4db5-ae70-b695188b6079-kube-api-access-djgh6\") pod \"e8979c4b-8aad-4db5-ae70-b695188b6079\" (UID: \"e8979c4b-8aad-4db5-ae70-b695188b6079\") " Dec 05 12:10:09 crc kubenswrapper[4763]: I1205 12:10:09.254903 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52fb7404-f381-4887-aad1-d761e1997b6a-logs" (OuterVolumeSpecName: "logs") pod "52fb7404-f381-4887-aad1-d761e1997b6a" (UID: "52fb7404-f381-4887-aad1-d761e1997b6a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:10:09 crc kubenswrapper[4763]: I1205 12:10:09.255286 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52fb7404-f381-4887-aad1-d761e1997b6a-scripts" (OuterVolumeSpecName: "scripts") pod "52fb7404-f381-4887-aad1-d761e1997b6a" (UID: "52fb7404-f381-4887-aad1-d761e1997b6a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:10:09 crc kubenswrapper[4763]: I1205 12:10:09.255353 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52fb7404-f381-4887-aad1-d761e1997b6a-config-data" (OuterVolumeSpecName: "config-data") pod "52fb7404-f381-4887-aad1-d761e1997b6a" (UID: "52fb7404-f381-4887-aad1-d761e1997b6a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:10:09 crc kubenswrapper[4763]: I1205 12:10:09.254611 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/52fb7404-f381-4887-aad1-d761e1997b6a-config-data\") pod \"52fb7404-f381-4887-aad1-d761e1997b6a\" (UID: \"52fb7404-f381-4887-aad1-d761e1997b6a\") " Dec 05 12:10:09 crc kubenswrapper[4763]: I1205 12:10:09.255618 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrqwx\" (UniqueName: \"kubernetes.io/projected/52fb7404-f381-4887-aad1-d761e1997b6a-kube-api-access-qrqwx\") pod \"52fb7404-f381-4887-aad1-d761e1997b6a\" (UID: \"52fb7404-f381-4887-aad1-d761e1997b6a\") " Dec 05 12:10:09 crc kubenswrapper[4763]: I1205 12:10:09.255649 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/52fb7404-f381-4887-aad1-d761e1997b6a-horizon-secret-key\") pod \"52fb7404-f381-4887-aad1-d761e1997b6a\" (UID: \"52fb7404-f381-4887-aad1-d761e1997b6a\") " Dec 05 12:10:09 crc kubenswrapper[4763]: I1205 12:10:09.255687 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8979c4b-8aad-4db5-ae70-b695188b6079-scripts\") pod \"e8979c4b-8aad-4db5-ae70-b695188b6079\" (UID: \"e8979c4b-8aad-4db5-ae70-b695188b6079\") " Dec 05 12:10:09 crc kubenswrapper[4763]: I1205 12:10:09.255770 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e8979c4b-8aad-4db5-ae70-b695188b6079-config-data\") pod \"e8979c4b-8aad-4db5-ae70-b695188b6079\" (UID: \"e8979c4b-8aad-4db5-ae70-b695188b6079\") " Dec 05 12:10:09 crc kubenswrapper[4763]: I1205 12:10:09.255817 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8979c4b-8aad-4db5-ae70-b695188b6079-logs\") pod \"e8979c4b-8aad-4db5-ae70-b695188b6079\" (UID: \"e8979c4b-8aad-4db5-ae70-b695188b6079\") " Dec 05 12:10:09 crc kubenswrapper[4763]: I1205 12:10:09.255861 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e8979c4b-8aad-4db5-ae70-b695188b6079-horizon-secret-key\") pod \"e8979c4b-8aad-4db5-ae70-b695188b6079\" (UID: \"e8979c4b-8aad-4db5-ae70-b695188b6079\") " Dec 05 12:10:09 crc kubenswrapper[4763]: I1205 12:10:09.256172 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8979c4b-8aad-4db5-ae70-b695188b6079-logs" (OuterVolumeSpecName: "logs") pod "e8979c4b-8aad-4db5-ae70-b695188b6079" (UID: "e8979c4b-8aad-4db5-ae70-b695188b6079"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:10:09 crc kubenswrapper[4763]: I1205 12:10:09.256321 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8979c4b-8aad-4db5-ae70-b695188b6079-scripts" (OuterVolumeSpecName: "scripts") pod "e8979c4b-8aad-4db5-ae70-b695188b6079" (UID: "e8979c4b-8aad-4db5-ae70-b695188b6079"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:10:09 crc kubenswrapper[4763]: I1205 12:10:09.256434 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8979c4b-8aad-4db5-ae70-b695188b6079-config-data" (OuterVolumeSpecName: "config-data") pod "e8979c4b-8aad-4db5-ae70-b695188b6079" (UID: "e8979c4b-8aad-4db5-ae70-b695188b6079"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:10:09 crc kubenswrapper[4763]: I1205 12:10:09.256490 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8979c4b-8aad-4db5-ae70-b695188b6079-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:09 crc kubenswrapper[4763]: I1205 12:10:09.256507 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8979c4b-8aad-4db5-ae70-b695188b6079-logs\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:09 crc kubenswrapper[4763]: I1205 12:10:09.256517 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52fb7404-f381-4887-aad1-d761e1997b6a-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:09 crc kubenswrapper[4763]: I1205 12:10:09.256531 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52fb7404-f381-4887-aad1-d761e1997b6a-logs\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:09 crc kubenswrapper[4763]: I1205 12:10:09.256541 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/52fb7404-f381-4887-aad1-d761e1997b6a-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:09 crc kubenswrapper[4763]: I1205 12:10:09.260219 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52fb7404-f381-4887-aad1-d761e1997b6a-kube-api-access-qrqwx" (OuterVolumeSpecName: "kube-api-access-qrqwx") pod "52fb7404-f381-4887-aad1-d761e1997b6a" (UID: "52fb7404-f381-4887-aad1-d761e1997b6a"). InnerVolumeSpecName "kube-api-access-qrqwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:10:09 crc kubenswrapper[4763]: I1205 12:10:09.260911 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8979c4b-8aad-4db5-ae70-b695188b6079-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e8979c4b-8aad-4db5-ae70-b695188b6079" (UID: "e8979c4b-8aad-4db5-ae70-b695188b6079"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:09 crc kubenswrapper[4763]: I1205 12:10:09.261248 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8979c4b-8aad-4db5-ae70-b695188b6079-kube-api-access-djgh6" (OuterVolumeSpecName: "kube-api-access-djgh6") pod "e8979c4b-8aad-4db5-ae70-b695188b6079" (UID: "e8979c4b-8aad-4db5-ae70-b695188b6079"). InnerVolumeSpecName "kube-api-access-djgh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:10:09 crc kubenswrapper[4763]: I1205 12:10:09.261880 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52fb7404-f381-4887-aad1-d761e1997b6a-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "52fb7404-f381-4887-aad1-d761e1997b6a" (UID: "52fb7404-f381-4887-aad1-d761e1997b6a"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:09 crc kubenswrapper[4763]: I1205 12:10:09.357975 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djgh6\" (UniqueName: \"kubernetes.io/projected/e8979c4b-8aad-4db5-ae70-b695188b6079-kube-api-access-djgh6\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:09 crc kubenswrapper[4763]: I1205 12:10:09.358023 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrqwx\" (UniqueName: \"kubernetes.io/projected/52fb7404-f381-4887-aad1-d761e1997b6a-kube-api-access-qrqwx\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:09 crc kubenswrapper[4763]: I1205 12:10:09.358037 4763 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/52fb7404-f381-4887-aad1-d761e1997b6a-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:09 crc kubenswrapper[4763]: I1205 12:10:09.358050 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e8979c4b-8aad-4db5-ae70-b695188b6079-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:09 crc kubenswrapper[4763]: I1205 12:10:09.358062 4763 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e8979c4b-8aad-4db5-ae70-b695188b6079-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:09 crc kubenswrapper[4763]: I1205 12:10:09.464323 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-77dcd5c496-hs7bj"] Dec 05 12:10:10 crc kubenswrapper[4763]: I1205 12:10:10.066511 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-774db4cc99-rvp4p" Dec 05 12:10:10 crc kubenswrapper[4763]: I1205 12:10:10.066576 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-646fbb8979-kfn2p" Dec 05 12:10:10 crc kubenswrapper[4763]: I1205 12:10:10.112464 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-646fbb8979-kfn2p"] Dec 05 12:10:10 crc kubenswrapper[4763]: I1205 12:10:10.122258 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-646fbb8979-kfn2p"] Dec 05 12:10:10 crc kubenswrapper[4763]: I1205 12:10:10.145014 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-774db4cc99-rvp4p"] Dec 05 12:10:10 crc kubenswrapper[4763]: I1205 12:10:10.151609 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-774db4cc99-rvp4p"] Dec 05 12:10:10 crc kubenswrapper[4763]: E1205 12:10:10.465173 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 05 12:10:10 crc kubenswrapper[4763]: E1205 12:10:10.465420 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-492vf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
cinder-db-sync-cprq9_openstack(10d49525-ec1b-4c52-8221-f3f0bb57e574): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 12:10:10 crc kubenswrapper[4763]: E1205 12:10:10.466604 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-cprq9" podUID="10d49525-ec1b-4c52-8221-f3f0bb57e574" Dec 05 12:10:10 crc kubenswrapper[4763]: E1205 12:10:10.848880 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Dec 05 12:10:10 crc kubenswrapper[4763]: E1205 12:10:10.849033 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zgsw7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-wgf85_openstack(1f3d2c51-7840-4854-af9e-e0da6c484074): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 12:10:10 crc kubenswrapper[4763]: E1205 12:10:10.850210 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-wgf85" podUID="1f3d2c51-7840-4854-af9e-e0da6c484074" Dec 05 12:10:10 crc kubenswrapper[4763]: W1205 12:10:10.864982 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ba0cbf0_3e4e_4cb0_82b0_179d11937330.slice/crio-6d3054201aea8506c8e8e9a592fcffa7d9479ebf438cfafd9f83299a7f88a265 WatchSource:0}: Error finding container 6d3054201aea8506c8e8e9a592fcffa7d9479ebf438cfafd9f83299a7f88a265: Status 404 returned error 
can't find the container with id 6d3054201aea8506c8e8e9a592fcffa7d9479ebf438cfafd9f83299a7f88a265 Dec 05 12:10:10 crc kubenswrapper[4763]: W1205 12:10:10.881544 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb34428a2_5423_401a_b7d3_aebd1d070945.slice/crio-24d5f9734d3a942655106ebb93d955a3bbf6247184cef6b7041e61793ae40b9b WatchSource:0}: Error finding container 24d5f9734d3a942655106ebb93d955a3bbf6247184cef6b7041e61793ae40b9b: Status 404 returned error can't find the container with id 24d5f9734d3a942655106ebb93d955a3bbf6247184cef6b7041e61793ae40b9b Dec 05 12:10:10 crc kubenswrapper[4763]: I1205 12:10:10.908041 4763 scope.go:117] "RemoveContainer" containerID="a89a166661ebf000a1a2380745010fa8ebff12ca3d85a6997b358d8c69c7b882" Dec 05 12:10:11 crc kubenswrapper[4763]: I1205 12:10:11.065062 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5d6ccfb9d9-bqng6" Dec 05 12:10:11 crc kubenswrapper[4763]: I1205 12:10:11.076057 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d6ccfb9d9-bqng6" event={"ID":"a5583b19-1a4c-4b4f-9147-fecc8cd733a1","Type":"ContainerDied","Data":"f3020dcf02a9eab2791cdf74d40c301fb1a8d08e32c5621708f1238cbf7382d9"} Dec 05 12:10:11 crc kubenswrapper[4763]: I1205 12:10:11.076127 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5d6ccfb9d9-bqng6" Dec 05 12:10:11 crc kubenswrapper[4763]: I1205 12:10:11.077369 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7dc58bc884-khdbv" event={"ID":"4ba0cbf0-3e4e-4cb0-82b0-179d11937330","Type":"ContainerStarted","Data":"6d3054201aea8506c8e8e9a592fcffa7d9479ebf438cfafd9f83299a7f88a265"} Dec 05 12:10:11 crc kubenswrapper[4763]: I1205 12:10:11.078676 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77dcd5c496-hs7bj" event={"ID":"b34428a2-5423-401a-b7d3-aebd1d070945","Type":"ContainerStarted","Data":"24d5f9734d3a942655106ebb93d955a3bbf6247184cef6b7041e61793ae40b9b"} Dec 05 12:10:11 crc kubenswrapper[4763]: E1205 12:10:11.089740 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-cprq9" podUID="10d49525-ec1b-4c52-8221-f3f0bb57e574" Dec 05 12:10:11 crc kubenswrapper[4763]: E1205 12:10:11.096568 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-wgf85" podUID="1f3d2c51-7840-4854-af9e-e0da6c484074" Dec 05 12:10:11 crc kubenswrapper[4763]: I1205 12:10:11.195820 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlvgz\" (UniqueName: \"kubernetes.io/projected/a5583b19-1a4c-4b4f-9147-fecc8cd733a1-kube-api-access-qlvgz\") pod \"a5583b19-1a4c-4b4f-9147-fecc8cd733a1\" (UID: \"a5583b19-1a4c-4b4f-9147-fecc8cd733a1\") " Dec 05 12:10:11 crc kubenswrapper[4763]: I1205 12:10:11.195922 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a5583b19-1a4c-4b4f-9147-fecc8cd733a1-scripts\") pod 
\"a5583b19-1a4c-4b4f-9147-fecc8cd733a1\" (UID: \"a5583b19-1a4c-4b4f-9147-fecc8cd733a1\") " Dec 05 12:10:11 crc kubenswrapper[4763]: I1205 12:10:11.195949 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a5583b19-1a4c-4b4f-9147-fecc8cd733a1-config-data\") pod \"a5583b19-1a4c-4b4f-9147-fecc8cd733a1\" (UID: \"a5583b19-1a4c-4b4f-9147-fecc8cd733a1\") " Dec 05 12:10:11 crc kubenswrapper[4763]: I1205 12:10:11.196000 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5583b19-1a4c-4b4f-9147-fecc8cd733a1-logs\") pod \"a5583b19-1a4c-4b4f-9147-fecc8cd733a1\" (UID: \"a5583b19-1a4c-4b4f-9147-fecc8cd733a1\") " Dec 05 12:10:11 crc kubenswrapper[4763]: I1205 12:10:11.196056 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a5583b19-1a4c-4b4f-9147-fecc8cd733a1-horizon-secret-key\") pod \"a5583b19-1a4c-4b4f-9147-fecc8cd733a1\" (UID: \"a5583b19-1a4c-4b4f-9147-fecc8cd733a1\") " Dec 05 12:10:11 crc kubenswrapper[4763]: I1205 12:10:11.196648 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5583b19-1a4c-4b4f-9147-fecc8cd733a1-logs" (OuterVolumeSpecName: "logs") pod "a5583b19-1a4c-4b4f-9147-fecc8cd733a1" (UID: "a5583b19-1a4c-4b4f-9147-fecc8cd733a1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:10:11 crc kubenswrapper[4763]: I1205 12:10:11.200298 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5583b19-1a4c-4b4f-9147-fecc8cd733a1-scripts" (OuterVolumeSpecName: "scripts") pod "a5583b19-1a4c-4b4f-9147-fecc8cd733a1" (UID: "a5583b19-1a4c-4b4f-9147-fecc8cd733a1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:10:11 crc kubenswrapper[4763]: I1205 12:10:11.202110 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5583b19-1a4c-4b4f-9147-fecc8cd733a1-config-data" (OuterVolumeSpecName: "config-data") pod "a5583b19-1a4c-4b4f-9147-fecc8cd733a1" (UID: "a5583b19-1a4c-4b4f-9147-fecc8cd733a1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:10:11 crc kubenswrapper[4763]: I1205 12:10:11.206568 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5583b19-1a4c-4b4f-9147-fecc8cd733a1-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a5583b19-1a4c-4b4f-9147-fecc8cd733a1" (UID: "a5583b19-1a4c-4b4f-9147-fecc8cd733a1"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:11 crc kubenswrapper[4763]: I1205 12:10:11.207260 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5583b19-1a4c-4b4f-9147-fecc8cd733a1-kube-api-access-qlvgz" (OuterVolumeSpecName: "kube-api-access-qlvgz") pod "a5583b19-1a4c-4b4f-9147-fecc8cd733a1" (UID: "a5583b19-1a4c-4b4f-9147-fecc8cd733a1"). InnerVolumeSpecName "kube-api-access-qlvgz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:10:11 crc kubenswrapper[4763]: I1205 12:10:11.297910 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlvgz\" (UniqueName: \"kubernetes.io/projected/a5583b19-1a4c-4b4f-9147-fecc8cd733a1-kube-api-access-qlvgz\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:11 crc kubenswrapper[4763]: I1205 12:10:11.297938 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a5583b19-1a4c-4b4f-9147-fecc8cd733a1-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:11 crc kubenswrapper[4763]: I1205 12:10:11.297947 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a5583b19-1a4c-4b4f-9147-fecc8cd733a1-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:11 crc kubenswrapper[4763]: I1205 12:10:11.297958 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5583b19-1a4c-4b4f-9147-fecc8cd733a1-logs\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:11 crc kubenswrapper[4763]: I1205 12:10:11.297976 4763 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a5583b19-1a4c-4b4f-9147-fecc8cd733a1-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:11 crc kubenswrapper[4763]: I1205 12:10:11.453173 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5d6ccfb9d9-bqng6"] Dec 05 12:10:11 crc kubenswrapper[4763]: I1205 12:10:11.462025 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5d6ccfb9d9-bqng6"] Dec 05 12:10:11 crc kubenswrapper[4763]: I1205 12:10:11.524386 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-tg56v"] Dec 05 12:10:11 crc kubenswrapper[4763]: W1205 12:10:11.562034 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d08538d_45d6_4f05_81a6_60ecc26dc593.slice/crio-3ffef12a63f9fedd48e91798584c2fe3fc6a32b21afe8b182b63bfa384e5b844 WatchSource:0}: Error finding container 3ffef12a63f9fedd48e91798584c2fe3fc6a32b21afe8b182b63bfa384e5b844: Status 404 returned error can't find the container with id 3ffef12a63f9fedd48e91798584c2fe3fc6a32b21afe8b182b63bfa384e5b844 Dec 05 12:10:11 crc kubenswrapper[4763]: I1205 12:10:11.670850 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 12:10:11 crc kubenswrapper[4763]: I1205 12:10:11.680204 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56798b757f-v7nqq"] Dec 05 12:10:11 crc kubenswrapper[4763]: I1205 12:10:11.766798 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 12:10:11 crc kubenswrapper[4763]: I1205 12:10:11.799343 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52fb7404-f381-4887-aad1-d761e1997b6a" path="/var/lib/kubelet/pods/52fb7404-f381-4887-aad1-d761e1997b6a/volumes" Dec 05 12:10:11 crc kubenswrapper[4763]: I1205 12:10:11.799983 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5583b19-1a4c-4b4f-9147-fecc8cd733a1" path="/var/lib/kubelet/pods/a5583b19-1a4c-4b4f-9147-fecc8cd733a1/volumes" Dec 05 12:10:11 crc kubenswrapper[4763]: I1205 12:10:11.800558 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e8979c4b-8aad-4db5-ae70-b695188b6079" path="/var/lib/kubelet/pods/e8979c4b-8aad-4db5-ae70-b695188b6079/volumes" Dec 05 12:10:12 crc kubenswrapper[4763]: I1205 12:10:12.091694 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"654b7fdf-0324-40f6-8681-3ba17e042d60","Type":"ContainerStarted","Data":"5808e5f920bc370b57aa104546b0afa329ef2b7c0fb5167d9b6493f1384a4c45"} Dec 05 12:10:12 crc kubenswrapper[4763]: I1205 12:10:12.093704 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77dcd5c496-hs7bj" event={"ID":"b34428a2-5423-401a-b7d3-aebd1d070945","Type":"ContainerStarted","Data":"6882d4f707d5ab318e03285a18d0af6b9be4daf6491640fcf2c7409502132981"} Dec 05 12:10:12 crc kubenswrapper[4763]: I1205 12:10:12.096333 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1851124e-2722-4628-8e5b-63edb828d64a","Type":"ContainerStarted","Data":"bd06ed85785e799c66f01508276860b9433026bb06b6d056f7980a487857647d"} Dec 05 12:10:12 crc kubenswrapper[4763]: I1205 12:10:12.096374 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1851124e-2722-4628-8e5b-63edb828d64a","Type":"ContainerStarted","Data":"a970dc15179ae99b65f9300a6e92423dd4c05c0c453f761720010b0086c3ecf6"} Dec 05 12:10:12 crc kubenswrapper[4763]: I1205 12:10:12.097566 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tg56v" event={"ID":"5d08538d-45d6-4f05-81a6-60ecc26dc593","Type":"ContainerStarted","Data":"3ffef12a63f9fedd48e91798584c2fe3fc6a32b21afe8b182b63bfa384e5b844"} Dec 05 12:10:12 crc kubenswrapper[4763]: I1205 12:10:12.099949 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-nvqn2" event={"ID":"b2bf0108-5266-4f52-8803-39a842ddc777","Type":"ContainerStarted","Data":"abbd6d796f26bf34ba2f4470ded8d6e07e7ff1c8a25b2cb78e5219f8ae665d17"} Dec 05 12:10:12 crc kubenswrapper[4763]: I1205 12:10:12.102555 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-wcz88" event={"ID":"4046982d-ad27-468f-897a-167692d9ae49","Type":"ContainerStarted","Data":"09c82ff3921155c0ca1d533fd930c760c4a5164fa414656cdc21fd21d92026dd"} Dec 05 12:10:12 crc kubenswrapper[4763]: I1205 12:10:12.104340 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7dc58bc884-khdbv" event={"ID":"4ba0cbf0-3e4e-4cb0-82b0-179d11937330","Type":"ContainerStarted","Data":"0f9fce0cd10290e14fb000bbba92b5ee8865c914d6ce43d7be912c918930bcb6"} Dec 05 12:10:12 crc kubenswrapper[4763]: I1205 12:10:12.121114 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=39.121095585 podStartE2EDuration="39.121095585s" podCreationTimestamp="2025-12-05 12:09:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:10:12.115337604 +0000 UTC m=+1296.608052337" watchObservedRunningTime="2025-12-05 12:10:12.121095585 +0000 UTC m=+1296.613810298" Dec 05 12:10:12 crc kubenswrapper[4763]: I1205 12:10:12.136554 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-sync-wcz88" podStartSLOduration=7.007914119 podStartE2EDuration="1m3.136537289s" podCreationTimestamp="2025-12-05 12:09:09 +0000 UTC" firstStartedPulling="2025-12-05 12:09:14.846486171 +0000 UTC m=+1239.339200894" 
lastFinishedPulling="2025-12-05 12:10:10.975109331 +0000 UTC m=+1295.467824064" observedRunningTime="2025-12-05 12:10:12.136104813 +0000 UTC m=+1296.628819536" watchObservedRunningTime="2025-12-05 12:10:12.136537289 +0000 UTC m=+1296.629252012" Dec 05 12:10:12 crc kubenswrapper[4763]: I1205 12:10:12.170732 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-nvqn2" podStartSLOduration=3.125321187 podStartE2EDuration="33.170714064s" podCreationTimestamp="2025-12-05 12:09:39 +0000 UTC" firstStartedPulling="2025-12-05 12:09:40.845865699 +0000 UTC m=+1265.338580422" lastFinishedPulling="2025-12-05 12:10:10.891258576 +0000 UTC m=+1295.383973299" observedRunningTime="2025-12-05 12:10:12.170339599 +0000 UTC m=+1296.663054322" watchObservedRunningTime="2025-12-05 12:10:12.170714064 +0000 UTC m=+1296.663428787" Dec 05 12:10:13 crc kubenswrapper[4763]: I1205 12:10:13.116133 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7dc58bc884-khdbv" event={"ID":"4ba0cbf0-3e4e-4cb0-82b0-179d11937330","Type":"ContainerStarted","Data":"db7f9809d3f9be8f6f6d0b3eccb2f6c7b682e2f5b570dcc420bac86ce64e99f5"} Dec 05 12:10:13 crc kubenswrapper[4763]: I1205 12:10:13.121215 4763 generic.go:334] "Generic (PLEG): container finished" podID="7e2b756b-c7db-44c5-97d8-b906e3d01e21" containerID="c4ebec2fb94c72e50f161f3b8221f0a8ddd831d2e0ad785f50e520df04a3bbd8" exitCode=0 Dec 05 12:10:13 crc kubenswrapper[4763]: I1205 12:10:13.121289 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56798b757f-v7nqq" event={"ID":"7e2b756b-c7db-44c5-97d8-b906e3d01e21","Type":"ContainerDied","Data":"c4ebec2fb94c72e50f161f3b8221f0a8ddd831d2e0ad785f50e520df04a3bbd8"} Dec 05 12:10:13 crc kubenswrapper[4763]: I1205 12:10:13.121318 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56798b757f-v7nqq" event={"ID":"7e2b756b-c7db-44c5-97d8-b906e3d01e21","Type":"ContainerStarted","Data":"32e6e800d6f7a7225c3c69b20050d479ac747cfff86bd1e1397eb9fadc6a5f7b"} Dec 05 12:10:13 crc kubenswrapper[4763]: I1205 12:10:13.125190 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77dcd5c496-hs7bj" event={"ID":"b34428a2-5423-401a-b7d3-aebd1d070945","Type":"ContainerStarted","Data":"43137bdb51da7dfb1d36cc42e42ce5652e74f99c43f1939f0c563232b8ba5746"} Dec 05 12:10:13 crc kubenswrapper[4763]: I1205 12:10:13.127656 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f57aaff6-3e62-49f1-8055-60f0507d95ba","Type":"ContainerStarted","Data":"b95dadcd780efc2c26a68e211c2c22762742bef84110443e03a0fa1ec37f4d81"} Dec 05 12:10:13 crc kubenswrapper[4763]: I1205 12:10:13.127692 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f57aaff6-3e62-49f1-8055-60f0507d95ba","Type":"ContainerStarted","Data":"4cd275260f3988c76de37251927b02e2328630fb7ed5f3e0164985109cc94069"} Dec 05 12:10:13 crc kubenswrapper[4763]: I1205 12:10:13.129500 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"325fd5a5-7466-4539-9de3-8add5eb6996c","Type":"ContainerStarted","Data":"93690a3d1d0f8a59fae3e09fb8e16402baa56f64e6ece52b0e4d0777619c2759"} Dec 05 12:10:13 crc kubenswrapper[4763]: I1205 12:10:13.129546 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"325fd5a5-7466-4539-9de3-8add5eb6996c","Type":"ContainerStarted","Data":"dd1c40dab7f9d51de14c9f235ba6e4fcf020586a788d7d52c06155606cb66f09"} Dec 05 12:10:13 crc kubenswrapper[4763]: I1205 12:10:13.132298 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tg56v" event={"ID":"5d08538d-45d6-4f05-81a6-60ecc26dc593","Type":"ContainerStarted","Data":"ad8e2eed26b275208329fff0ef5c26c0ab920119e69cc3f7221aa6ac9f83c297"} Dec 05 12:10:13 crc kubenswrapper[4763]: I1205 12:10:13.141005 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb8f26ac-3463-4d42-936d-420cfdbd81eb","Type":"ContainerStarted","Data":"6311e623b93230006c2447f0f3b5835a403e0ccf7dfd01be4e2487d59d728444"} Dec 05 12:10:13 crc kubenswrapper[4763]: I1205 12:10:13.145551 4763 generic.go:334] "Generic (PLEG): container finished" podID="274fe292-e3f0-432c-9947-3bca5514f6d9" containerID="7b00072b9f7e2f54d96cbe80dd6024fdf58ccdadb25eedc84fe24bc238c8501e" exitCode=0 Dec 05 12:10:13 crc kubenswrapper[4763]: I1205 12:10:13.146664 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-x6zwz" event={"ID":"274fe292-e3f0-432c-9947-3bca5514f6d9","Type":"ContainerDied","Data":"7b00072b9f7e2f54d96cbe80dd6024fdf58ccdadb25eedc84fe24bc238c8501e"} Dec 05 12:10:13 crc kubenswrapper[4763]: I1205 12:10:13.148246 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7dc58bc884-khdbv" podStartSLOduration=25.684315547 podStartE2EDuration="26.148230849s" podCreationTimestamp="2025-12-05 12:09:47 +0000 UTC" firstStartedPulling="2025-12-05 12:10:10.869551351 +0000 UTC m=+1295.362266074" lastFinishedPulling="2025-12-05 12:10:11.333466653 +0000 UTC m=+1295.826181376" observedRunningTime="2025-12-05 12:10:13.136187816 +0000 UTC m=+1297.628902529" watchObservedRunningTime="2025-12-05 12:10:13.148230849 +0000 UTC m=+1297.640945582" Dec 05 12:10:13 crc kubenswrapper[4763]: I1205 12:10:13.197592 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-77dcd5c496-hs7bj" podStartSLOduration=25.740915503 podStartE2EDuration="26.197572126s" podCreationTimestamp="2025-12-05 12:09:47 +0000 UTC" firstStartedPulling="2025-12-05 12:10:10.884150913 +0000 UTC m=+1295.376865636" lastFinishedPulling="2025-12-05 12:10:11.340807536 +0000 UTC m=+1295.833522259" observedRunningTime="2025-12-05 12:10:13.19663475 +0000 UTC m=+1297.689349473" watchObservedRunningTime="2025-12-05 12:10:13.197572126 +0000 UTC m=+1297.690286849" Dec 05 12:10:13 crc kubenswrapper[4763]: I1205 12:10:13.307457 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-tg56v" podStartSLOduration=22.307430511 podStartE2EDuration="22.307430511s" podCreationTimestamp="2025-12-05 12:09:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:10:13.222288866 +0000 UTC m=+1297.715003589" watchObservedRunningTime="2025-12-05 12:10:13.307430511 +0000 UTC m=+1297.800145244" Dec 05 12:10:13 crc kubenswrapper[4763]: I1205 12:10:13.817958 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 05 12:10:14 crc kubenswrapper[4763]: I1205 12:10:14.164986 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"1851124e-2722-4628-8e5b-63edb828d64a","Type":"ContainerStarted","Data":"b9d69c57b36ac4625f0e0407df4390ae622a1ce6a01d858b5c695e6100e42862"} Dec 05 12:10:14 crc kubenswrapper[4763]: I1205 12:10:14.165035 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1851124e-2722-4628-8e5b-63edb828d64a","Type":"ContainerStarted","Data":"b9760952347d70164c4c897258084d9520fc0664cc9b140076b5ff7eeaad906f"} Dec 05 12:10:14 crc kubenswrapper[4763]: I1205 12:10:14.167734 4763 generic.go:334] "Generic (PLEG): container finished" podID="b2bf0108-5266-4f52-8803-39a842ddc777" containerID="abbd6d796f26bf34ba2f4470ded8d6e07e7ff1c8a25b2cb78e5219f8ae665d17" exitCode=0 Dec 05 12:10:14 crc kubenswrapper[4763]: I1205 12:10:14.167850 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-nvqn2" event={"ID":"b2bf0108-5266-4f52-8803-39a842ddc777","Type":"ContainerDied","Data":"abbd6d796f26bf34ba2f4470ded8d6e07e7ff1c8a25b2cb78e5219f8ae665d17"} Dec 05 12:10:14 crc kubenswrapper[4763]: I1205 12:10:14.176908 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56798b757f-v7nqq" event={"ID":"7e2b756b-c7db-44c5-97d8-b906e3d01e21","Type":"ContainerStarted","Data":"36e17e345b14156ede7490ff7c0e3b31ae3f63ba6b7da412b0865c94b4a7a620"} Dec 05 12:10:14 crc kubenswrapper[4763]: I1205 12:10:14.176963 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56798b757f-v7nqq" Dec 05 12:10:14 crc kubenswrapper[4763]: I1205 12:10:14.236497 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56798b757f-v7nqq" podStartSLOduration=22.236475282 podStartE2EDuration="22.236475282s" podCreationTimestamp="2025-12-05 12:09:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:10:14.222474184 +0000 UTC m=+1298.715188907" watchObservedRunningTime="2025-12-05 12:10:14.236475282 +0000 UTC m=+1298.729190005" Dec 05 12:10:14 crc kubenswrapper[4763]: I1205 12:10:14.673633 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-x6zwz" Dec 05 12:10:14 crc kubenswrapper[4763]: I1205 12:10:14.771659 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86jq4\" (UniqueName: \"kubernetes.io/projected/274fe292-e3f0-432c-9947-3bca5514f6d9-kube-api-access-86jq4\") pod \"274fe292-e3f0-432c-9947-3bca5514f6d9\" (UID: \"274fe292-e3f0-432c-9947-3bca5514f6d9\") " Dec 05 12:10:14 crc kubenswrapper[4763]: I1205 12:10:14.771781 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/274fe292-e3f0-432c-9947-3bca5514f6d9-config\") pod \"274fe292-e3f0-432c-9947-3bca5514f6d9\" (UID: \"274fe292-e3f0-432c-9947-3bca5514f6d9\") " Dec 05 12:10:14 crc kubenswrapper[4763]: I1205 12:10:14.771938 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/274fe292-e3f0-432c-9947-3bca5514f6d9-combined-ca-bundle\") pod \"274fe292-e3f0-432c-9947-3bca5514f6d9\" (UID: \"274fe292-e3f0-432c-9947-3bca5514f6d9\") " Dec 05 12:10:14 crc kubenswrapper[4763]: I1205 12:10:14.784192 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/274fe292-e3f0-432c-9947-3bca5514f6d9-kube-api-access-86jq4" (OuterVolumeSpecName: "kube-api-access-86jq4") pod "274fe292-e3f0-432c-9947-3bca5514f6d9" (UID: "274fe292-e3f0-432c-9947-3bca5514f6d9"). InnerVolumeSpecName "kube-api-access-86jq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:10:14 crc kubenswrapper[4763]: E1205 12:10:14.800066 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/274fe292-e3f0-432c-9947-3bca5514f6d9-config podName:274fe292-e3f0-432c-9947-3bca5514f6d9 nodeName:}" failed. No retries permitted until 2025-12-05 12:10:15.300035946 +0000 UTC m=+1299.792750679 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config" (UniqueName: "kubernetes.io/secret/274fe292-e3f0-432c-9947-3bca5514f6d9-config") pod "274fe292-e3f0-432c-9947-3bca5514f6d9" (UID: "274fe292-e3f0-432c-9947-3bca5514f6d9") : error deleting /var/lib/kubelet/pods/274fe292-e3f0-432c-9947-3bca5514f6d9/volume-subpaths: remove /var/lib/kubelet/pods/274fe292-e3f0-432c-9947-3bca5514f6d9/volume-subpaths: no such file or directory Dec 05 12:10:14 crc kubenswrapper[4763]: I1205 12:10:14.802909 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/274fe292-e3f0-432c-9947-3bca5514f6d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "274fe292-e3f0-432c-9947-3bca5514f6d9" (UID: "274fe292-e3f0-432c-9947-3bca5514f6d9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:14 crc kubenswrapper[4763]: I1205 12:10:14.876378 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/274fe292-e3f0-432c-9947-3bca5514f6d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:14 crc kubenswrapper[4763]: I1205 12:10:14.876425 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86jq4\" (UniqueName: \"kubernetes.io/projected/274fe292-e3f0-432c-9947-3bca5514f6d9-kube-api-access-86jq4\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.210183 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f57aaff6-3e62-49f1-8055-60f0507d95ba","Type":"ContainerStarted","Data":"24fff5cc2fa368a5eacec4de1ed1e7c9ee48442324cbe5cd1a239a1ce3df48b2"} Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.217442 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1851124e-2722-4628-8e5b-63edb828d64a","Type":"ContainerStarted","Data":"30a44297fdcd21e6c11dc08b4822174d60df6f03c1fc5ab0a218ca5a465015dd"} Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.220952 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-x6zwz" event={"ID":"274fe292-e3f0-432c-9947-3bca5514f6d9","Type":"ContainerDied","Data":"ec890f219fd28c2eab4b66082eab976d4beead847cb4e84871fdef0a775fd020"} Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.220981 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec890f219fd28c2eab4b66082eab976d4beead847cb4e84871fdef0a775fd020" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.221303 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-x6zwz" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.400230 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/274fe292-e3f0-432c-9947-3bca5514f6d9-config\") pod \"274fe292-e3f0-432c-9947-3bca5514f6d9\" (UID: \"274fe292-e3f0-432c-9947-3bca5514f6d9\") " Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.422204 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/274fe292-e3f0-432c-9947-3bca5514f6d9-config" (OuterVolumeSpecName: "config") pod "274fe292-e3f0-432c-9947-3bca5514f6d9" (UID: "274fe292-e3f0-432c-9947-3bca5514f6d9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.467866 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56798b757f-v7nqq"] Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.510623 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/274fe292-e3f0-432c-9947-3bca5514f6d9-config\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.534303 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b6c948c7-7tdkh"] Dec 05 12:10:15 crc kubenswrapper[4763]: E1205 12:10:15.534805 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb42f35f-9a55-470e-b238-98c4c0a5b455" containerName="init" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.534821 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb42f35f-9a55-470e-b238-98c4c0a5b455" containerName="init" Dec 05 12:10:15 crc kubenswrapper[4763]: E1205 12:10:15.534837 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="274fe292-e3f0-432c-9947-3bca5514f6d9" containerName="neutron-db-sync" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.534854 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="274fe292-e3f0-432c-9947-3bca5514f6d9" containerName="neutron-db-sync" Dec 05 12:10:15 crc kubenswrapper[4763]: E1205 12:10:15.534871 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb42f35f-9a55-470e-b238-98c4c0a5b455" containerName="dnsmasq-dns" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.534879 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb42f35f-9a55-470e-b238-98c4c0a5b455" containerName="dnsmasq-dns" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.535094 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb42f35f-9a55-470e-b238-98c4c0a5b455" containerName="dnsmasq-dns" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.535107 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="274fe292-e3f0-432c-9947-3bca5514f6d9" containerName="neutron-db-sync" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.536191 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b6c948c7-7tdkh" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.595885 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b6c948c7-7tdkh"] Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.612004 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2-ovsdbserver-sb\") pod \"dnsmasq-dns-b6c948c7-7tdkh\" (UID: \"c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2\") " pod="openstack/dnsmasq-dns-b6c948c7-7tdkh" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.612109 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2-config\") pod \"dnsmasq-dns-b6c948c7-7tdkh\" (UID: \"c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2\") " pod="openstack/dnsmasq-dns-b6c948c7-7tdkh" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.612139 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mml2z\" (UniqueName: \"kubernetes.io/projected/c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2-kube-api-access-mml2z\") pod \"dnsmasq-dns-b6c948c7-7tdkh\" (UID: \"c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2\") " pod="openstack/dnsmasq-dns-b6c948c7-7tdkh" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.612223 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2-dns-svc\") pod \"dnsmasq-dns-b6c948c7-7tdkh\" (UID: \"c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2\") " pod="openstack/dnsmasq-dns-b6c948c7-7tdkh" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.612280 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2-ovsdbserver-nb\") pod \"dnsmasq-dns-b6c948c7-7tdkh\" (UID: \"c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2\") " pod="openstack/dnsmasq-dns-b6c948c7-7tdkh" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.625847 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dc5f79d94-t8x4q"] Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.627558 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dc5f79d94-t8x4q" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.634153 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.634503 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.634694 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-drpkw" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.635635 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.635783 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dc5f79d94-t8x4q"] Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.713938 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fcd7679a-a9b6-4721-8cb1-07f7c801bb3f-httpd-config\") pod \"neutron-dc5f79d94-t8x4q\" (UID: \"fcd7679a-a9b6-4721-8cb1-07f7c801bb3f\") " pod="openstack/neutron-dc5f79d94-t8x4q" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.714005 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2-ovsdbserver-sb\") pod \"dnsmasq-dns-b6c948c7-7tdkh\" (UID: \"c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2\") " pod="openstack/dnsmasq-dns-b6c948c7-7tdkh" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.714079 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2-config\") pod \"dnsmasq-dns-b6c948c7-7tdkh\" (UID: \"c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2\") " pod="openstack/dnsmasq-dns-b6c948c7-7tdkh" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.714113 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mml2z\" (UniqueName: \"kubernetes.io/projected/c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2-kube-api-access-mml2z\") pod \"dnsmasq-dns-b6c948c7-7tdkh\" (UID: \"c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2\") " pod="openstack/dnsmasq-dns-b6c948c7-7tdkh" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.714166 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fcd7679a-a9b6-4721-8cb1-07f7c801bb3f-config\") pod \"neutron-dc5f79d94-t8x4q\" (UID: \"fcd7679a-a9b6-4721-8cb1-07f7c801bb3f\") " pod="openstack/neutron-dc5f79d94-t8x4q" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.714191 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcd7679a-a9b6-4721-8cb1-07f7c801bb3f-combined-ca-bundle\") pod \"neutron-dc5f79d94-t8x4q\" (UID: \"fcd7679a-a9b6-4721-8cb1-07f7c801bb3f\") " pod="openstack/neutron-dc5f79d94-t8x4q" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.714222 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2-dns-svc\") pod \"dnsmasq-dns-b6c948c7-7tdkh\" (UID: \"c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2\") " 
pod="openstack/dnsmasq-dns-b6c948c7-7tdkh" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.714253 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zklxb\" (UniqueName: \"kubernetes.io/projected/fcd7679a-a9b6-4721-8cb1-07f7c801bb3f-kube-api-access-zklxb\") pod \"neutron-dc5f79d94-t8x4q\" (UID: \"fcd7679a-a9b6-4721-8cb1-07f7c801bb3f\") " pod="openstack/neutron-dc5f79d94-t8x4q" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.714292 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcd7679a-a9b6-4721-8cb1-07f7c801bb3f-ovndb-tls-certs\") pod \"neutron-dc5f79d94-t8x4q\" (UID: \"fcd7679a-a9b6-4721-8cb1-07f7c801bb3f\") " pod="openstack/neutron-dc5f79d94-t8x4q" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.714313 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2-ovsdbserver-nb\") pod \"dnsmasq-dns-b6c948c7-7tdkh\" (UID: \"c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2\") " pod="openstack/dnsmasq-dns-b6c948c7-7tdkh" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.715317 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2-ovsdbserver-nb\") pod \"dnsmasq-dns-b6c948c7-7tdkh\" (UID: \"c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2\") " pod="openstack/dnsmasq-dns-b6c948c7-7tdkh" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.716046 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2-ovsdbserver-sb\") pod \"dnsmasq-dns-b6c948c7-7tdkh\" (UID: \"c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2\") " pod="openstack/dnsmasq-dns-b6c948c7-7tdkh" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.716671 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2-config\") pod \"dnsmasq-dns-b6c948c7-7tdkh\" (UID: \"c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2\") " pod="openstack/dnsmasq-dns-b6c948c7-7tdkh" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.717661 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2-dns-svc\") pod \"dnsmasq-dns-b6c948c7-7tdkh\" (UID: \"c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2\") " pod="openstack/dnsmasq-dns-b6c948c7-7tdkh" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.746874 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mml2z\" (UniqueName: \"kubernetes.io/projected/c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2-kube-api-access-mml2z\") pod \"dnsmasq-dns-b6c948c7-7tdkh\" (UID: \"c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2\") " pod="openstack/dnsmasq-dns-b6c948c7-7tdkh" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.816874 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fcd7679a-a9b6-4721-8cb1-07f7c801bb3f-httpd-config\") pod \"neutron-dc5f79d94-t8x4q\" (UID: \"fcd7679a-a9b6-4721-8cb1-07f7c801bb3f\") " pod="openstack/neutron-dc5f79d94-t8x4q" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.817029 4763 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fcd7679a-a9b6-4721-8cb1-07f7c801bb3f-config\") pod \"neutron-dc5f79d94-t8x4q\" (UID: \"fcd7679a-a9b6-4721-8cb1-07f7c801bb3f\") " pod="openstack/neutron-dc5f79d94-t8x4q" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.817057 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcd7679a-a9b6-4721-8cb1-07f7c801bb3f-combined-ca-bundle\") pod \"neutron-dc5f79d94-t8x4q\" (UID: \"fcd7679a-a9b6-4721-8cb1-07f7c801bb3f\") " pod="openstack/neutron-dc5f79d94-t8x4q" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.817102 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zklxb\" (UniqueName: \"kubernetes.io/projected/fcd7679a-a9b6-4721-8cb1-07f7c801bb3f-kube-api-access-zklxb\") pod \"neutron-dc5f79d94-t8x4q\" (UID: \"fcd7679a-a9b6-4721-8cb1-07f7c801bb3f\") " pod="openstack/neutron-dc5f79d94-t8x4q" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.817142 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcd7679a-a9b6-4721-8cb1-07f7c801bb3f-ovndb-tls-certs\") pod \"neutron-dc5f79d94-t8x4q\" (UID: \"fcd7679a-a9b6-4721-8cb1-07f7c801bb3f\") " pod="openstack/neutron-dc5f79d94-t8x4q" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.825403 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcd7679a-a9b6-4721-8cb1-07f7c801bb3f-combined-ca-bundle\") pod \"neutron-dc5f79d94-t8x4q\" (UID: \"fcd7679a-a9b6-4721-8cb1-07f7c801bb3f\") " pod="openstack/neutron-dc5f79d94-t8x4q" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.828850 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fcd7679a-a9b6-4721-8cb1-07f7c801bb3f-httpd-config\") pod \"neutron-dc5f79d94-t8x4q\" (UID: \"fcd7679a-a9b6-4721-8cb1-07f7c801bb3f\") " pod="openstack/neutron-dc5f79d94-t8x4q" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.830682 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcd7679a-a9b6-4721-8cb1-07f7c801bb3f-ovndb-tls-certs\") pod \"neutron-dc5f79d94-t8x4q\" (UID: \"fcd7679a-a9b6-4721-8cb1-07f7c801bb3f\") " pod="openstack/neutron-dc5f79d94-t8x4q" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.831467 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-nvqn2" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.835900 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fcd7679a-a9b6-4721-8cb1-07f7c801bb3f-config\") pod \"neutron-dc5f79d94-t8x4q\" (UID: \"fcd7679a-a9b6-4721-8cb1-07f7c801bb3f\") " pod="openstack/neutron-dc5f79d94-t8x4q" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.838721 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zklxb\" (UniqueName: \"kubernetes.io/projected/fcd7679a-a9b6-4721-8cb1-07f7c801bb3f-kube-api-access-zklxb\") pod \"neutron-dc5f79d94-t8x4q\" (UID: \"fcd7679a-a9b6-4721-8cb1-07f7c801bb3f\") " pod="openstack/neutron-dc5f79d94-t8x4q" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.890931 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b6c948c7-7tdkh" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.920372 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2bf0108-5266-4f52-8803-39a842ddc777-config-data\") pod \"b2bf0108-5266-4f52-8803-39a842ddc777\" (UID: \"b2bf0108-5266-4f52-8803-39a842ddc777\") " Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.920497 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7t8z\" (UniqueName: \"kubernetes.io/projected/b2bf0108-5266-4f52-8803-39a842ddc777-kube-api-access-v7t8z\") pod \"b2bf0108-5266-4f52-8803-39a842ddc777\" (UID: \"b2bf0108-5266-4f52-8803-39a842ddc777\") " Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.920548 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2bf0108-5266-4f52-8803-39a842ddc777-combined-ca-bundle\") pod \"b2bf0108-5266-4f52-8803-39a842ddc777\" (UID: \"b2bf0108-5266-4f52-8803-39a842ddc777\") " Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.920581 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2bf0108-5266-4f52-8803-39a842ddc777-scripts\") pod \"b2bf0108-5266-4f52-8803-39a842ddc777\" (UID: \"b2bf0108-5266-4f52-8803-39a842ddc777\") " Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.920689 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2bf0108-5266-4f52-8803-39a842ddc777-logs\") pod \"b2bf0108-5266-4f52-8803-39a842ddc777\" (UID: \"b2bf0108-5266-4f52-8803-39a842ddc777\") " Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.922053 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2bf0108-5266-4f52-8803-39a842ddc777-logs" (OuterVolumeSpecName: "logs") pod "b2bf0108-5266-4f52-8803-39a842ddc777" (UID: "b2bf0108-5266-4f52-8803-39a842ddc777"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.924449 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2bf0108-5266-4f52-8803-39a842ddc777-kube-api-access-v7t8z" (OuterVolumeSpecName: "kube-api-access-v7t8z") pod "b2bf0108-5266-4f52-8803-39a842ddc777" (UID: "b2bf0108-5266-4f52-8803-39a842ddc777"). InnerVolumeSpecName "kube-api-access-v7t8z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.956787 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dc5f79d94-t8x4q" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.958991 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2bf0108-5266-4f52-8803-39a842ddc777-scripts" (OuterVolumeSpecName: "scripts") pod "b2bf0108-5266-4f52-8803-39a842ddc777" (UID: "b2bf0108-5266-4f52-8803-39a842ddc777"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:15 crc kubenswrapper[4763]: I1205 12:10:15.959048 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2bf0108-5266-4f52-8803-39a842ddc777-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b2bf0108-5266-4f52-8803-39a842ddc777" (UID: "b2bf0108-5266-4f52-8803-39a842ddc777"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.023705 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7t8z\" (UniqueName: \"kubernetes.io/projected/b2bf0108-5266-4f52-8803-39a842ddc777-kube-api-access-v7t8z\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.023758 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2bf0108-5266-4f52-8803-39a842ddc777-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.023787 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2bf0108-5266-4f52-8803-39a842ddc777-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.023799 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2bf0108-5266-4f52-8803-39a842ddc777-logs\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.024355 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2bf0108-5266-4f52-8803-39a842ddc777-config-data" (OuterVolumeSpecName: "config-data") pod "b2bf0108-5266-4f52-8803-39a842ddc777" (UID: "b2bf0108-5266-4f52-8803-39a842ddc777"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.125444 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2bf0108-5266-4f52-8803-39a842ddc777-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.244266 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-nvqn2" Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.244284 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-nvqn2" event={"ID":"b2bf0108-5266-4f52-8803-39a842ddc777","Type":"ContainerDied","Data":"cc76d5fdfc04a366476490a65c03203688e95857b0ead2447fa481241e7d429c"} Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.244374 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc76d5fdfc04a366476490a65c03203688e95857b0ead2447fa481241e7d429c" Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.244508 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56798b757f-v7nqq" podUID="7e2b756b-c7db-44c5-97d8-b906e3d01e21" containerName="dnsmasq-dns" containerID="cri-o://36e17e345b14156ede7490ff7c0e3b31ae3f63ba6b7da412b0865c94b4a7a620" gracePeriod=10 Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.376456 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-76d66464d-r24j6"] Dec 05 12:10:16 crc kubenswrapper[4763]: E1205 12:10:16.376954 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2bf0108-5266-4f52-8803-39a842ddc777" containerName="placement-db-sync" Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.376980 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2bf0108-5266-4f52-8803-39a842ddc777" containerName="placement-db-sync" Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.377201 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2bf0108-5266-4f52-8803-39a842ddc777" containerName="placement-db-sync" Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.378376 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-76d66464d-r24j6" Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.380245 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.380467 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.380639 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.380944 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-55jfq" Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.391212 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.394150 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-76d66464d-r24j6"] Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.430772 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e5e41f7-ee3c-4587-bae0-5716c12c84b6-internal-tls-certs\") pod \"placement-76d66464d-r24j6\" (UID: \"7e5e41f7-ee3c-4587-bae0-5716c12c84b6\") " pod="openstack/placement-76d66464d-r24j6" Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.430824 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e5e41f7-ee3c-4587-bae0-5716c12c84b6-config-data\") pod \"placement-76d66464d-r24j6\" (UID: \"7e5e41f7-ee3c-4587-bae0-5716c12c84b6\") " pod="openstack/placement-76d66464d-r24j6" Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.430866 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e5e41f7-ee3c-4587-bae0-5716c12c84b6-scripts\") pod \"placement-76d66464d-r24j6\" (UID: \"7e5e41f7-ee3c-4587-bae0-5716c12c84b6\") " pod="openstack/placement-76d66464d-r24j6" Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.430893 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-652xl\" (UniqueName: \"kubernetes.io/projected/7e5e41f7-ee3c-4587-bae0-5716c12c84b6-kube-api-access-652xl\") pod \"placement-76d66464d-r24j6\" (UID: \"7e5e41f7-ee3c-4587-bae0-5716c12c84b6\") " pod="openstack/placement-76d66464d-r24j6" Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.430911 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e5e41f7-ee3c-4587-bae0-5716c12c84b6-logs\") pod \"placement-76d66464d-r24j6\" (UID: \"7e5e41f7-ee3c-4587-bae0-5716c12c84b6\") " pod="openstack/placement-76d66464d-r24j6" Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.430936 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e5e41f7-ee3c-4587-bae0-5716c12c84b6-combined-ca-bundle\") pod \"placement-76d66464d-r24j6\" (UID: \"7e5e41f7-ee3c-4587-bae0-5716c12c84b6\") " pod="openstack/placement-76d66464d-r24j6" Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.430997 4763 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e5e41f7-ee3c-4587-bae0-5716c12c84b6-public-tls-certs\") pod \"placement-76d66464d-r24j6\" (UID: \"7e5e41f7-ee3c-4587-bae0-5716c12c84b6\") " pod="openstack/placement-76d66464d-r24j6" Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.532170 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e5e41f7-ee3c-4587-bae0-5716c12c84b6-scripts\") pod \"placement-76d66464d-r24j6\" (UID: \"7e5e41f7-ee3c-4587-bae0-5716c12c84b6\") " pod="openstack/placement-76d66464d-r24j6" Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.532263 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-652xl\" (UniqueName: \"kubernetes.io/projected/7e5e41f7-ee3c-4587-bae0-5716c12c84b6-kube-api-access-652xl\") pod \"placement-76d66464d-r24j6\" (UID: \"7e5e41f7-ee3c-4587-bae0-5716c12c84b6\") " pod="openstack/placement-76d66464d-r24j6" Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.532303 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e5e41f7-ee3c-4587-bae0-5716c12c84b6-logs\") pod \"placement-76d66464d-r24j6\" (UID: \"7e5e41f7-ee3c-4587-bae0-5716c12c84b6\") " pod="openstack/placement-76d66464d-r24j6" Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.532324 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e5e41f7-ee3c-4587-bae0-5716c12c84b6-combined-ca-bundle\") pod \"placement-76d66464d-r24j6\" (UID: \"7e5e41f7-ee3c-4587-bae0-5716c12c84b6\") " pod="openstack/placement-76d66464d-r24j6" Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.532377 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e5e41f7-ee3c-4587-bae0-5716c12c84b6-public-tls-certs\") pod \"placement-76d66464d-r24j6\" (UID: \"7e5e41f7-ee3c-4587-bae0-5716c12c84b6\") " pod="openstack/placement-76d66464d-r24j6" Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.532432 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e5e41f7-ee3c-4587-bae0-5716c12c84b6-internal-tls-certs\") pod \"placement-76d66464d-r24j6\" (UID: \"7e5e41f7-ee3c-4587-bae0-5716c12c84b6\") " pod="openstack/placement-76d66464d-r24j6" Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.532981 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e5e41f7-ee3c-4587-bae0-5716c12c84b6-logs\") pod \"placement-76d66464d-r24j6\" (UID: \"7e5e41f7-ee3c-4587-bae0-5716c12c84b6\") " pod="openstack/placement-76d66464d-r24j6" Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.534838 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e5e41f7-ee3c-4587-bae0-5716c12c84b6-config-data\") pod \"placement-76d66464d-r24j6\" (UID: \"7e5e41f7-ee3c-4587-bae0-5716c12c84b6\") " pod="openstack/placement-76d66464d-r24j6" Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.537379 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7e5e41f7-ee3c-4587-bae0-5716c12c84b6-scripts\") pod \"placement-76d66464d-r24j6\" (UID: \"7e5e41f7-ee3c-4587-bae0-5716c12c84b6\") " pod="openstack/placement-76d66464d-r24j6" Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.540563 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e5e41f7-ee3c-4587-bae0-5716c12c84b6-config-data\") pod \"placement-76d66464d-r24j6\" (UID: \"7e5e41f7-ee3c-4587-bae0-5716c12c84b6\") " pod="openstack/placement-76d66464d-r24j6" Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.541258 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e5e41f7-ee3c-4587-bae0-5716c12c84b6-combined-ca-bundle\") pod \"placement-76d66464d-r24j6\" (UID: \"7e5e41f7-ee3c-4587-bae0-5716c12c84b6\") " pod="openstack/placement-76d66464d-r24j6" Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.543424 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e5e41f7-ee3c-4587-bae0-5716c12c84b6-internal-tls-certs\") pod \"placement-76d66464d-r24j6\" (UID: \"7e5e41f7-ee3c-4587-bae0-5716c12c84b6\") " pod="openstack/placement-76d66464d-r24j6" Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.544066 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e5e41f7-ee3c-4587-bae0-5716c12c84b6-public-tls-certs\") pod \"placement-76d66464d-r24j6\" (UID: \"7e5e41f7-ee3c-4587-bae0-5716c12c84b6\") " pod="openstack/placement-76d66464d-r24j6" Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.557582 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-652xl\" (UniqueName: \"kubernetes.io/projected/7e5e41f7-ee3c-4587-bae0-5716c12c84b6-kube-api-access-652xl\") pod \"placement-76d66464d-r24j6\" (UID: \"7e5e41f7-ee3c-4587-bae0-5716c12c84b6\") " pod="openstack/placement-76d66464d-r24j6" Dec 05 12:10:16 crc kubenswrapper[4763]: I1205 12:10:16.704543 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-76d66464d-r24j6" Dec 05 12:10:17 crc kubenswrapper[4763]: I1205 12:10:17.267699 4763 generic.go:334] "Generic (PLEG): container finished" podID="7e2b756b-c7db-44c5-97d8-b906e3d01e21" containerID="36e17e345b14156ede7490ff7c0e3b31ae3f63ba6b7da412b0865c94b4a7a620" exitCode=0 Dec 05 12:10:17 crc kubenswrapper[4763]: I1205 12:10:17.268075 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56798b757f-v7nqq" event={"ID":"7e2b756b-c7db-44c5-97d8-b906e3d01e21","Type":"ContainerDied","Data":"36e17e345b14156ede7490ff7c0e3b31ae3f63ba6b7da412b0865c94b4a7a620"} Dec 05 12:10:17 crc kubenswrapper[4763]: I1205 12:10:17.271522 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"325fd5a5-7466-4539-9de3-8add5eb6996c","Type":"ContainerStarted","Data":"d3ef703290f159546db5f4c50ab7c1a241f7d3080bd43947e7c9fef693358b35"} Dec 05 12:10:17 crc kubenswrapper[4763]: I1205 12:10:17.586379 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b55c974d9-brgnw"] Dec 05 12:10:17 crc kubenswrapper[4763]: I1205 12:10:17.606379 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b55c974d9-brgnw"] Dec 05 12:10:17 crc kubenswrapper[4763]: I1205 12:10:17.606540 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b55c974d9-brgnw" Dec 05 12:10:17 crc kubenswrapper[4763]: I1205 12:10:17.610597 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 05 12:10:17 crc kubenswrapper[4763]: I1205 12:10:17.621614 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 05 12:10:17 crc kubenswrapper[4763]: I1205 12:10:17.655679 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d6c980e-688d-41b3-a7ad-0061b07b9494-public-tls-certs\") pod \"neutron-b55c974d9-brgnw\" (UID: \"6d6c980e-688d-41b3-a7ad-0061b07b9494\") " pod="openstack/neutron-b55c974d9-brgnw" Dec 05 12:10:17 crc kubenswrapper[4763]: I1205 12:10:17.655746 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp4cg\" (UniqueName: \"kubernetes.io/projected/6d6c980e-688d-41b3-a7ad-0061b07b9494-kube-api-access-gp4cg\") pod \"neutron-b55c974d9-brgnw\" (UID: \"6d6c980e-688d-41b3-a7ad-0061b07b9494\") " pod="openstack/neutron-b55c974d9-brgnw" Dec 05 12:10:17 crc kubenswrapper[4763]: I1205 12:10:17.655805 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d6c980e-688d-41b3-a7ad-0061b07b9494-internal-tls-certs\") pod \"neutron-b55c974d9-brgnw\" (UID: \"6d6c980e-688d-41b3-a7ad-0061b07b9494\") " pod="openstack/neutron-b55c974d9-brgnw" Dec 05 12:10:17 crc kubenswrapper[4763]: I1205 12:10:17.655833 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d6c980e-688d-41b3-a7ad-0061b07b9494-combined-ca-bundle\") pod \"neutron-b55c974d9-brgnw\" (UID: \"6d6c980e-688d-41b3-a7ad-0061b07b9494\") " pod="openstack/neutron-b55c974d9-brgnw" Dec 05 12:10:17 crc kubenswrapper[4763]: I1205 12:10:17.655858 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6d6c980e-688d-41b3-a7ad-0061b07b9494-httpd-config\") pod \"neutron-b55c974d9-brgnw\" (UID: \"6d6c980e-688d-41b3-a7ad-0061b07b9494\") " pod="openstack/neutron-b55c974d9-brgnw" Dec 05 12:10:17 crc kubenswrapper[4763]: I1205 12:10:17.655891 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6d6c980e-688d-41b3-a7ad-0061b07b9494-config\") pod \"neutron-b55c974d9-brgnw\" (UID: \"6d6c980e-688d-41b3-a7ad-0061b07b9494\") " pod="openstack/neutron-b55c974d9-brgnw" Dec 05 12:10:17 crc kubenswrapper[4763]: I1205 12:10:17.655961 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d6c980e-688d-41b3-a7ad-0061b07b9494-ovndb-tls-certs\") pod \"neutron-b55c974d9-brgnw\" (UID: \"6d6c980e-688d-41b3-a7ad-0061b07b9494\") " pod="openstack/neutron-b55c974d9-brgnw" Dec 05 12:10:17 crc kubenswrapper[4763]: I1205 12:10:17.693510 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7dc58bc884-khdbv" Dec 05 12:10:17 crc kubenswrapper[4763]: I1205 12:10:17.693606 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7dc58bc884-khdbv" Dec 05 12:10:17 crc kubenswrapper[4763]: I1205 12:10:17.757684 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d6c980e-688d-41b3-a7ad-0061b07b9494-ovndb-tls-certs\") pod \"neutron-b55c974d9-brgnw\" (UID: \"6d6c980e-688d-41b3-a7ad-0061b07b9494\") " pod="openstack/neutron-b55c974d9-brgnw" Dec 05 12:10:17 crc kubenswrapper[4763]: I1205 12:10:17.757747 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d6c980e-688d-41b3-a7ad-0061b07b9494-public-tls-certs\") pod \"neutron-b55c974d9-brgnw\" (UID: \"6d6c980e-688d-41b3-a7ad-0061b07b9494\") " pod="openstack/neutron-b55c974d9-brgnw" Dec 05 12:10:17 crc kubenswrapper[4763]: I1205 12:10:17.757829 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gp4cg\" (UniqueName: \"kubernetes.io/projected/6d6c980e-688d-41b3-a7ad-0061b07b9494-kube-api-access-gp4cg\") pod \"neutron-b55c974d9-brgnw\" (UID: \"6d6c980e-688d-41b3-a7ad-0061b07b9494\") " pod="openstack/neutron-b55c974d9-brgnw" Dec 05 12:10:17 crc kubenswrapper[4763]: I1205 12:10:17.757866 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d6c980e-688d-41b3-a7ad-0061b07b9494-internal-tls-certs\") pod \"neutron-b55c974d9-brgnw\" (UID: \"6d6c980e-688d-41b3-a7ad-0061b07b9494\") " pod="openstack/neutron-b55c974d9-brgnw" Dec 05 12:10:17 crc kubenswrapper[4763]: I1205 12:10:17.757919 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d6c980e-688d-41b3-a7ad-0061b07b9494-combined-ca-bundle\") pod \"neutron-b55c974d9-brgnw\" (UID: \"6d6c980e-688d-41b3-a7ad-0061b07b9494\") " pod="openstack/neutron-b55c974d9-brgnw" Dec 05 12:10:17 crc kubenswrapper[4763]: I1205 12:10:17.757950 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6d6c980e-688d-41b3-a7ad-0061b07b9494-httpd-config\") pod 
\"neutron-b55c974d9-brgnw\" (UID: \"6d6c980e-688d-41b3-a7ad-0061b07b9494\") " pod="openstack/neutron-b55c974d9-brgnw" Dec 05 12:10:17 crc kubenswrapper[4763]: I1205 12:10:17.758001 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6d6c980e-688d-41b3-a7ad-0061b07b9494-config\") pod \"neutron-b55c974d9-brgnw\" (UID: \"6d6c980e-688d-41b3-a7ad-0061b07b9494\") " pod="openstack/neutron-b55c974d9-brgnw" Dec 05 12:10:17 crc kubenswrapper[4763]: I1205 12:10:17.763379 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6d6c980e-688d-41b3-a7ad-0061b07b9494-httpd-config\") pod \"neutron-b55c974d9-brgnw\" (UID: \"6d6c980e-688d-41b3-a7ad-0061b07b9494\") " pod="openstack/neutron-b55c974d9-brgnw" Dec 05 12:10:17 crc kubenswrapper[4763]: I1205 12:10:17.763502 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d6c980e-688d-41b3-a7ad-0061b07b9494-public-tls-certs\") pod \"neutron-b55c974d9-brgnw\" (UID: \"6d6c980e-688d-41b3-a7ad-0061b07b9494\") " pod="openstack/neutron-b55c974d9-brgnw" Dec 05 12:10:17 crc kubenswrapper[4763]: I1205 12:10:17.764043 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6d6c980e-688d-41b3-a7ad-0061b07b9494-config\") pod \"neutron-b55c974d9-brgnw\" (UID: \"6d6c980e-688d-41b3-a7ad-0061b07b9494\") " pod="openstack/neutron-b55c974d9-brgnw" Dec 05 12:10:17 crc kubenswrapper[4763]: I1205 12:10:17.764723 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d6c980e-688d-41b3-a7ad-0061b07b9494-combined-ca-bundle\") pod \"neutron-b55c974d9-brgnw\" (UID: \"6d6c980e-688d-41b3-a7ad-0061b07b9494\") " pod="openstack/neutron-b55c974d9-brgnw" Dec 05 12:10:17 crc kubenswrapper[4763]: I1205 12:10:17.766272 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d6c980e-688d-41b3-a7ad-0061b07b9494-ovndb-tls-certs\") pod \"neutron-b55c974d9-brgnw\" (UID: \"6d6c980e-688d-41b3-a7ad-0061b07b9494\") " pod="openstack/neutron-b55c974d9-brgnw" Dec 05 12:10:17 crc kubenswrapper[4763]: I1205 12:10:17.771455 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-77dcd5c496-hs7bj" Dec 05 12:10:17 crc kubenswrapper[4763]: I1205 12:10:17.772392 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-77dcd5c496-hs7bj" Dec 05 12:10:17 crc kubenswrapper[4763]: I1205 12:10:17.777855 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d6c980e-688d-41b3-a7ad-0061b07b9494-internal-tls-certs\") pod \"neutron-b55c974d9-brgnw\" (UID: \"6d6c980e-688d-41b3-a7ad-0061b07b9494\") " pod="openstack/neutron-b55c974d9-brgnw" Dec 05 12:10:17 crc kubenswrapper[4763]: I1205 12:10:17.778449 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp4cg\" (UniqueName: \"kubernetes.io/projected/6d6c980e-688d-41b3-a7ad-0061b07b9494-kube-api-access-gp4cg\") pod \"neutron-b55c974d9-brgnw\" (UID: \"6d6c980e-688d-41b3-a7ad-0061b07b9494\") " pod="openstack/neutron-b55c974d9-brgnw" Dec 05 12:10:17 crc kubenswrapper[4763]: I1205 12:10:17.946388 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b55c974d9-brgnw" Dec 05 12:10:18 crc kubenswrapper[4763]: I1205 12:10:18.788904 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 05 12:10:18 crc kubenswrapper[4763]: I1205 12:10:18.794589 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 05 12:10:19 crc kubenswrapper[4763]: I1205 12:10:19.317884 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 05 12:10:20 crc kubenswrapper[4763]: I1205 12:10:20.319081 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f57aaff6-3e62-49f1-8055-60f0507d95ba" containerName="glance-httpd" containerID="cri-o://24fff5cc2fa368a5eacec4de1ed1e7c9ee48442324cbe5cd1a239a1ce3df48b2" gracePeriod=30 Dec 05 12:10:20 crc kubenswrapper[4763]: I1205 12:10:20.319106 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f57aaff6-3e62-49f1-8055-60f0507d95ba" containerName="glance-log" containerID="cri-o://b95dadcd780efc2c26a68e211c2c22762742bef84110443e03a0fa1ec37f4d81" gracePeriod=30 Dec 05 12:10:20 crc kubenswrapper[4763]: I1205 12:10:20.346801 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=28.346781892 podStartE2EDuration="28.346781892s" podCreationTimestamp="2025-12-05 12:09:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:10:20.336574181 +0000 UTC m=+1304.829288904" watchObservedRunningTime="2025-12-05 12:10:20.346781892 +0000 UTC m=+1304.839496615" Dec 05 12:10:20 crc kubenswrapper[4763]: I1205 12:10:20.510203 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b6c948c7-7tdkh"] Dec 05 12:10:20 crc kubenswrapper[4763]: I1205 12:10:20.679450 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dc5f79d94-t8x4q"] Dec 05 12:10:20 crc kubenswrapper[4763]: I1205 12:10:20.872851 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-76d66464d-r24j6"] Dec 05 12:10:20 crc kubenswrapper[4763]: I1205 12:10:20.900201 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b55c974d9-brgnw"] Dec 05 12:10:21 crc kubenswrapper[4763]: I1205 12:10:21.328196 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b6c948c7-7tdkh" event={"ID":"c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2","Type":"ContainerStarted","Data":"45c8940fcd5d1101e4828a4aae4b850cdf67eab7999ef457e7171fc8a020af40"} Dec 05 12:10:21 crc kubenswrapper[4763]: I1205 12:10:21.332570 4763 generic.go:334] "Generic (PLEG): container finished" podID="f57aaff6-3e62-49f1-8055-60f0507d95ba" containerID="24fff5cc2fa368a5eacec4de1ed1e7c9ee48442324cbe5cd1a239a1ce3df48b2" exitCode=0 Dec 05 12:10:21 crc kubenswrapper[4763]: I1205 12:10:21.332599 4763 generic.go:334] "Generic (PLEG): container finished" podID="f57aaff6-3e62-49f1-8055-60f0507d95ba" containerID="b95dadcd780efc2c26a68e211c2c22762742bef84110443e03a0fa1ec37f4d81" exitCode=143 Dec 05 12:10:21 crc kubenswrapper[4763]: I1205 12:10:21.332643 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"f57aaff6-3e62-49f1-8055-60f0507d95ba","Type":"ContainerDied","Data":"24fff5cc2fa368a5eacec4de1ed1e7c9ee48442324cbe5cd1a239a1ce3df48b2"} Dec 05 12:10:21 crc kubenswrapper[4763]: I1205 12:10:21.332665 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f57aaff6-3e62-49f1-8055-60f0507d95ba","Type":"ContainerDied","Data":"b95dadcd780efc2c26a68e211c2c22762742bef84110443e03a0fa1ec37f4d81"} Dec 05 12:10:21 crc kubenswrapper[4763]: I1205 12:10:21.336898 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1851124e-2722-4628-8e5b-63edb828d64a","Type":"ContainerStarted","Data":"d780b8b44227163e442eeb9d57f8264fdb4fa2dc561c298705507315f57e80f8"} Dec 05 12:10:21 crc kubenswrapper[4763]: I1205 12:10:21.338224 4763 generic.go:334] "Generic (PLEG): container finished" podID="5d08538d-45d6-4f05-81a6-60ecc26dc593" containerID="ad8e2eed26b275208329fff0ef5c26c0ab920119e69cc3f7221aa6ac9f83c297" exitCode=0 Dec 05 12:10:21 crc kubenswrapper[4763]: I1205 12:10:21.338396 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="325fd5a5-7466-4539-9de3-8add5eb6996c" containerName="glance-log" containerID="cri-o://93690a3d1d0f8a59fae3e09fb8e16402baa56f64e6ece52b0e4d0777619c2759" gracePeriod=30 Dec 05 12:10:21 crc kubenswrapper[4763]: I1205 12:10:21.338520 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tg56v" event={"ID":"5d08538d-45d6-4f05-81a6-60ecc26dc593","Type":"ContainerDied","Data":"ad8e2eed26b275208329fff0ef5c26c0ab920119e69cc3f7221aa6ac9f83c297"} Dec 05 12:10:21 crc kubenswrapper[4763]: I1205 12:10:21.338823 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="325fd5a5-7466-4539-9de3-8add5eb6996c" containerName="glance-httpd" containerID="cri-o://d3ef703290f159546db5f4c50ab7c1a241f7d3080bd43947e7c9fef693358b35" gracePeriod=30 Dec 05 12:10:21 crc kubenswrapper[4763]: I1205 12:10:21.377460 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=29.377434861 podStartE2EDuration="29.377434861s" podCreationTimestamp="2025-12-05 12:09:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:10:21.370474074 +0000 UTC m=+1305.863188807" watchObservedRunningTime="2025-12-05 12:10:21.377434861 +0000 UTC m=+1305.870149604" Dec 05 12:10:22 crc kubenswrapper[4763]: I1205 12:10:22.256120 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56798b757f-v7nqq" Dec 05 12:10:22 crc kubenswrapper[4763]: I1205 12:10:22.359603 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56798b757f-v7nqq" event={"ID":"7e2b756b-c7db-44c5-97d8-b906e3d01e21","Type":"ContainerDied","Data":"32e6e800d6f7a7225c3c69b20050d479ac747cfff86bd1e1397eb9fadc6a5f7b"} Dec 05 12:10:22 crc kubenswrapper[4763]: I1205 12:10:22.359651 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56798b757f-v7nqq" Dec 05 12:10:22 crc kubenswrapper[4763]: I1205 12:10:22.359674 4763 scope.go:117] "RemoveContainer" containerID="36e17e345b14156ede7490ff7c0e3b31ae3f63ba6b7da412b0865c94b4a7a620" Dec 05 12:10:22 crc kubenswrapper[4763]: I1205 12:10:22.360321 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e2b756b-c7db-44c5-97d8-b906e3d01e21-config\") pod \"7e2b756b-c7db-44c5-97d8-b906e3d01e21\" (UID: \"7e2b756b-c7db-44c5-97d8-b906e3d01e21\") " Dec 05 12:10:22 crc kubenswrapper[4763]: I1205 12:10:22.360372 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e2b756b-c7db-44c5-97d8-b906e3d01e21-ovsdbserver-sb\") pod \"7e2b756b-c7db-44c5-97d8-b906e3d01e21\" (UID: \"7e2b756b-c7db-44c5-97d8-b906e3d01e21\") " Dec 05 12:10:22 crc kubenswrapper[4763]: I1205 12:10:22.361694 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e2b756b-c7db-44c5-97d8-b906e3d01e21-ovsdbserver-nb\") pod \"7e2b756b-c7db-44c5-97d8-b906e3d01e21\" (UID: \"7e2b756b-c7db-44c5-97d8-b906e3d01e21\") " Dec 05 12:10:22 crc kubenswrapper[4763]: I1205 12:10:22.361779 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pddpf\" (UniqueName: \"kubernetes.io/projected/7e2b756b-c7db-44c5-97d8-b906e3d01e21-kube-api-access-pddpf\") pod \"7e2b756b-c7db-44c5-97d8-b906e3d01e21\" (UID: \"7e2b756b-c7db-44c5-97d8-b906e3d01e21\") " Dec 05 12:10:22 crc kubenswrapper[4763]: I1205 12:10:22.365887 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e2b756b-c7db-44c5-97d8-b906e3d01e21-dns-svc\") pod \"7e2b756b-c7db-44c5-97d8-b906e3d01e21\" (UID: \"7e2b756b-c7db-44c5-97d8-b906e3d01e21\") " Dec 05 12:10:22 crc kubenswrapper[4763]: I1205 12:10:22.381923 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e2b756b-c7db-44c5-97d8-b906e3d01e21-kube-api-access-pddpf" (OuterVolumeSpecName: "kube-api-access-pddpf") pod "7e2b756b-c7db-44c5-97d8-b906e3d01e21" (UID: "7e2b756b-c7db-44c5-97d8-b906e3d01e21"). InnerVolumeSpecName "kube-api-access-pddpf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:10:22 crc kubenswrapper[4763]: I1205 12:10:22.385681 4763 generic.go:334] "Generic (PLEG): container finished" podID="325fd5a5-7466-4539-9de3-8add5eb6996c" containerID="d3ef703290f159546db5f4c50ab7c1a241f7d3080bd43947e7c9fef693358b35" exitCode=0 Dec 05 12:10:22 crc kubenswrapper[4763]: I1205 12:10:22.385715 4763 generic.go:334] "Generic (PLEG): container finished" podID="325fd5a5-7466-4539-9de3-8add5eb6996c" containerID="93690a3d1d0f8a59fae3e09fb8e16402baa56f64e6ece52b0e4d0777619c2759" exitCode=143 Dec 05 12:10:22 crc kubenswrapper[4763]: I1205 12:10:22.385771 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"325fd5a5-7466-4539-9de3-8add5eb6996c","Type":"ContainerDied","Data":"d3ef703290f159546db5f4c50ab7c1a241f7d3080bd43947e7c9fef693358b35"} Dec 05 12:10:22 crc kubenswrapper[4763]: I1205 12:10:22.385797 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"325fd5a5-7466-4539-9de3-8add5eb6996c","Type":"ContainerDied","Data":"93690a3d1d0f8a59fae3e09fb8e16402baa56f64e6ece52b0e4d0777619c2759"} Dec 05 12:10:22 crc kubenswrapper[4763]: I1205 12:10:22.388634 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b55c974d9-brgnw" event={"ID":"6d6c980e-688d-41b3-a7ad-0061b07b9494","Type":"ContainerStarted","Data":"7c9d644ea1f79097c1c408fe3b54548480f48a117b432ab1b8ddc7b5edf55c79"} Dec 05 12:10:22 crc kubenswrapper[4763]: I1205 12:10:22.393221 4763 generic.go:334] "Generic (PLEG): container finished" podID="4046982d-ad27-468f-897a-167692d9ae49" containerID="09c82ff3921155c0ca1d533fd930c760c4a5164fa414656cdc21fd21d92026dd" exitCode=0 Dec 05 12:10:22 crc kubenswrapper[4763]: I1205 12:10:22.393304 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-wcz88" event={"ID":"4046982d-ad27-468f-897a-167692d9ae49","Type":"ContainerDied","Data":"09c82ff3921155c0ca1d533fd930c760c4a5164fa414656cdc21fd21d92026dd"} Dec 05 12:10:22 crc kubenswrapper[4763]: I1205 12:10:22.400728 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dc5f79d94-t8x4q" event={"ID":"fcd7679a-a9b6-4721-8cb1-07f7c801bb3f","Type":"ContainerStarted","Data":"7fcd4abc441e2d440af48727aa27282f7a556dcfa12259cafce378f50de07249"} Dec 05 12:10:22 crc kubenswrapper[4763]: I1205 12:10:22.450531 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e2b756b-c7db-44c5-97d8-b906e3d01e21-config" (OuterVolumeSpecName: "config") pod "7e2b756b-c7db-44c5-97d8-b906e3d01e21" (UID: "7e2b756b-c7db-44c5-97d8-b906e3d01e21"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:10:22 crc kubenswrapper[4763]: I1205 12:10:22.450958 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e2b756b-c7db-44c5-97d8-b906e3d01e21-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7e2b756b-c7db-44c5-97d8-b906e3d01e21" (UID: "7e2b756b-c7db-44c5-97d8-b906e3d01e21"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:10:22 crc kubenswrapper[4763]: I1205 12:10:22.454379 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e2b756b-c7db-44c5-97d8-b906e3d01e21-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7e2b756b-c7db-44c5-97d8-b906e3d01e21" (UID: "7e2b756b-c7db-44c5-97d8-b906e3d01e21"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:10:22 crc kubenswrapper[4763]: I1205 12:10:22.468575 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e2b756b-c7db-44c5-97d8-b906e3d01e21-config\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:22 crc kubenswrapper[4763]: I1205 12:10:22.468615 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e2b756b-c7db-44c5-97d8-b906e3d01e21-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:22 crc kubenswrapper[4763]: I1205 12:10:22.468627 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pddpf\" (UniqueName: \"kubernetes.io/projected/7e2b756b-c7db-44c5-97d8-b906e3d01e21-kube-api-access-pddpf\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:22 crc kubenswrapper[4763]: I1205 12:10:22.468637 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e2b756b-c7db-44c5-97d8-b906e3d01e21-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:22 crc kubenswrapper[4763]: I1205 12:10:22.475709 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e2b756b-c7db-44c5-97d8-b906e3d01e21-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7e2b756b-c7db-44c5-97d8-b906e3d01e21" (UID: "7e2b756b-c7db-44c5-97d8-b906e3d01e21"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:10:22 crc kubenswrapper[4763]: I1205 12:10:22.569893 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e2b756b-c7db-44c5-97d8-b906e3d01e21-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:22 crc kubenswrapper[4763]: I1205 12:10:22.736328 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56798b757f-v7nqq"] Dec 05 12:10:22 crc kubenswrapper[4763]: I1205 12:10:22.748448 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56798b757f-v7nqq"] Dec 05 12:10:23 crc kubenswrapper[4763]: I1205 12:10:23.586109 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 12:10:23 crc kubenswrapper[4763]: I1205 12:10:23.586161 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 12:10:23 crc kubenswrapper[4763]: I1205 12:10:23.738144 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 12:10:23 crc kubenswrapper[4763]: I1205 12:10:23.738471 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 12:10:23 crc kubenswrapper[4763]: I1205 12:10:23.799495 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e2b756b-c7db-44c5-97d8-b906e3d01e21" path="/var/lib/kubelet/pods/7e2b756b-c7db-44c5-97d8-b906e3d01e21/volumes" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.112662 4763 scope.go:117] "RemoveContainer" containerID="c4ebec2fb94c72e50f161f3b8221f0a8ddd831d2e0ad785f50e520df04a3bbd8" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.262975 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-tg56v" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.291260 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-wcz88" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.291746 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.298164 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jhbq\" (UniqueName: \"kubernetes.io/projected/4046982d-ad27-468f-897a-167692d9ae49-kube-api-access-5jhbq\") pod \"4046982d-ad27-468f-897a-167692d9ae49\" (UID: \"4046982d-ad27-468f-897a-167692d9ae49\") " Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.298210 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5d08538d-45d6-4f05-81a6-60ecc26dc593-fernet-keys\") pod \"5d08538d-45d6-4f05-81a6-60ecc26dc593\" (UID: \"5d08538d-45d6-4f05-81a6-60ecc26dc593\") " Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.298247 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"325fd5a5-7466-4539-9de3-8add5eb6996c\" (UID: \"325fd5a5-7466-4539-9de3-8add5eb6996c\") " Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.298285 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/325fd5a5-7466-4539-9de3-8add5eb6996c-scripts\") pod \"325fd5a5-7466-4539-9de3-8add5eb6996c\" (UID: \"325fd5a5-7466-4539-9de3-8add5eb6996c\") " Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.298335 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d08538d-45d6-4f05-81a6-60ecc26dc593-combined-ca-bundle\") pod \"5d08538d-45d6-4f05-81a6-60ecc26dc593\" (UID: \"5d08538d-45d6-4f05-81a6-60ecc26dc593\") " Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.298367 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4046982d-ad27-468f-897a-167692d9ae49-config-data\") pod \"4046982d-ad27-468f-897a-167692d9ae49\" (UID: \"4046982d-ad27-468f-897a-167692d9ae49\") " Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.298390 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/325fd5a5-7466-4539-9de3-8add5eb6996c-combined-ca-bundle\") pod \"325fd5a5-7466-4539-9de3-8add5eb6996c\" (UID: \"325fd5a5-7466-4539-9de3-8add5eb6996c\") " Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.298419 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d08538d-45d6-4f05-81a6-60ecc26dc593-config-data\") pod \"5d08538d-45d6-4f05-81a6-60ecc26dc593\" (UID: \"5d08538d-45d6-4f05-81a6-60ecc26dc593\") " Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.298473 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6x4t\" (UniqueName: \"kubernetes.io/projected/5d08538d-45d6-4f05-81a6-60ecc26dc593-kube-api-access-s6x4t\") pod \"5d08538d-45d6-4f05-81a6-60ecc26dc593\" (UID: \"5d08538d-45d6-4f05-81a6-60ecc26dc593\") " Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.298495 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4046982d-ad27-468f-897a-167692d9ae49-combined-ca-bundle\") pod \"4046982d-ad27-468f-897a-167692d9ae49\" (UID: 
\"4046982d-ad27-468f-897a-167692d9ae49\") " Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.298534 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/325fd5a5-7466-4539-9de3-8add5eb6996c-logs\") pod \"325fd5a5-7466-4539-9de3-8add5eb6996c\" (UID: \"325fd5a5-7466-4539-9de3-8add5eb6996c\") " Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.298584 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4046982d-ad27-468f-897a-167692d9ae49-db-sync-config-data\") pod \"4046982d-ad27-468f-897a-167692d9ae49\" (UID: \"4046982d-ad27-468f-897a-167692d9ae49\") " Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.298613 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5d08538d-45d6-4f05-81a6-60ecc26dc593-credential-keys\") pod \"5d08538d-45d6-4f05-81a6-60ecc26dc593\" (UID: \"5d08538d-45d6-4f05-81a6-60ecc26dc593\") " Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.298631 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/325fd5a5-7466-4539-9de3-8add5eb6996c-httpd-run\") pod \"325fd5a5-7466-4539-9de3-8add5eb6996c\" (UID: \"325fd5a5-7466-4539-9de3-8add5eb6996c\") " Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.298698 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mq7zm\" (UniqueName: \"kubernetes.io/projected/325fd5a5-7466-4539-9de3-8add5eb6996c-kube-api-access-mq7zm\") pod \"325fd5a5-7466-4539-9de3-8add5eb6996c\" (UID: \"325fd5a5-7466-4539-9de3-8add5eb6996c\") " Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.298723 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/325fd5a5-7466-4539-9de3-8add5eb6996c-config-data\") pod \"325fd5a5-7466-4539-9de3-8add5eb6996c\" (UID: \"325fd5a5-7466-4539-9de3-8add5eb6996c\") " Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.298789 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d08538d-45d6-4f05-81a6-60ecc26dc593-scripts\") pod \"5d08538d-45d6-4f05-81a6-60ecc26dc593\" (UID: \"5d08538d-45d6-4f05-81a6-60ecc26dc593\") " Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.300187 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/325fd5a5-7466-4539-9de3-8add5eb6996c-logs" (OuterVolumeSpecName: "logs") pod "325fd5a5-7466-4539-9de3-8add5eb6996c" (UID: "325fd5a5-7466-4539-9de3-8add5eb6996c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.303126 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/325fd5a5-7466-4539-9de3-8add5eb6996c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "325fd5a5-7466-4539-9de3-8add5eb6996c" (UID: "325fd5a5-7466-4539-9de3-8add5eb6996c"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.308468 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "325fd5a5-7466-4539-9de3-8add5eb6996c" (UID: "325fd5a5-7466-4539-9de3-8add5eb6996c"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.308465 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d08538d-45d6-4f05-81a6-60ecc26dc593-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "5d08538d-45d6-4f05-81a6-60ecc26dc593" (UID: "5d08538d-45d6-4f05-81a6-60ecc26dc593"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.309376 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4046982d-ad27-468f-897a-167692d9ae49-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4046982d-ad27-468f-897a-167692d9ae49" (UID: "4046982d-ad27-468f-897a-167692d9ae49"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.309810 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4046982d-ad27-468f-897a-167692d9ae49-kube-api-access-5jhbq" (OuterVolumeSpecName: "kube-api-access-5jhbq") pod "4046982d-ad27-468f-897a-167692d9ae49" (UID: "4046982d-ad27-468f-897a-167692d9ae49"). InnerVolumeSpecName "kube-api-access-5jhbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.311885 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d08538d-45d6-4f05-81a6-60ecc26dc593-scripts" (OuterVolumeSpecName: "scripts") pod "5d08538d-45d6-4f05-81a6-60ecc26dc593" (UID: "5d08538d-45d6-4f05-81a6-60ecc26dc593"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.313024 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d08538d-45d6-4f05-81a6-60ecc26dc593-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5d08538d-45d6-4f05-81a6-60ecc26dc593" (UID: "5d08538d-45d6-4f05-81a6-60ecc26dc593"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.324351 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/325fd5a5-7466-4539-9de3-8add5eb6996c-kube-api-access-mq7zm" (OuterVolumeSpecName: "kube-api-access-mq7zm") pod "325fd5a5-7466-4539-9de3-8add5eb6996c" (UID: "325fd5a5-7466-4539-9de3-8add5eb6996c"). InnerVolumeSpecName "kube-api-access-mq7zm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.325056 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/325fd5a5-7466-4539-9de3-8add5eb6996c-scripts" (OuterVolumeSpecName: "scripts") pod "325fd5a5-7466-4539-9de3-8add5eb6996c" (UID: "325fd5a5-7466-4539-9de3-8add5eb6996c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.367023 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d08538d-45d6-4f05-81a6-60ecc26dc593-kube-api-access-s6x4t" (OuterVolumeSpecName: "kube-api-access-s6x4t") pod "5d08538d-45d6-4f05-81a6-60ecc26dc593" (UID: "5d08538d-45d6-4f05-81a6-60ecc26dc593"). InnerVolumeSpecName "kube-api-access-s6x4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.386988 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d08538d-45d6-4f05-81a6-60ecc26dc593-config-data" (OuterVolumeSpecName: "config-data") pod "5d08538d-45d6-4f05-81a6-60ecc26dc593" (UID: "5d08538d-45d6-4f05-81a6-60ecc26dc593"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.419297 4763 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5d08538d-45d6-4f05-81a6-60ecc26dc593-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.419415 4763 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/325fd5a5-7466-4539-9de3-8add5eb6996c-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.419484 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mq7zm\" (UniqueName: \"kubernetes.io/projected/325fd5a5-7466-4539-9de3-8add5eb6996c-kube-api-access-mq7zm\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.419542 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d08538d-45d6-4f05-81a6-60ecc26dc593-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.419595 4763 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5d08538d-45d6-4f05-81a6-60ecc26dc593-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.419646 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jhbq\" (UniqueName: \"kubernetes.io/projected/4046982d-ad27-468f-897a-167692d9ae49-kube-api-access-5jhbq\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.422988 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.423082 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/325fd5a5-7466-4539-9de3-8add5eb6996c-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.423156 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d08538d-45d6-4f05-81a6-60ecc26dc593-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.423212 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6x4t\" (UniqueName: \"kubernetes.io/projected/5d08538d-45d6-4f05-81a6-60ecc26dc593-kube-api-access-s6x4t\") 
on node \"crc\" DevicePath \"\"" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.423265 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/325fd5a5-7466-4539-9de3-8add5eb6996c-logs\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.423324 4763 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4046982d-ad27-468f-897a-167692d9ae49-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.428825 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d08538d-45d6-4f05-81a6-60ecc26dc593-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d08538d-45d6-4f05-81a6-60ecc26dc593" (UID: "5d08538d-45d6-4f05-81a6-60ecc26dc593"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.430621 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76d66464d-r24j6" event={"ID":"7e5e41f7-ee3c-4587-bae0-5716c12c84b6","Type":"ContainerStarted","Data":"c63fae021786f6d23d445e75064600df0fbf25717c9c3f5a05bc993cf7fea5a7"} Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.440910 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4046982d-ad27-468f-897a-167692d9ae49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4046982d-ad27-468f-897a-167692d9ae49" (UID: "4046982d-ad27-468f-897a-167692d9ae49"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.441168 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-tg56v" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.441198 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tg56v" event={"ID":"5d08538d-45d6-4f05-81a6-60ecc26dc593","Type":"ContainerDied","Data":"3ffef12a63f9fedd48e91798584c2fe3fc6a32b21afe8b182b63bfa384e5b844"} Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.441237 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ffef12a63f9fedd48e91798584c2fe3fc6a32b21afe8b182b63bfa384e5b844" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.450040 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-wcz88" event={"ID":"4046982d-ad27-468f-897a-167692d9ae49","Type":"ContainerDied","Data":"ed304d3164302c91cfe0a3ead8a3e42a7ed78a473c843ff56127a61d35194f9d"} Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.450085 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed304d3164302c91cfe0a3ead8a3e42a7ed78a473c843ff56127a61d35194f9d" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.450173 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-wcz88" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.460956 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/325fd5a5-7466-4539-9de3-8add5eb6996c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "325fd5a5-7466-4539-9de3-8add5eb6996c" (UID: "325fd5a5-7466-4539-9de3-8add5eb6996c"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.461839 4763 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.463774 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"325fd5a5-7466-4539-9de3-8add5eb6996c","Type":"ContainerDied","Data":"dd1c40dab7f9d51de14c9f235ba6e4fcf020586a788d7d52c06155606cb66f09"} Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.463868 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.489997 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/325fd5a5-7466-4539-9de3-8add5eb6996c-config-data" (OuterVolumeSpecName: "config-data") pod "325fd5a5-7466-4539-9de3-8add5eb6996c" (UID: "325fd5a5-7466-4539-9de3-8add5eb6996c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.490756 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4046982d-ad27-468f-897a-167692d9ae49-config-data" (OuterVolumeSpecName: "config-data") pod "4046982d-ad27-468f-897a-167692d9ae49" (UID: "4046982d-ad27-468f-897a-167692d9ae49"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.526009 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/325fd5a5-7466-4539-9de3-8add5eb6996c-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.526061 4763 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.526071 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d08538d-45d6-4f05-81a6-60ecc26dc593-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.526082 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4046982d-ad27-468f-897a-167692d9ae49-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.526092 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/325fd5a5-7466-4539-9de3-8add5eb6996c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.526102 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4046982d-ad27-468f-897a-167692d9ae49-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.739181 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.816667 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Dec 05 12:10:24 crc kubenswrapper[4763]: E1205 12:10:24.824522 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4046982d-ad27-468f-897a-167692d9ae49" containerName="watcher-db-sync" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.824586 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4046982d-ad27-468f-897a-167692d9ae49" containerName="watcher-db-sync" Dec 05 12:10:24 crc kubenswrapper[4763]: E1205 12:10:24.824610 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e2b756b-c7db-44c5-97d8-b906e3d01e21" containerName="dnsmasq-dns" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.824619 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e2b756b-c7db-44c5-97d8-b906e3d01e21" containerName="dnsmasq-dns" Dec 05 12:10:24 crc kubenswrapper[4763]: E1205 12:10:24.824633 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="325fd5a5-7466-4539-9de3-8add5eb6996c" containerName="glance-log" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.824640 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="325fd5a5-7466-4539-9de3-8add5eb6996c" containerName="glance-log" Dec 05 12:10:24 crc kubenswrapper[4763]: E1205 12:10:24.824659 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e2b756b-c7db-44c5-97d8-b906e3d01e21" containerName="init" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.824666 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e2b756b-c7db-44c5-97d8-b906e3d01e21" containerName="init" Dec 05 12:10:24 crc kubenswrapper[4763]: E1205 12:10:24.824677 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="325fd5a5-7466-4539-9de3-8add5eb6996c" containerName="glance-httpd" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.824684 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="325fd5a5-7466-4539-9de3-8add5eb6996c" containerName="glance-httpd" Dec 05 12:10:24 crc kubenswrapper[4763]: E1205 12:10:24.824711 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d08538d-45d6-4f05-81a6-60ecc26dc593" containerName="keystone-bootstrap" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.824719 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d08538d-45d6-4f05-81a6-60ecc26dc593" containerName="keystone-bootstrap" Dec 05 12:10:24 crc kubenswrapper[4763]: E1205 12:10:24.824734 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f57aaff6-3e62-49f1-8055-60f0507d95ba" containerName="glance-httpd" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.824742 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f57aaff6-3e62-49f1-8055-60f0507d95ba" containerName="glance-httpd" Dec 05 12:10:24 crc kubenswrapper[4763]: E1205 12:10:24.824754 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f57aaff6-3e62-49f1-8055-60f0507d95ba" containerName="glance-log" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.824798 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f57aaff6-3e62-49f1-8055-60f0507d95ba" containerName="glance-log" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.825073 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e2b756b-c7db-44c5-97d8-b906e3d01e21" containerName="dnsmasq-dns" Dec 05 12:10:24 
Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.825087 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f57aaff6-3e62-49f1-8055-60f0507d95ba" containerName="glance-log"
Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.825107 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f57aaff6-3e62-49f1-8055-60f0507d95ba" containerName="glance-httpd"
Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.825121 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="4046982d-ad27-468f-897a-167692d9ae49" containerName="watcher-db-sync"
Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.825130 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d08538d-45d6-4f05-81a6-60ecc26dc593" containerName="keystone-bootstrap"
Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.825147 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="325fd5a5-7466-4539-9de3-8add5eb6996c" containerName="glance-httpd"
Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.825162 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="325fd5a5-7466-4539-9de3-8add5eb6996c" containerName="glance-log"
Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.835990 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"f57aaff6-3e62-49f1-8055-60f0507d95ba\" (UID: \"f57aaff6-3e62-49f1-8055-60f0507d95ba\") "
Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.836059 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f57aaff6-3e62-49f1-8055-60f0507d95ba-logs\") pod \"f57aaff6-3e62-49f1-8055-60f0507d95ba\" (UID: \"f57aaff6-3e62-49f1-8055-60f0507d95ba\") "
Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.836095 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqlxv\" (UniqueName: \"kubernetes.io/projected/f57aaff6-3e62-49f1-8055-60f0507d95ba-kube-api-access-zqlxv\") pod \"f57aaff6-3e62-49f1-8055-60f0507d95ba\" (UID: \"f57aaff6-3e62-49f1-8055-60f0507d95ba\") "
Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.836145 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f57aaff6-3e62-49f1-8055-60f0507d95ba-scripts\") pod \"f57aaff6-3e62-49f1-8055-60f0507d95ba\" (UID: \"f57aaff6-3e62-49f1-8055-60f0507d95ba\") "
Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.836168 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f57aaff6-3e62-49f1-8055-60f0507d95ba-config-data\") pod \"f57aaff6-3e62-49f1-8055-60f0507d95ba\" (UID: \"f57aaff6-3e62-49f1-8055-60f0507d95ba\") "
Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.836196 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f57aaff6-3e62-49f1-8055-60f0507d95ba-httpd-run\") pod \"f57aaff6-3e62-49f1-8055-60f0507d95ba\" (UID: \"f57aaff6-3e62-49f1-8055-60f0507d95ba\") "
Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.836217 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f57aaff6-3e62-49f1-8055-60f0507d95ba-combined-ca-bundle\") pod \"f57aaff6-3e62-49f1-8055-60f0507d95ba\" (UID:
\"f57aaff6-3e62-49f1-8055-60f0507d95ba\") " Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.844536 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f57aaff6-3e62-49f1-8055-60f0507d95ba-logs" (OuterVolumeSpecName: "logs") pod "f57aaff6-3e62-49f1-8055-60f0507d95ba" (UID: "f57aaff6-3e62-49f1-8055-60f0507d95ba"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.848605 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f57aaff6-3e62-49f1-8055-60f0507d95ba-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f57aaff6-3e62-49f1-8055-60f0507d95ba" (UID: "f57aaff6-3e62-49f1-8055-60f0507d95ba"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.858960 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "f57aaff6-3e62-49f1-8055-60f0507d95ba" (UID: "f57aaff6-3e62-49f1-8055-60f0507d95ba"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.864077 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.864290 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.879942 4763 scope.go:117] "RemoveContainer" containerID="d3ef703290f159546db5f4c50ab7c1a241f7d3080bd43947e7c9fef693358b35" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.889428 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-zcxcw" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.889635 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.890049 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f57aaff6-3e62-49f1-8055-60f0507d95ba-kube-api-access-zqlxv" (OuterVolumeSpecName: "kube-api-access-zqlxv") pod "f57aaff6-3e62-49f1-8055-60f0507d95ba" (UID: "f57aaff6-3e62-49f1-8055-60f0507d95ba"). InnerVolumeSpecName "kube-api-access-zqlxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.891961 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f57aaff6-3e62-49f1-8055-60f0507d95ba-scripts" (OuterVolumeSpecName: "scripts") pod "f57aaff6-3e62-49f1-8055-60f0507d95ba" (UID: "f57aaff6-3e62-49f1-8055-60f0507d95ba"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.944954 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76thp\" (UniqueName: \"kubernetes.io/projected/aff85a7c-5f5f-4c46-97c7-083ff89eb14c-kube-api-access-76thp\") pod \"watcher-api-0\" (UID: \"aff85a7c-5f5f-4c46-97c7-083ff89eb14c\") " pod="openstack/watcher-api-0" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.945375 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/aff85a7c-5f5f-4c46-97c7-083ff89eb14c-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"aff85a7c-5f5f-4c46-97c7-083ff89eb14c\") " pod="openstack/watcher-api-0" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.945421 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aff85a7c-5f5f-4c46-97c7-083ff89eb14c-logs\") pod \"watcher-api-0\" (UID: \"aff85a7c-5f5f-4c46-97c7-083ff89eb14c\") " pod="openstack/watcher-api-0" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.945500 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aff85a7c-5f5f-4c46-97c7-083ff89eb14c-config-data\") pod \"watcher-api-0\" (UID: \"aff85a7c-5f5f-4c46-97c7-083ff89eb14c\") " pod="openstack/watcher-api-0" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.945650 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aff85a7c-5f5f-4c46-97c7-083ff89eb14c-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"aff85a7c-5f5f-4c46-97c7-083ff89eb14c\") " pod="openstack/watcher-api-0" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.945910 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.945924 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f57aaff6-3e62-49f1-8055-60f0507d95ba-logs\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.945933 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqlxv\" (UniqueName: \"kubernetes.io/projected/f57aaff6-3e62-49f1-8055-60f0507d95ba-kube-api-access-zqlxv\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.945946 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f57aaff6-3e62-49f1-8055-60f0507d95ba-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.945955 4763 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f57aaff6-3e62-49f1-8055-60f0507d95ba-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.954632 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.963971 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.969823 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.974979 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.991918 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f57aaff6-3e62-49f1-8055-60f0507d95ba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f57aaff6-3e62-49f1-8055-60f0507d95ba" (UID: "f57aaff6-3e62-49f1-8055-60f0507d95ba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:24 crc kubenswrapper[4763]: I1205 12:10:24.997717 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.002029 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.005198 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.007701 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.011939 4763 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.023999 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.033720 4763 scope.go:117] "RemoveContainer" containerID="93690a3d1d0f8a59fae3e09fb8e16402baa56f64e6ece52b0e4d0777619c2759" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.035859 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.047281 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c-logs\") pod \"watcher-decision-engine-0\" (UID: \"423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c\") " pod="openstack/watcher-decision-engine-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.047450 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c\") " pod="openstack/watcher-decision-engine-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.047543 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c\") " pod="openstack/watcher-decision-engine-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.047664 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aff85a7c-5f5f-4c46-97c7-083ff89eb14c-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"aff85a7c-5f5f-4c46-97c7-083ff89eb14c\") " pod="openstack/watcher-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.047772 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bdc880b-2a69-4a8c-920f-6b0b04c7ecfe-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"8bdc880b-2a69-4a8c-920f-6b0b04c7ecfe\") " pod="openstack/watcher-applier-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.047883 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bdc880b-2a69-4a8c-920f-6b0b04c7ecfe-config-data\") pod \"watcher-applier-0\" (UID: \"8bdc880b-2a69-4a8c-920f-6b0b04c7ecfe\") " pod="openstack/watcher-applier-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.048058 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76thp\" (UniqueName: \"kubernetes.io/projected/aff85a7c-5f5f-4c46-97c7-083ff89eb14c-kube-api-access-76thp\") pod \"watcher-api-0\" (UID: \"aff85a7c-5f5f-4c46-97c7-083ff89eb14c\") " pod="openstack/watcher-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.048168 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj7mh\" (UniqueName: \"kubernetes.io/projected/423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c-kube-api-access-rj7mh\") pod \"watcher-decision-engine-0\" (UID: \"423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c\") " pod="openstack/watcher-decision-engine-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.048318 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c-config-data\") pod \"watcher-decision-engine-0\" (UID: \"423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c\") " pod="openstack/watcher-decision-engine-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.048379 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8bdc880b-2a69-4a8c-920f-6b0b04c7ecfe-logs\") pod \"watcher-applier-0\" (UID: \"8bdc880b-2a69-4a8c-920f-6b0b04c7ecfe\") " pod="openstack/watcher-applier-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.048410 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/aff85a7c-5f5f-4c46-97c7-083ff89eb14c-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"aff85a7c-5f5f-4c46-97c7-083ff89eb14c\") " pod="openstack/watcher-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.048446 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aff85a7c-5f5f-4c46-97c7-083ff89eb14c-logs\") pod \"watcher-api-0\" (UID: \"aff85a7c-5f5f-4c46-97c7-083ff89eb14c\") " pod="openstack/watcher-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.048494 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bblhh\" (UniqueName: 
\"kubernetes.io/projected/8bdc880b-2a69-4a8c-920f-6b0b04c7ecfe-kube-api-access-bblhh\") pod \"watcher-applier-0\" (UID: \"8bdc880b-2a69-4a8c-920f-6b0b04c7ecfe\") " pod="openstack/watcher-applier-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.048523 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aff85a7c-5f5f-4c46-97c7-083ff89eb14c-config-data\") pod \"watcher-api-0\" (UID: \"aff85a7c-5f5f-4c46-97c7-083ff89eb14c\") " pod="openstack/watcher-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.048635 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f57aaff6-3e62-49f1-8055-60f0507d95ba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.048661 4763 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.049079 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aff85a7c-5f5f-4c46-97c7-083ff89eb14c-logs\") pod \"watcher-api-0\" (UID: \"aff85a7c-5f5f-4c46-97c7-083ff89eb14c\") " pod="openstack/watcher-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.049169 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.051172 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.054785 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.055174 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/aff85a7c-5f5f-4c46-97c7-083ff89eb14c-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"aff85a7c-5f5f-4c46-97c7-083ff89eb14c\") " pod="openstack/watcher-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.055364 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.059778 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aff85a7c-5f5f-4c46-97c7-083ff89eb14c-config-data\") pod \"watcher-api-0\" (UID: \"aff85a7c-5f5f-4c46-97c7-083ff89eb14c\") " pod="openstack/watcher-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.061349 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aff85a7c-5f5f-4c46-97c7-083ff89eb14c-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"aff85a7c-5f5f-4c46-97c7-083ff89eb14c\") " pod="openstack/watcher-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.077139 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.078247 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76thp\" (UniqueName: 
\"kubernetes.io/projected/aff85a7c-5f5f-4c46-97c7-083ff89eb14c-kube-api-access-76thp\") pod \"watcher-api-0\" (UID: \"aff85a7c-5f5f-4c46-97c7-083ff89eb14c\") " pod="openstack/watcher-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.149686 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txmwf\" (UniqueName: \"kubernetes.io/projected/e680db34-6f7c-4e72-8015-368c51bb34b0-kube-api-access-txmwf\") pod \"glance-default-internal-api-0\" (UID: \"e680db34-6f7c-4e72-8015-368c51bb34b0\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.149729 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"e680db34-6f7c-4e72-8015-368c51bb34b0\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.149752 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e680db34-6f7c-4e72-8015-368c51bb34b0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e680db34-6f7c-4e72-8015-368c51bb34b0\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.149793 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e680db34-6f7c-4e72-8015-368c51bb34b0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e680db34-6f7c-4e72-8015-368c51bb34b0\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.149817 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rj7mh\" (UniqueName: \"kubernetes.io/projected/423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c-kube-api-access-rj7mh\") pod \"watcher-decision-engine-0\" (UID: \"423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c\") " pod="openstack/watcher-decision-engine-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.149832 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e680db34-6f7c-4e72-8015-368c51bb34b0-logs\") pod \"glance-default-internal-api-0\" (UID: \"e680db34-6f7c-4e72-8015-368c51bb34b0\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.149857 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e680db34-6f7c-4e72-8015-368c51bb34b0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e680db34-6f7c-4e72-8015-368c51bb34b0\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.149881 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e680db34-6f7c-4e72-8015-368c51bb34b0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e680db34-6f7c-4e72-8015-368c51bb34b0\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.149909 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c-config-data\") pod \"watcher-decision-engine-0\" (UID: \"423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c\") " pod="openstack/watcher-decision-engine-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.149942 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8bdc880b-2a69-4a8c-920f-6b0b04c7ecfe-logs\") pod \"watcher-applier-0\" (UID: \"8bdc880b-2a69-4a8c-920f-6b0b04c7ecfe\") " pod="openstack/watcher-applier-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.149983 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bblhh\" (UniqueName: \"kubernetes.io/projected/8bdc880b-2a69-4a8c-920f-6b0b04c7ecfe-kube-api-access-bblhh\") pod \"watcher-applier-0\" (UID: \"8bdc880b-2a69-4a8c-920f-6b0b04c7ecfe\") " pod="openstack/watcher-applier-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.150005 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e680db34-6f7c-4e72-8015-368c51bb34b0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e680db34-6f7c-4e72-8015-368c51bb34b0\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.150040 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c-logs\") pod \"watcher-decision-engine-0\" (UID: \"423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c\") " pod="openstack/watcher-decision-engine-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.150062 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c\") " pod="openstack/watcher-decision-engine-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.150083 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c\") " pod="openstack/watcher-decision-engine-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.150105 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bdc880b-2a69-4a8c-920f-6b0b04c7ecfe-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"8bdc880b-2a69-4a8c-920f-6b0b04c7ecfe\") " pod="openstack/watcher-applier-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.150134 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bdc880b-2a69-4a8c-920f-6b0b04c7ecfe-config-data\") pod \"watcher-applier-0\" (UID: \"8bdc880b-2a69-4a8c-920f-6b0b04c7ecfe\") " pod="openstack/watcher-applier-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.153544 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bdc880b-2a69-4a8c-920f-6b0b04c7ecfe-config-data\") pod \"watcher-applier-0\" (UID: \"8bdc880b-2a69-4a8c-920f-6b0b04c7ecfe\") " 
pod="openstack/watcher-applier-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.153825 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c-logs\") pod \"watcher-decision-engine-0\" (UID: \"423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c\") " pod="openstack/watcher-decision-engine-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.157112 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c\") " pod="openstack/watcher-decision-engine-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.157313 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c-config-data\") pod \"watcher-decision-engine-0\" (UID: \"423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c\") " pod="openstack/watcher-decision-engine-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.157560 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8bdc880b-2a69-4a8c-920f-6b0b04c7ecfe-logs\") pod \"watcher-applier-0\" (UID: \"8bdc880b-2a69-4a8c-920f-6b0b04c7ecfe\") " pod="openstack/watcher-applier-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.161958 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bdc880b-2a69-4a8c-920f-6b0b04c7ecfe-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"8bdc880b-2a69-4a8c-920f-6b0b04c7ecfe\") " pod="openstack/watcher-applier-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.162268 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c\") " pod="openstack/watcher-decision-engine-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.171244 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj7mh\" (UniqueName: \"kubernetes.io/projected/423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c-kube-api-access-rj7mh\") pod \"watcher-decision-engine-0\" (UID: \"423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c\") " pod="openstack/watcher-decision-engine-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.171521 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f57aaff6-3e62-49f1-8055-60f0507d95ba-config-data" (OuterVolumeSpecName: "config-data") pod "f57aaff6-3e62-49f1-8055-60f0507d95ba" (UID: "f57aaff6-3e62-49f1-8055-60f0507d95ba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.183963 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bblhh\" (UniqueName: \"kubernetes.io/projected/8bdc880b-2a69-4a8c-920f-6b0b04c7ecfe-kube-api-access-bblhh\") pod \"watcher-applier-0\" (UID: \"8bdc880b-2a69-4a8c-920f-6b0b04c7ecfe\") " pod="openstack/watcher-applier-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.216422 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.251909 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e680db34-6f7c-4e72-8015-368c51bb34b0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e680db34-6f7c-4e72-8015-368c51bb34b0\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.252128 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e680db34-6f7c-4e72-8015-368c51bb34b0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e680db34-6f7c-4e72-8015-368c51bb34b0\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.252247 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e680db34-6f7c-4e72-8015-368c51bb34b0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e680db34-6f7c-4e72-8015-368c51bb34b0\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.252496 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txmwf\" (UniqueName: \"kubernetes.io/projected/e680db34-6f7c-4e72-8015-368c51bb34b0-kube-api-access-txmwf\") pod \"glance-default-internal-api-0\" (UID: \"e680db34-6f7c-4e72-8015-368c51bb34b0\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.252560 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"e680db34-6f7c-4e72-8015-368c51bb34b0\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.252592 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e680db34-6f7c-4e72-8015-368c51bb34b0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e680db34-6f7c-4e72-8015-368c51bb34b0\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.252654 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e680db34-6f7c-4e72-8015-368c51bb34b0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e680db34-6f7c-4e72-8015-368c51bb34b0\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.253048 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e680db34-6f7c-4e72-8015-368c51bb34b0-logs\") pod \"glance-default-internal-api-0\" (UID: \"e680db34-6f7c-4e72-8015-368c51bb34b0\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.253171 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f57aaff6-3e62-49f1-8055-60f0507d95ba-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.254682 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"e680db34-6f7c-4e72-8015-368c51bb34b0\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.258945 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e680db34-6f7c-4e72-8015-368c51bb34b0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e680db34-6f7c-4e72-8015-368c51bb34b0\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.261368 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e680db34-6f7c-4e72-8015-368c51bb34b0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e680db34-6f7c-4e72-8015-368c51bb34b0\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.261993 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e680db34-6f7c-4e72-8015-368c51bb34b0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e680db34-6f7c-4e72-8015-368c51bb34b0\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.262606 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e680db34-6f7c-4e72-8015-368c51bb34b0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e680db34-6f7c-4e72-8015-368c51bb34b0\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.263704 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e680db34-6f7c-4e72-8015-368c51bb34b0-logs\") pod \"glance-default-internal-api-0\" (UID: \"e680db34-6f7c-4e72-8015-368c51bb34b0\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.266571 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e680db34-6f7c-4e72-8015-368c51bb34b0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e680db34-6f7c-4e72-8015-368c51bb34b0\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.290224 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txmwf\" (UniqueName: \"kubernetes.io/projected/e680db34-6f7c-4e72-8015-368c51bb34b0-kube-api-access-txmwf\") pod \"glance-default-internal-api-0\" (UID: \"e680db34-6f7c-4e72-8015-368c51bb34b0\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.290708 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.335285 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.363208 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"e680db34-6f7c-4e72-8015-368c51bb34b0\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.435890 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7fc67b9475-mqldq"] Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.437462 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7fc67b9475-mqldq" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.444396 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.444603 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.444711 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.445375 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.445392 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7fc67b9475-mqldq"] Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.445426 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.445375 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-gw888" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.467378 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d728472-3cda-480b-b5dc-065969434f7d-scripts\") pod \"keystone-7fc67b9475-mqldq\" (UID: \"2d728472-3cda-480b-b5dc-065969434f7d\") " pod="openstack/keystone-7fc67b9475-mqldq" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.467457 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d728472-3cda-480b-b5dc-065969434f7d-config-data\") pod \"keystone-7fc67b9475-mqldq\" (UID: \"2d728472-3cda-480b-b5dc-065969434f7d\") " pod="openstack/keystone-7fc67b9475-mqldq" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.467501 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m65pr\" (UniqueName: \"kubernetes.io/projected/2d728472-3cda-480b-b5dc-065969434f7d-kube-api-access-m65pr\") pod \"keystone-7fc67b9475-mqldq\" (UID: \"2d728472-3cda-480b-b5dc-065969434f7d\") " pod="openstack/keystone-7fc67b9475-mqldq" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.467523 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2d728472-3cda-480b-b5dc-065969434f7d-credential-keys\") pod \"keystone-7fc67b9475-mqldq\" (UID: \"2d728472-3cda-480b-b5dc-065969434f7d\") " pod="openstack/keystone-7fc67b9475-mqldq" Dec 05 12:10:25 crc kubenswrapper[4763]: 
I1205 12:10:25.467547 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2d728472-3cda-480b-b5dc-065969434f7d-fernet-keys\") pod \"keystone-7fc67b9475-mqldq\" (UID: \"2d728472-3cda-480b-b5dc-065969434f7d\") " pod="openstack/keystone-7fc67b9475-mqldq" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.467593 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d728472-3cda-480b-b5dc-065969434f7d-public-tls-certs\") pod \"keystone-7fc67b9475-mqldq\" (UID: \"2d728472-3cda-480b-b5dc-065969434f7d\") " pod="openstack/keystone-7fc67b9475-mqldq" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.467629 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d728472-3cda-480b-b5dc-065969434f7d-combined-ca-bundle\") pod \"keystone-7fc67b9475-mqldq\" (UID: \"2d728472-3cda-480b-b5dc-065969434f7d\") " pod="openstack/keystone-7fc67b9475-mqldq" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.467668 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d728472-3cda-480b-b5dc-065969434f7d-internal-tls-certs\") pod \"keystone-7fc67b9475-mqldq\" (UID: \"2d728472-3cda-480b-b5dc-065969434f7d\") " pod="openstack/keystone-7fc67b9475-mqldq" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.520047 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f57aaff6-3e62-49f1-8055-60f0507d95ba","Type":"ContainerDied","Data":"4cd275260f3988c76de37251927b02e2328630fb7ed5f3e0164985109cc94069"} Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.520097 4763 scope.go:117] "RemoveContainer" containerID="24fff5cc2fa368a5eacec4de1ed1e7c9ee48442324cbe5cd1a239a1ce3df48b2" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.520236 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.533483 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76d66464d-r24j6" event={"ID":"7e5e41f7-ee3c-4587-bae0-5716c12c84b6","Type":"ContainerStarted","Data":"67ca2099c605c0d5e9dd89f75b07f3d526bf3c3b2392016079b83aad3402b5e1"} Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.534880 4763 generic.go:334] "Generic (PLEG): container finished" podID="c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2" containerID="28da7ff0bc1ab19482e76d0dcbf645dff76f55c5d6c01a33e0beea80dab912ec" exitCode=0 Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.534932 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b6c948c7-7tdkh" event={"ID":"c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2","Type":"ContainerDied","Data":"28da7ff0bc1ab19482e76d0dcbf645dff76f55c5d6c01a33e0beea80dab912ec"} Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.537554 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dc5f79d94-t8x4q" event={"ID":"fcd7679a-a9b6-4721-8cb1-07f7c801bb3f","Type":"ContainerStarted","Data":"567a23c307a2d18bf986c0390a9b54938c20c27b0172b71e72ea68aab161a268"} Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.569064 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d728472-3cda-480b-b5dc-065969434f7d-internal-tls-certs\") pod \"keystone-7fc67b9475-mqldq\" (UID: \"2d728472-3cda-480b-b5dc-065969434f7d\") " pod="openstack/keystone-7fc67b9475-mqldq" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.570316 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d728472-3cda-480b-b5dc-065969434f7d-scripts\") pod \"keystone-7fc67b9475-mqldq\" (UID: \"2d728472-3cda-480b-b5dc-065969434f7d\") " pod="openstack/keystone-7fc67b9475-mqldq" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.571140 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d728472-3cda-480b-b5dc-065969434f7d-config-data\") pod \"keystone-7fc67b9475-mqldq\" (UID: \"2d728472-3cda-480b-b5dc-065969434f7d\") " pod="openstack/keystone-7fc67b9475-mqldq" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.571276 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m65pr\" (UniqueName: \"kubernetes.io/projected/2d728472-3cda-480b-b5dc-065969434f7d-kube-api-access-m65pr\") pod \"keystone-7fc67b9475-mqldq\" (UID: \"2d728472-3cda-480b-b5dc-065969434f7d\") " pod="openstack/keystone-7fc67b9475-mqldq" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.573191 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2d728472-3cda-480b-b5dc-065969434f7d-credential-keys\") pod \"keystone-7fc67b9475-mqldq\" (UID: \"2d728472-3cda-480b-b5dc-065969434f7d\") " pod="openstack/keystone-7fc67b9475-mqldq" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.573311 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2d728472-3cda-480b-b5dc-065969434f7d-fernet-keys\") pod \"keystone-7fc67b9475-mqldq\" (UID: \"2d728472-3cda-480b-b5dc-065969434f7d\") " pod="openstack/keystone-7fc67b9475-mqldq" Dec 05 
12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.573473 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d728472-3cda-480b-b5dc-065969434f7d-public-tls-certs\") pod \"keystone-7fc67b9475-mqldq\" (UID: \"2d728472-3cda-480b-b5dc-065969434f7d\") " pod="openstack/keystone-7fc67b9475-mqldq" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.574245 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d728472-3cda-480b-b5dc-065969434f7d-combined-ca-bundle\") pod \"keystone-7fc67b9475-mqldq\" (UID: \"2d728472-3cda-480b-b5dc-065969434f7d\") " pod="openstack/keystone-7fc67b9475-mqldq" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.580533 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2d728472-3cda-480b-b5dc-065969434f7d-credential-keys\") pod \"keystone-7fc67b9475-mqldq\" (UID: \"2d728472-3cda-480b-b5dc-065969434f7d\") " pod="openstack/keystone-7fc67b9475-mqldq" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.580658 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2d728472-3cda-480b-b5dc-065969434f7d-fernet-keys\") pod \"keystone-7fc67b9475-mqldq\" (UID: \"2d728472-3cda-480b-b5dc-065969434f7d\") " pod="openstack/keystone-7fc67b9475-mqldq" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.582770 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d728472-3cda-480b-b5dc-065969434f7d-public-tls-certs\") pod \"keystone-7fc67b9475-mqldq\" (UID: \"2d728472-3cda-480b-b5dc-065969434f7d\") " pod="openstack/keystone-7fc67b9475-mqldq" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.590165 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d728472-3cda-480b-b5dc-065969434f7d-scripts\") pod \"keystone-7fc67b9475-mqldq\" (UID: \"2d728472-3cda-480b-b5dc-065969434f7d\") " pod="openstack/keystone-7fc67b9475-mqldq" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.590546 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d728472-3cda-480b-b5dc-065969434f7d-combined-ca-bundle\") pod \"keystone-7fc67b9475-mqldq\" (UID: \"2d728472-3cda-480b-b5dc-065969434f7d\") " pod="openstack/keystone-7fc67b9475-mqldq" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.592343 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d728472-3cda-480b-b5dc-065969434f7d-internal-tls-certs\") pod \"keystone-7fc67b9475-mqldq\" (UID: \"2d728472-3cda-480b-b5dc-065969434f7d\") " pod="openstack/keystone-7fc67b9475-mqldq" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.593803 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d728472-3cda-480b-b5dc-065969434f7d-config-data\") pod \"keystone-7fc67b9475-mqldq\" (UID: \"2d728472-3cda-480b-b5dc-065969434f7d\") " pod="openstack/keystone-7fc67b9475-mqldq" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.601562 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m65pr\" (UniqueName: 
\"kubernetes.io/projected/2d728472-3cda-480b-b5dc-065969434f7d-kube-api-access-m65pr\") pod \"keystone-7fc67b9475-mqldq\" (UID: \"2d728472-3cda-480b-b5dc-065969434f7d\") " pod="openstack/keystone-7fc67b9475-mqldq" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.634498 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.698748 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.701665 4763 scope.go:117] "RemoveContainer" containerID="b95dadcd780efc2c26a68e211c2c22762742bef84110443e03a0fa1ec37f4d81" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.709735 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.725455 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.728046 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.731900 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.732106 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.756296 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.760100 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7fc67b9475-mqldq" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.778440 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/06f0e50f-35b7-441d-a630-0655b4c1cd00-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"06f0e50f-35b7-441d-a630-0655b4c1cd00\") " pod="openstack/glance-default-external-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.778525 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06f0e50f-35b7-441d-a630-0655b4c1cd00-logs\") pod \"glance-default-external-api-0\" (UID: \"06f0e50f-35b7-441d-a630-0655b4c1cd00\") " pod="openstack/glance-default-external-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.778549 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06f0e50f-35b7-441d-a630-0655b4c1cd00-scripts\") pod \"glance-default-external-api-0\" (UID: \"06f0e50f-35b7-441d-a630-0655b4c1cd00\") " pod="openstack/glance-default-external-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.778571 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq7zh\" (UniqueName: \"kubernetes.io/projected/06f0e50f-35b7-441d-a630-0655b4c1cd00-kube-api-access-gq7zh\") pod \"glance-default-external-api-0\" (UID: \"06f0e50f-35b7-441d-a630-0655b4c1cd00\") " pod="openstack/glance-default-external-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.778593 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06f0e50f-35b7-441d-a630-0655b4c1cd00-config-data\") pod \"glance-default-external-api-0\" (UID: \"06f0e50f-35b7-441d-a630-0655b4c1cd00\") " pod="openstack/glance-default-external-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.778612 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"06f0e50f-35b7-441d-a630-0655b4c1cd00\") " pod="openstack/glance-default-external-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.778671 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06f0e50f-35b7-441d-a630-0655b4c1cd00-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"06f0e50f-35b7-441d-a630-0655b4c1cd00\") " pod="openstack/glance-default-external-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.778697 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/06f0e50f-35b7-441d-a630-0655b4c1cd00-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"06f0e50f-35b7-441d-a630-0655b4c1cd00\") " pod="openstack/glance-default-external-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.820336 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="325fd5a5-7466-4539-9de3-8add5eb6996c" path="/var/lib/kubelet/pods/325fd5a5-7466-4539-9de3-8add5eb6996c/volumes" Dec 05 12:10:25 
crc kubenswrapper[4763]: I1205 12:10:25.821679 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f57aaff6-3e62-49f1-8055-60f0507d95ba" path="/var/lib/kubelet/pods/f57aaff6-3e62-49f1-8055-60f0507d95ba/volumes" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.882050 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06f0e50f-35b7-441d-a630-0655b4c1cd00-config-data\") pod \"glance-default-external-api-0\" (UID: \"06f0e50f-35b7-441d-a630-0655b4c1cd00\") " pod="openstack/glance-default-external-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.882120 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"06f0e50f-35b7-441d-a630-0655b4c1cd00\") " pod="openstack/glance-default-external-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.882268 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06f0e50f-35b7-441d-a630-0655b4c1cd00-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"06f0e50f-35b7-441d-a630-0655b4c1cd00\") " pod="openstack/glance-default-external-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.882320 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/06f0e50f-35b7-441d-a630-0655b4c1cd00-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"06f0e50f-35b7-441d-a630-0655b4c1cd00\") " pod="openstack/glance-default-external-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.882456 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/06f0e50f-35b7-441d-a630-0655b4c1cd00-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"06f0e50f-35b7-441d-a630-0655b4c1cd00\") " pod="openstack/glance-default-external-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.882597 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06f0e50f-35b7-441d-a630-0655b4c1cd00-logs\") pod \"glance-default-external-api-0\" (UID: \"06f0e50f-35b7-441d-a630-0655b4c1cd00\") " pod="openstack/glance-default-external-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.882633 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06f0e50f-35b7-441d-a630-0655b4c1cd00-scripts\") pod \"glance-default-external-api-0\" (UID: \"06f0e50f-35b7-441d-a630-0655b4c1cd00\") " pod="openstack/glance-default-external-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.882668 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq7zh\" (UniqueName: \"kubernetes.io/projected/06f0e50f-35b7-441d-a630-0655b4c1cd00-kube-api-access-gq7zh\") pod \"glance-default-external-api-0\" (UID: \"06f0e50f-35b7-441d-a630-0655b4c1cd00\") " pod="openstack/glance-default-external-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.884276 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06f0e50f-35b7-441d-a630-0655b4c1cd00-logs\") pod 
\"glance-default-external-api-0\" (UID: \"06f0e50f-35b7-441d-a630-0655b4c1cd00\") " pod="openstack/glance-default-external-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.884683 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"06f0e50f-35b7-441d-a630-0655b4c1cd00\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.884777 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/06f0e50f-35b7-441d-a630-0655b4c1cd00-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"06f0e50f-35b7-441d-a630-0655b4c1cd00\") " pod="openstack/glance-default-external-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.893170 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06f0e50f-35b7-441d-a630-0655b4c1cd00-config-data\") pod \"glance-default-external-api-0\" (UID: \"06f0e50f-35b7-441d-a630-0655b4c1cd00\") " pod="openstack/glance-default-external-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.901337 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06f0e50f-35b7-441d-a630-0655b4c1cd00-scripts\") pod \"glance-default-external-api-0\" (UID: \"06f0e50f-35b7-441d-a630-0655b4c1cd00\") " pod="openstack/glance-default-external-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.906944 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06f0e50f-35b7-441d-a630-0655b4c1cd00-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"06f0e50f-35b7-441d-a630-0655b4c1cd00\") " pod="openstack/glance-default-external-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.910575 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/06f0e50f-35b7-441d-a630-0655b4c1cd00-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"06f0e50f-35b7-441d-a630-0655b4c1cd00\") " pod="openstack/glance-default-external-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.923843 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq7zh\" (UniqueName: \"kubernetes.io/projected/06f0e50f-35b7-441d-a630-0655b4c1cd00-kube-api-access-gq7zh\") pod \"glance-default-external-api-0\" (UID: \"06f0e50f-35b7-441d-a630-0655b4c1cd00\") " pod="openstack/glance-default-external-api-0" Dec 05 12:10:25 crc kubenswrapper[4763]: I1205 12:10:25.993828 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Dec 05 12:10:26 crc kubenswrapper[4763]: W1205 12:10:26.000702 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaff85a7c_5f5f_4c46_97c7_083ff89eb14c.slice/crio-639742d20b2102a81a29d21d5069354fa9336368aa0571d3e4943fa99c3c8ac1 WatchSource:0}: Error finding container 639742d20b2102a81a29d21d5069354fa9336368aa0571d3e4943fa99c3c8ac1: Status 404 returned error can't find the container with id 639742d20b2102a81a29d21d5069354fa9336368aa0571d3e4943fa99c3c8ac1 Dec 05 12:10:26 crc kubenswrapper[4763]: I1205 
12:10:26.153471 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 05 12:10:26 crc kubenswrapper[4763]: I1205 12:10:26.280680 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Dec 05 12:10:26 crc kubenswrapper[4763]: I1205 12:10:26.297491 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"06f0e50f-35b7-441d-a630-0655b4c1cd00\") " pod="openstack/glance-default-external-api-0" Dec 05 12:10:26 crc kubenswrapper[4763]: W1205 12:10:26.331848 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8bdc880b_2a69_4a8c_920f_6b0b04c7ecfe.slice/crio-603bb4c005a02b5bb8d23e9437d70f4bacbfb1e9ee72cdc43f09002721a27963 WatchSource:0}: Error finding container 603bb4c005a02b5bb8d23e9437d70f4bacbfb1e9ee72cdc43f09002721a27963: Status 404 returned error can't find the container with id 603bb4c005a02b5bb8d23e9437d70f4bacbfb1e9ee72cdc43f09002721a27963 Dec 05 12:10:26 crc kubenswrapper[4763]: I1205 12:10:26.403107 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 12:10:26 crc kubenswrapper[4763]: I1205 12:10:26.515898 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7fc67b9475-mqldq"] Dec 05 12:10:26 crc kubenswrapper[4763]: I1205 12:10:26.602799 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c","Type":"ContainerStarted","Data":"3b889b2787dcd87e2379b3e6711ad8f2860214ae4fd36e92f121a31a6402a3b9"} Dec 05 12:10:26 crc kubenswrapper[4763]: I1205 12:10:26.608199 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 12:10:26 crc kubenswrapper[4763]: I1205 12:10:26.613957 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wgf85" event={"ID":"1f3d2c51-7840-4854-af9e-e0da6c484074","Type":"ContainerStarted","Data":"d16f6dd1276dd9c47eb7962e5ac06677fb1281f7d620832bb907af2b23032ab4"} Dec 05 12:10:26 crc kubenswrapper[4763]: I1205 12:10:26.655742 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-wgf85" podStartSLOduration=3.306567467 podStartE2EDuration="47.655722453s" podCreationTimestamp="2025-12-05 12:09:39 +0000 UTC" firstStartedPulling="2025-12-05 12:09:40.599113639 +0000 UTC m=+1265.091828362" lastFinishedPulling="2025-12-05 12:10:24.948268625 +0000 UTC m=+1309.440983348" observedRunningTime="2025-12-05 12:10:26.638573654 +0000 UTC m=+1311.131288377" watchObservedRunningTime="2025-12-05 12:10:26.655722453 +0000 UTC m=+1311.148437176" Dec 05 12:10:26 crc kubenswrapper[4763]: I1205 12:10:26.762043 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1851124e-2722-4628-8e5b-63edb828d64a","Type":"ContainerStarted","Data":"3478236ff6a23e3307d9d198c5b689c7dc4f3629e735e36a3b801327a2c5c9c2"} Dec 05 12:10:26 crc kubenswrapper[4763]: I1205 12:10:26.809063 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76d66464d-r24j6" event={"ID":"7e5e41f7-ee3c-4587-bae0-5716c12c84b6","Type":"ContainerStarted","Data":"d283a64270615ad7d7eff73951a63b851c6c286535027cb465c6bae1833f4835"} Dec 05 12:10:26 crc 
kubenswrapper[4763]: I1205 12:10:26.810564 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-76d66464d-r24j6" Dec 05 12:10:26 crc kubenswrapper[4763]: I1205 12:10:26.810654 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-76d66464d-r24j6" Dec 05 12:10:26 crc kubenswrapper[4763]: I1205 12:10:26.849096 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b55c974d9-brgnw" event={"ID":"6d6c980e-688d-41b3-a7ad-0061b07b9494","Type":"ContainerStarted","Data":"7b35d8655be515216f1407d33960590bf19ae8058f988275384da0fca8b3dc00"} Dec 05 12:10:26 crc kubenswrapper[4763]: I1205 12:10:26.921474 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-76d66464d-r24j6" podStartSLOduration=10.921454553 podStartE2EDuration="10.921454553s" podCreationTimestamp="2025-12-05 12:10:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:10:26.872576444 +0000 UTC m=+1311.365291177" watchObservedRunningTime="2025-12-05 12:10:26.921454553 +0000 UTC m=+1311.414169276" Dec 05 12:10:27 crc kubenswrapper[4763]: I1205 12:10:27.007259 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb8f26ac-3463-4d42-936d-420cfdbd81eb","Type":"ContainerStarted","Data":"cb18a1decab54a7f41cb3430a7386394631bc0be22d0f9fada331709b8bf6e27"} Dec 05 12:10:27 crc kubenswrapper[4763]: I1205 12:10:27.015957 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"aff85a7c-5f5f-4c46-97c7-083ff89eb14c","Type":"ContainerStarted","Data":"639742d20b2102a81a29d21d5069354fa9336368aa0571d3e4943fa99c3c8ac1"} Dec 05 12:10:27 crc kubenswrapper[4763]: I1205 12:10:27.017609 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"8bdc880b-2a69-4a8c-920f-6b0b04c7ecfe","Type":"ContainerStarted","Data":"603bb4c005a02b5bb8d23e9437d70f4bacbfb1e9ee72cdc43f09002721a27963"} Dec 05 12:10:27 crc kubenswrapper[4763]: I1205 12:10:27.019399 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7fc67b9475-mqldq" event={"ID":"2d728472-3cda-480b-b5dc-065969434f7d","Type":"ContainerStarted","Data":"c4a39e47c614c43913c0912d2a5b5f3fb26ad4e738ec7c5749901ad4709cc552"} Dec 05 12:10:27 crc kubenswrapper[4763]: I1205 12:10:27.069356 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dc5f79d94-t8x4q" event={"ID":"fcd7679a-a9b6-4721-8cb1-07f7c801bb3f","Type":"ContainerStarted","Data":"209c9293f638e841d39c533fc219d880b52fb2265a1403645f5bb95bcaaf6e0f"} Dec 05 12:10:27 crc kubenswrapper[4763]: I1205 12:10:27.070529 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-dc5f79d94-t8x4q" Dec 05 12:10:27 crc kubenswrapper[4763]: I1205 12:10:27.120068 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dc5f79d94-t8x4q" podStartSLOduration=12.120050031 podStartE2EDuration="12.120050031s" podCreationTimestamp="2025-12-05 12:10:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:10:27.104071747 +0000 UTC m=+1311.596786480" watchObservedRunningTime="2025-12-05 12:10:27.120050031 +0000 UTC m=+1311.612764744" Dec 05 12:10:27 crc kubenswrapper[4763]: I1205 12:10:27.410754 4763 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 12:10:27 crc kubenswrapper[4763]: W1205 12:10:27.441840 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06f0e50f_35b7_441d_a630_0655b4c1cd00.slice/crio-d29a6f5cae9788281c3661b53406b36706c72b23ac49b60e6e384e1357c6d39b WatchSource:0}: Error finding container d29a6f5cae9788281c3661b53406b36706c72b23ac49b60e6e384e1357c6d39b: Status 404 returned error can't find the container with id d29a6f5cae9788281c3661b53406b36706c72b23ac49b60e6e384e1357c6d39b Dec 05 12:10:27 crc kubenswrapper[4763]: I1205 12:10:27.719100 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7dc58bc884-khdbv" podUID="4ba0cbf0-3e4e-4cb0-82b0-179d11937330" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused" Dec 05 12:10:27 crc kubenswrapper[4763]: I1205 12:10:27.778800 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-77dcd5c496-hs7bj" podUID="b34428a2-5423-401a-b7d3-aebd1d070945" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.154:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.154:8443: connect: connection refused" Dec 05 12:10:28 crc kubenswrapper[4763]: I1205 12:10:28.083793 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e680db34-6f7c-4e72-8015-368c51bb34b0","Type":"ContainerStarted","Data":"35d483e9bf8d24b0e940c41602403f042b67ff7ba28ded520e48cc008f6e069c"} Dec 05 12:10:28 crc kubenswrapper[4763]: I1205 12:10:28.085534 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"06f0e50f-35b7-441d-a630-0655b4c1cd00","Type":"ContainerStarted","Data":"d29a6f5cae9788281c3661b53406b36706c72b23ac49b60e6e384e1357c6d39b"} Dec 05 12:10:29 crc kubenswrapper[4763]: I1205 12:10:29.130832 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b55c974d9-brgnw" event={"ID":"6d6c980e-688d-41b3-a7ad-0061b07b9494","Type":"ContainerStarted","Data":"e87b855b5f539fa53726813adfee16ac272edecc9d4e03ae2a2e434a27e9acbe"} Dec 05 12:10:29 crc kubenswrapper[4763]: I1205 12:10:29.132007 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-b55c974d9-brgnw" Dec 05 12:10:29 crc kubenswrapper[4763]: I1205 12:10:29.135442 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-cprq9" event={"ID":"10d49525-ec1b-4c52-8221-f3f0bb57e574","Type":"ContainerStarted","Data":"9eb76be718c1b0dadda12f48e28116e1573a11c11d9ec171fe421d6538f25f86"} Dec 05 12:10:29 crc kubenswrapper[4763]: I1205 12:10:29.138180 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"aff85a7c-5f5f-4c46-97c7-083ff89eb14c","Type":"ContainerStarted","Data":"680b4f4c2767a73f635d3acd2d356b5fe51f3b2d53b72a75bb6a529a2e2bdc7f"} Dec 05 12:10:29 crc kubenswrapper[4763]: I1205 12:10:29.141654 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b6c948c7-7tdkh" event={"ID":"c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2","Type":"ContainerStarted","Data":"ac9a0ca04d43a2b4e83535243429f1e5e917c180a3768e368cbb113ac1b89f13"} Dec 05 12:10:29 crc kubenswrapper[4763]: I1205 12:10:29.141951 4763 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b6c948c7-7tdkh" Dec 05 12:10:29 crc kubenswrapper[4763]: I1205 12:10:29.143197 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7fc67b9475-mqldq" event={"ID":"2d728472-3cda-480b-b5dc-065969434f7d","Type":"ContainerStarted","Data":"9b0f50a678dca0c6297e607b477ab6c7f3c66ba9195ce13dc3150053618bd327"} Dec 05 12:10:29 crc kubenswrapper[4763]: I1205 12:10:29.143720 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7fc67b9475-mqldq" Dec 05 12:10:29 crc kubenswrapper[4763]: I1205 12:10:29.145979 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e680db34-6f7c-4e72-8015-368c51bb34b0","Type":"ContainerStarted","Data":"9ee7f0116aec2115bfd52c5b0bd68412ad360aa2366959820ade81f6c5c53e48"} Dec 05 12:10:29 crc kubenswrapper[4763]: I1205 12:10:29.149025 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"06f0e50f-35b7-441d-a630-0655b4c1cd00","Type":"ContainerStarted","Data":"fada1b6bd3de60187d25cd6ce64a503d42a5a289412a9ad93e63575c6aecf799"} Dec 05 12:10:29 crc kubenswrapper[4763]: I1205 12:10:29.154075 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1851124e-2722-4628-8e5b-63edb828d64a","Type":"ContainerStarted","Data":"972d950e5147f05a310871cfffeabe9906aa937059416fb569327f8af1d49959"} Dec 05 12:10:29 crc kubenswrapper[4763]: I1205 12:10:29.172098 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-b55c974d9-brgnw" podStartSLOduration=12.172081062 podStartE2EDuration="12.172081062s" podCreationTimestamp="2025-12-05 12:10:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:10:29.163504732 +0000 UTC m=+1313.656219475" watchObservedRunningTime="2025-12-05 12:10:29.172081062 +0000 UTC m=+1313.664795785" Dec 05 12:10:29 crc kubenswrapper[4763]: I1205 12:10:29.191825 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b6c948c7-7tdkh" podStartSLOduration=14.19180806 podStartE2EDuration="14.19180806s" podCreationTimestamp="2025-12-05 12:10:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:10:29.183394287 +0000 UTC m=+1313.676109030" watchObservedRunningTime="2025-12-05 12:10:29.19180806 +0000 UTC m=+1313.684522783" Dec 05 12:10:29 crc kubenswrapper[4763]: I1205 12:10:29.203426 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7fc67b9475-mqldq" podStartSLOduration=4.203408507 podStartE2EDuration="4.203408507s" podCreationTimestamp="2025-12-05 12:10:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:10:29.199541948 +0000 UTC m=+1313.692256661" watchObservedRunningTime="2025-12-05 12:10:29.203408507 +0000 UTC m=+1313.696123230" Dec 05 12:10:29 crc kubenswrapper[4763]: I1205 12:10:29.230028 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-cprq9" podStartSLOduration=5.806593518 podStartE2EDuration="51.23001225s" podCreationTimestamp="2025-12-05 12:09:38 +0000 UTC" firstStartedPulling="2025-12-05 12:09:40.332664221 +0000 UTC 
m=+1264.825378944" lastFinishedPulling="2025-12-05 12:10:25.756082953 +0000 UTC m=+1310.248797676" observedRunningTime="2025-12-05 12:10:29.216672847 +0000 UTC m=+1313.709387580" watchObservedRunningTime="2025-12-05 12:10:29.23001225 +0000 UTC m=+1313.722726973" Dec 05 12:10:31 crc kubenswrapper[4763]: I1205 12:10:31.191680 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"8bdc880b-2a69-4a8c-920f-6b0b04c7ecfe","Type":"ContainerStarted","Data":"9b347b134bad06143215d49e0f05d01451fe41dec0663d88953b673dbde5544b"} Dec 05 12:10:31 crc kubenswrapper[4763]: I1205 12:10:31.203626 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e680db34-6f7c-4e72-8015-368c51bb34b0","Type":"ContainerStarted","Data":"13075e4233d39d8a765a3b36d2e3a81e1ec2314c209cc3d5ab6cff59152f353f"} Dec 05 12:10:31 crc kubenswrapper[4763]: I1205 12:10:31.215018 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=3.30619238 podStartE2EDuration="7.214999221s" podCreationTimestamp="2025-12-05 12:10:24 +0000 UTC" firstStartedPulling="2025-12-05 12:10:26.34074949 +0000 UTC m=+1310.833464203" lastFinishedPulling="2025-12-05 12:10:30.249556321 +0000 UTC m=+1314.742271044" observedRunningTime="2025-12-05 12:10:31.213715852 +0000 UTC m=+1315.706430585" watchObservedRunningTime="2025-12-05 12:10:31.214999221 +0000 UTC m=+1315.707713954" Dec 05 12:10:31 crc kubenswrapper[4763]: I1205 12:10:31.242223 4763 generic.go:334] "Generic (PLEG): container finished" podID="1f3d2c51-7840-4854-af9e-e0da6c484074" containerID="d16f6dd1276dd9c47eb7962e5ac06677fb1281f7d620832bb907af2b23032ab4" exitCode=0 Dec 05 12:10:31 crc kubenswrapper[4763]: I1205 12:10:31.242326 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wgf85" event={"ID":"1f3d2c51-7840-4854-af9e-e0da6c484074","Type":"ContainerDied","Data":"d16f6dd1276dd9c47eb7962e5ac06677fb1281f7d620832bb907af2b23032ab4"} Dec 05 12:10:31 crc kubenswrapper[4763]: I1205 12:10:31.245232 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.245215874 podStartE2EDuration="7.245215874s" podCreationTimestamp="2025-12-05 12:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:10:31.241377296 +0000 UTC m=+1315.734092039" watchObservedRunningTime="2025-12-05 12:10:31.245215874 +0000 UTC m=+1315.737930597" Dec 05 12:10:31 crc kubenswrapper[4763]: I1205 12:10:31.255036 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"06f0e50f-35b7-441d-a630-0655b4c1cd00","Type":"ContainerStarted","Data":"7442313666db8517b1aed2091d3065563ee73742e77582ff15eae44d27a742bb"} Dec 05 12:10:31 crc kubenswrapper[4763]: I1205 12:10:31.300824 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.300735869 podStartE2EDuration="6.300735869s" podCreationTimestamp="2025-12-05 12:10:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:10:31.299104646 +0000 UTC m=+1315.791819369" watchObservedRunningTime="2025-12-05 12:10:31.300735869 +0000 UTC m=+1315.793450592" Dec 05 12:10:31 crc 
kubenswrapper[4763]: I1205 12:10:31.311306 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1851124e-2722-4628-8e5b-63edb828d64a","Type":"ContainerStarted","Data":"b03ec231175fca5ec9ebc3e1907e079f54d236d52e46132ea5c9daab70253ee3"} Dec 05 12:10:31 crc kubenswrapper[4763]: I1205 12:10:31.311359 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1851124e-2722-4628-8e5b-63edb828d64a","Type":"ContainerStarted","Data":"bc3969bb8965d6039642fae71a53bcbc0d5b7d22fb7f6cf1d40e72fad9f68186"} Dec 05 12:10:31 crc kubenswrapper[4763]: I1205 12:10:31.311372 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1851124e-2722-4628-8e5b-63edb828d64a","Type":"ContainerStarted","Data":"532d49468ad273bff88bd503970616568a916141e5815c1a263ea87176ec53b7"} Dec 05 12:10:31 crc kubenswrapper[4763]: I1205 12:10:31.313118 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c","Type":"ContainerStarted","Data":"e8aa557f91a9fbdce97324156f041bf924b254be96744ed33a919ee6fa4b70b1"} Dec 05 12:10:31 crc kubenswrapper[4763]: I1205 12:10:31.349199 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=3.339908436 podStartE2EDuration="7.349180822s" podCreationTimestamp="2025-12-05 12:10:24 +0000 UTC" firstStartedPulling="2025-12-05 12:10:26.25860369 +0000 UTC m=+1310.751318413" lastFinishedPulling="2025-12-05 12:10:30.267876076 +0000 UTC m=+1314.760590799" observedRunningTime="2025-12-05 12:10:31.344274393 +0000 UTC m=+1315.836989116" watchObservedRunningTime="2025-12-05 12:10:31.349180822 +0000 UTC m=+1315.841895545" Dec 05 12:10:31 crc kubenswrapper[4763]: I1205 12:10:31.363425 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"aff85a7c-5f5f-4c46-97c7-083ff89eb14c","Type":"ContainerStarted","Data":"94f830dd6b75d75c984906bebde78ae47e3ba92d38471b016cc49f49593b7848"} Dec 05 12:10:31 crc kubenswrapper[4763]: I1205 12:10:31.363910 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Dec 05 12:10:32 crc kubenswrapper[4763]: I1205 12:10:32.726952 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-wgf85" Dec 05 12:10:32 crc kubenswrapper[4763]: I1205 12:10:32.754956 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=8.754935557 podStartE2EDuration="8.754935557s" podCreationTimestamp="2025-12-05 12:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:10:31.385066052 +0000 UTC m=+1315.877780795" watchObservedRunningTime="2025-12-05 12:10:32.754935557 +0000 UTC m=+1317.247650290" Dec 05 12:10:32 crc kubenswrapper[4763]: I1205 12:10:32.905343 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f3d2c51-7840-4854-af9e-e0da6c484074-combined-ca-bundle\") pod \"1f3d2c51-7840-4854-af9e-e0da6c484074\" (UID: \"1f3d2c51-7840-4854-af9e-e0da6c484074\") " Dec 05 12:10:32 crc kubenswrapper[4763]: I1205 12:10:32.905469 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgsw7\" (UniqueName: \"kubernetes.io/projected/1f3d2c51-7840-4854-af9e-e0da6c484074-kube-api-access-zgsw7\") pod \"1f3d2c51-7840-4854-af9e-e0da6c484074\" (UID: \"1f3d2c51-7840-4854-af9e-e0da6c484074\") " Dec 05 12:10:32 crc kubenswrapper[4763]: I1205 12:10:32.905610 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1f3d2c51-7840-4854-af9e-e0da6c484074-db-sync-config-data\") pod \"1f3d2c51-7840-4854-af9e-e0da6c484074\" (UID: \"1f3d2c51-7840-4854-af9e-e0da6c484074\") " Dec 05 12:10:32 crc kubenswrapper[4763]: I1205 12:10:32.912504 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f3d2c51-7840-4854-af9e-e0da6c484074-kube-api-access-zgsw7" (OuterVolumeSpecName: "kube-api-access-zgsw7") pod "1f3d2c51-7840-4854-af9e-e0da6c484074" (UID: "1f3d2c51-7840-4854-af9e-e0da6c484074"). InnerVolumeSpecName "kube-api-access-zgsw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:10:32 crc kubenswrapper[4763]: I1205 12:10:32.931661 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f3d2c51-7840-4854-af9e-e0da6c484074-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "1f3d2c51-7840-4854-af9e-e0da6c484074" (UID: "1f3d2c51-7840-4854-af9e-e0da6c484074"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.002881 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f3d2c51-7840-4854-af9e-e0da6c484074-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f3d2c51-7840-4854-af9e-e0da6c484074" (UID: "1f3d2c51-7840-4854-af9e-e0da6c484074"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.008443 4763 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1f3d2c51-7840-4854-af9e-e0da6c484074-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.008479 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f3d2c51-7840-4854-af9e-e0da6c484074-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.008491 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgsw7\" (UniqueName: \"kubernetes.io/projected/1f3d2c51-7840-4854-af9e-e0da6c484074-kube-api-access-zgsw7\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.394468 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wgf85" event={"ID":"1f3d2c51-7840-4854-af9e-e0da6c484074","Type":"ContainerDied","Data":"07047094bb3dee3d5aded49846bd8646907be1c814de1542fc99c916da925e27"} Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.394504 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07047094bb3dee3d5aded49846bd8646907be1c814de1542fc99c916da925e27" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.394569 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wgf85" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.424595 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1851124e-2722-4628-8e5b-63edb828d64a","Type":"ContainerStarted","Data":"a5b60a7a8458e7f89325970a5a18aa17ab64aba17d991ddf187597cc56ec71e1"} Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.424641 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1851124e-2722-4628-8e5b-63edb828d64a","Type":"ContainerStarted","Data":"7e64bbbd02e4e3a3f75ff20f55e733d127c16cc32915d3509e5a36cde0426548"} Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.481102 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=74.694623948 podStartE2EDuration="1m58.481079324s" podCreationTimestamp="2025-12-05 12:08:35 +0000 UTC" firstStartedPulling="2025-12-05 12:09:41.145226881 +0000 UTC m=+1265.637941604" lastFinishedPulling="2025-12-05 12:10:24.931682257 +0000 UTC m=+1309.424396980" observedRunningTime="2025-12-05 12:10:33.470777778 +0000 UTC m=+1317.963492501" watchObservedRunningTime="2025-12-05 12:10:33.481079324 +0000 UTC m=+1317.973794057" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.557696 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5446b6d8dc-p784q"] Dec 05 12:10:33 crc kubenswrapper[4763]: E1205 12:10:33.558203 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f3d2c51-7840-4854-af9e-e0da6c484074" containerName="barbican-db-sync" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.558228 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f3d2c51-7840-4854-af9e-e0da6c484074" containerName="barbican-db-sync" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.559007 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f3d2c51-7840-4854-af9e-e0da6c484074" 
containerName="barbican-db-sync" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.560445 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5446b6d8dc-p784q" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.564361 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.565363 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.567306 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-zrxzg" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.578403 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-94d865894-tqt5m"] Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.580017 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-94d865894-tqt5m" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.588773 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.602978 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5446b6d8dc-p784q"] Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.626521 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-94d865894-tqt5m"] Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.721282 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b6c948c7-7tdkh"] Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.721505 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b6c948c7-7tdkh" podUID="c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2" containerName="dnsmasq-dns" containerID="cri-o://ac9a0ca04d43a2b4e83535243429f1e5e917c180a3768e368cbb113ac1b89f13" gracePeriod=10 Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.726896 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d23af5c6-295f-4c65-90a1-02e66a41f325-config-data-custom\") pod \"barbican-worker-5446b6d8dc-p784q\" (UID: \"d23af5c6-295f-4c65-90a1-02e66a41f325\") " pod="openstack/barbican-worker-5446b6d8dc-p784q" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.726949 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d23af5c6-295f-4c65-90a1-02e66a41f325-config-data\") pod \"barbican-worker-5446b6d8dc-p784q\" (UID: \"d23af5c6-295f-4c65-90a1-02e66a41f325\") " pod="openstack/barbican-worker-5446b6d8dc-p784q" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.727010 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb969\" (UniqueName: \"kubernetes.io/projected/76cf4acb-9763-4dac-9a2f-eba4a98314f0-kube-api-access-fb969\") pod \"barbican-keystone-listener-94d865894-tqt5m\" (UID: \"76cf4acb-9763-4dac-9a2f-eba4a98314f0\") " pod="openstack/barbican-keystone-listener-94d865894-tqt5m" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.727084 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d23af5c6-295f-4c65-90a1-02e66a41f325-logs\") pod \"barbican-worker-5446b6d8dc-p784q\" (UID: \"d23af5c6-295f-4c65-90a1-02e66a41f325\") " pod="openstack/barbican-worker-5446b6d8dc-p784q" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.727147 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d23af5c6-295f-4c65-90a1-02e66a41f325-combined-ca-bundle\") pod \"barbican-worker-5446b6d8dc-p784q\" (UID: \"d23af5c6-295f-4c65-90a1-02e66a41f325\") " pod="openstack/barbican-worker-5446b6d8dc-p784q" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.727173 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/76cf4acb-9763-4dac-9a2f-eba4a98314f0-config-data-custom\") pod \"barbican-keystone-listener-94d865894-tqt5m\" (UID: \"76cf4acb-9763-4dac-9a2f-eba4a98314f0\") " pod="openstack/barbican-keystone-listener-94d865894-tqt5m" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.727200 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dprn8\" (UniqueName: \"kubernetes.io/projected/d23af5c6-295f-4c65-90a1-02e66a41f325-kube-api-access-dprn8\") pod \"barbican-worker-5446b6d8dc-p784q\" (UID: \"d23af5c6-295f-4c65-90a1-02e66a41f325\") " pod="openstack/barbican-worker-5446b6d8dc-p784q" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.727220 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76cf4acb-9763-4dac-9a2f-eba4a98314f0-logs\") pod \"barbican-keystone-listener-94d865894-tqt5m\" (UID: \"76cf4acb-9763-4dac-9a2f-eba4a98314f0\") " pod="openstack/barbican-keystone-listener-94d865894-tqt5m" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.727239 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76cf4acb-9763-4dac-9a2f-eba4a98314f0-combined-ca-bundle\") pod \"barbican-keystone-listener-94d865894-tqt5m\" (UID: \"76cf4acb-9763-4dac-9a2f-eba4a98314f0\") " pod="openstack/barbican-keystone-listener-94d865894-tqt5m" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.727261 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76cf4acb-9763-4dac-9a2f-eba4a98314f0-config-data\") pod \"barbican-keystone-listener-94d865894-tqt5m\" (UID: \"76cf4acb-9763-4dac-9a2f-eba4a98314f0\") " pod="openstack/barbican-keystone-listener-94d865894-tqt5m" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.729955 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b6c948c7-7tdkh" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.759392 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-798d46d59c-2cbs9"] Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.761228 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-798d46d59c-2cbs9" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.782861 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-798d46d59c-2cbs9"] Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.830716 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb969\" (UniqueName: \"kubernetes.io/projected/76cf4acb-9763-4dac-9a2f-eba4a98314f0-kube-api-access-fb969\") pod \"barbican-keystone-listener-94d865894-tqt5m\" (UID: \"76cf4acb-9763-4dac-9a2f-eba4a98314f0\") " pod="openstack/barbican-keystone-listener-94d865894-tqt5m" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.830781 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f3337c2a-1707-4b41-89e9-563b51024eed-dns-svc\") pod \"dnsmasq-dns-798d46d59c-2cbs9\" (UID: \"f3337c2a-1707-4b41-89e9-563b51024eed\") " pod="openstack/dnsmasq-dns-798d46d59c-2cbs9" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.830843 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3337c2a-1707-4b41-89e9-563b51024eed-config\") pod \"dnsmasq-dns-798d46d59c-2cbs9\" (UID: \"f3337c2a-1707-4b41-89e9-563b51024eed\") " pod="openstack/dnsmasq-dns-798d46d59c-2cbs9" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.830871 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d23af5c6-295f-4c65-90a1-02e66a41f325-logs\") pod \"barbican-worker-5446b6d8dc-p784q\" (UID: \"d23af5c6-295f-4c65-90a1-02e66a41f325\") " pod="openstack/barbican-worker-5446b6d8dc-p784q" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.830901 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wskq\" (UniqueName: \"kubernetes.io/projected/f3337c2a-1707-4b41-89e9-563b51024eed-kube-api-access-5wskq\") pod \"dnsmasq-dns-798d46d59c-2cbs9\" (UID: \"f3337c2a-1707-4b41-89e9-563b51024eed\") " pod="openstack/dnsmasq-dns-798d46d59c-2cbs9" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.830931 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f3337c2a-1707-4b41-89e9-563b51024eed-ovsdbserver-nb\") pod \"dnsmasq-dns-798d46d59c-2cbs9\" (UID: \"f3337c2a-1707-4b41-89e9-563b51024eed\") " pod="openstack/dnsmasq-dns-798d46d59c-2cbs9" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.830949 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d23af5c6-295f-4c65-90a1-02e66a41f325-combined-ca-bundle\") pod \"barbican-worker-5446b6d8dc-p784q\" (UID: \"d23af5c6-295f-4c65-90a1-02e66a41f325\") " pod="openstack/barbican-worker-5446b6d8dc-p784q" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.830964 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/76cf4acb-9763-4dac-9a2f-eba4a98314f0-config-data-custom\") pod \"barbican-keystone-listener-94d865894-tqt5m\" (UID: \"76cf4acb-9763-4dac-9a2f-eba4a98314f0\") " pod="openstack/barbican-keystone-listener-94d865894-tqt5m" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.830983 4763 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dprn8\" (UniqueName: \"kubernetes.io/projected/d23af5c6-295f-4c65-90a1-02e66a41f325-kube-api-access-dprn8\") pod \"barbican-worker-5446b6d8dc-p784q\" (UID: \"d23af5c6-295f-4c65-90a1-02e66a41f325\") " pod="openstack/barbican-worker-5446b6d8dc-p784q" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.830997 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76cf4acb-9763-4dac-9a2f-eba4a98314f0-logs\") pod \"barbican-keystone-listener-94d865894-tqt5m\" (UID: \"76cf4acb-9763-4dac-9a2f-eba4a98314f0\") " pod="openstack/barbican-keystone-listener-94d865894-tqt5m" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.831011 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76cf4acb-9763-4dac-9a2f-eba4a98314f0-combined-ca-bundle\") pod \"barbican-keystone-listener-94d865894-tqt5m\" (UID: \"76cf4acb-9763-4dac-9a2f-eba4a98314f0\") " pod="openstack/barbican-keystone-listener-94d865894-tqt5m" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.831026 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76cf4acb-9763-4dac-9a2f-eba4a98314f0-config-data\") pod \"barbican-keystone-listener-94d865894-tqt5m\" (UID: \"76cf4acb-9763-4dac-9a2f-eba4a98314f0\") " pod="openstack/barbican-keystone-listener-94d865894-tqt5m" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.831066 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d23af5c6-295f-4c65-90a1-02e66a41f325-config-data-custom\") pod \"barbican-worker-5446b6d8dc-p784q\" (UID: \"d23af5c6-295f-4c65-90a1-02e66a41f325\") " pod="openstack/barbican-worker-5446b6d8dc-p784q" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.831087 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d23af5c6-295f-4c65-90a1-02e66a41f325-config-data\") pod \"barbican-worker-5446b6d8dc-p784q\" (UID: \"d23af5c6-295f-4c65-90a1-02e66a41f325\") " pod="openstack/barbican-worker-5446b6d8dc-p784q" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.831109 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f3337c2a-1707-4b41-89e9-563b51024eed-ovsdbserver-sb\") pod \"dnsmasq-dns-798d46d59c-2cbs9\" (UID: \"f3337c2a-1707-4b41-89e9-563b51024eed\") " pod="openstack/dnsmasq-dns-798d46d59c-2cbs9" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.831994 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d23af5c6-295f-4c65-90a1-02e66a41f325-logs\") pod \"barbican-worker-5446b6d8dc-p784q\" (UID: \"d23af5c6-295f-4c65-90a1-02e66a41f325\") " pod="openstack/barbican-worker-5446b6d8dc-p784q" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.836954 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76cf4acb-9763-4dac-9a2f-eba4a98314f0-logs\") pod \"barbican-keystone-listener-94d865894-tqt5m\" (UID: \"76cf4acb-9763-4dac-9a2f-eba4a98314f0\") " pod="openstack/barbican-keystone-listener-94d865894-tqt5m" Dec 05 12:10:33 crc 
kubenswrapper[4763]: I1205 12:10:33.839846 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/76cf4acb-9763-4dac-9a2f-eba4a98314f0-config-data-custom\") pod \"barbican-keystone-listener-94d865894-tqt5m\" (UID: \"76cf4acb-9763-4dac-9a2f-eba4a98314f0\") " pod="openstack/barbican-keystone-listener-94d865894-tqt5m" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.843683 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76cf4acb-9763-4dac-9a2f-eba4a98314f0-combined-ca-bundle\") pod \"barbican-keystone-listener-94d865894-tqt5m\" (UID: \"76cf4acb-9763-4dac-9a2f-eba4a98314f0\") " pod="openstack/barbican-keystone-listener-94d865894-tqt5m" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.847160 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d23af5c6-295f-4c65-90a1-02e66a41f325-combined-ca-bundle\") pod \"barbican-worker-5446b6d8dc-p784q\" (UID: \"d23af5c6-295f-4c65-90a1-02e66a41f325\") " pod="openstack/barbican-worker-5446b6d8dc-p784q" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.855961 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d23af5c6-295f-4c65-90a1-02e66a41f325-config-data-custom\") pod \"barbican-worker-5446b6d8dc-p784q\" (UID: \"d23af5c6-295f-4c65-90a1-02e66a41f325\") " pod="openstack/barbican-worker-5446b6d8dc-p784q" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.856291 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76cf4acb-9763-4dac-9a2f-eba4a98314f0-config-data\") pod \"barbican-keystone-listener-94d865894-tqt5m\" (UID: \"76cf4acb-9763-4dac-9a2f-eba4a98314f0\") " pod="openstack/barbican-keystone-listener-94d865894-tqt5m" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.859630 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d23af5c6-295f-4c65-90a1-02e66a41f325-config-data\") pod \"barbican-worker-5446b6d8dc-p784q\" (UID: \"d23af5c6-295f-4c65-90a1-02e66a41f325\") " pod="openstack/barbican-worker-5446b6d8dc-p784q" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.872725 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dprn8\" (UniqueName: \"kubernetes.io/projected/d23af5c6-295f-4c65-90a1-02e66a41f325-kube-api-access-dprn8\") pod \"barbican-worker-5446b6d8dc-p784q\" (UID: \"d23af5c6-295f-4c65-90a1-02e66a41f325\") " pod="openstack/barbican-worker-5446b6d8dc-p784q" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.889421 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb969\" (UniqueName: \"kubernetes.io/projected/76cf4acb-9763-4dac-9a2f-eba4a98314f0-kube-api-access-fb969\") pod \"barbican-keystone-listener-94d865894-tqt5m\" (UID: \"76cf4acb-9763-4dac-9a2f-eba4a98314f0\") " pod="openstack/barbican-keystone-listener-94d865894-tqt5m" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.933122 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-5446b6d8dc-p784q" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.934234 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f3337c2a-1707-4b41-89e9-563b51024eed-ovsdbserver-nb\") pod \"dnsmasq-dns-798d46d59c-2cbs9\" (UID: \"f3337c2a-1707-4b41-89e9-563b51024eed\") " pod="openstack/dnsmasq-dns-798d46d59c-2cbs9" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.934310 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f3337c2a-1707-4b41-89e9-563b51024eed-ovsdbserver-sb\") pod \"dnsmasq-dns-798d46d59c-2cbs9\" (UID: \"f3337c2a-1707-4b41-89e9-563b51024eed\") " pod="openstack/dnsmasq-dns-798d46d59c-2cbs9" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.934343 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f3337c2a-1707-4b41-89e9-563b51024eed-dns-svc\") pod \"dnsmasq-dns-798d46d59c-2cbs9\" (UID: \"f3337c2a-1707-4b41-89e9-563b51024eed\") " pod="openstack/dnsmasq-dns-798d46d59c-2cbs9" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.934385 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3337c2a-1707-4b41-89e9-563b51024eed-config\") pod \"dnsmasq-dns-798d46d59c-2cbs9\" (UID: \"f3337c2a-1707-4b41-89e9-563b51024eed\") " pod="openstack/dnsmasq-dns-798d46d59c-2cbs9" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.934421 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wskq\" (UniqueName: \"kubernetes.io/projected/f3337c2a-1707-4b41-89e9-563b51024eed-kube-api-access-5wskq\") pod \"dnsmasq-dns-798d46d59c-2cbs9\" (UID: \"f3337c2a-1707-4b41-89e9-563b51024eed\") " pod="openstack/dnsmasq-dns-798d46d59c-2cbs9" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.935460 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f3337c2a-1707-4b41-89e9-563b51024eed-ovsdbserver-nb\") pod \"dnsmasq-dns-798d46d59c-2cbs9\" (UID: \"f3337c2a-1707-4b41-89e9-563b51024eed\") " pod="openstack/dnsmasq-dns-798d46d59c-2cbs9" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.935547 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f3337c2a-1707-4b41-89e9-563b51024eed-ovsdbserver-sb\") pod \"dnsmasq-dns-798d46d59c-2cbs9\" (UID: \"f3337c2a-1707-4b41-89e9-563b51024eed\") " pod="openstack/dnsmasq-dns-798d46d59c-2cbs9" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.936176 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f3337c2a-1707-4b41-89e9-563b51024eed-dns-svc\") pod \"dnsmasq-dns-798d46d59c-2cbs9\" (UID: \"f3337c2a-1707-4b41-89e9-563b51024eed\") " pod="openstack/dnsmasq-dns-798d46d59c-2cbs9" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.936739 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3337c2a-1707-4b41-89e9-563b51024eed-config\") pod \"dnsmasq-dns-798d46d59c-2cbs9\" (UID: \"f3337c2a-1707-4b41-89e9-563b51024eed\") " pod="openstack/dnsmasq-dns-798d46d59c-2cbs9" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 
12:10:33.937847 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-758b9f6874-2lcsg"] Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.940300 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-758b9f6874-2lcsg" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.943969 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.964396 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-758b9f6874-2lcsg"] Dec 05 12:10:33 crc kubenswrapper[4763]: I1205 12:10:33.965441 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wskq\" (UniqueName: \"kubernetes.io/projected/f3337c2a-1707-4b41-89e9-563b51024eed-kube-api-access-5wskq\") pod \"dnsmasq-dns-798d46d59c-2cbs9\" (UID: \"f3337c2a-1707-4b41-89e9-563b51024eed\") " pod="openstack/dnsmasq-dns-798d46d59c-2cbs9" Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.000019 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-94d865894-tqt5m" Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.022481 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-798d46d59c-2cbs9"] Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.023445 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-798d46d59c-2cbs9" Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.038345 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91ed7d1b-612b-46e3-b99f-cb66cfc9e003-logs\") pod \"barbican-api-758b9f6874-2lcsg\" (UID: \"91ed7d1b-612b-46e3-b99f-cb66cfc9e003\") " pod="openstack/barbican-api-758b9f6874-2lcsg" Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.038408 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91ed7d1b-612b-46e3-b99f-cb66cfc9e003-config-data\") pod \"barbican-api-758b9f6874-2lcsg\" (UID: \"91ed7d1b-612b-46e3-b99f-cb66cfc9e003\") " pod="openstack/barbican-api-758b9f6874-2lcsg" Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.038512 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpcrp\" (UniqueName: \"kubernetes.io/projected/91ed7d1b-612b-46e3-b99f-cb66cfc9e003-kube-api-access-cpcrp\") pod \"barbican-api-758b9f6874-2lcsg\" (UID: \"91ed7d1b-612b-46e3-b99f-cb66cfc9e003\") " pod="openstack/barbican-api-758b9f6874-2lcsg" Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.038537 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91ed7d1b-612b-46e3-b99f-cb66cfc9e003-combined-ca-bundle\") pod \"barbican-api-758b9f6874-2lcsg\" (UID: \"91ed7d1b-612b-46e3-b99f-cb66cfc9e003\") " pod="openstack/barbican-api-758b9f6874-2lcsg" Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.038569 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91ed7d1b-612b-46e3-b99f-cb66cfc9e003-config-data-custom\") pod \"barbican-api-758b9f6874-2lcsg\" (UID: 
\"91ed7d1b-612b-46e3-b99f-cb66cfc9e003\") " pod="openstack/barbican-api-758b9f6874-2lcsg" Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.048834 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-ssc9w"] Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.050490 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-ssc9w" Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.064292 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.081003 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-ssc9w"] Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.140683 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91ed7d1b-612b-46e3-b99f-cb66cfc9e003-config-data-custom\") pod \"barbican-api-758b9f6874-2lcsg\" (UID: \"91ed7d1b-612b-46e3-b99f-cb66cfc9e003\") " pod="openstack/barbican-api-758b9f6874-2lcsg" Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.140822 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0bb3232-022f-4f19-89e7-374ae90d4dd3-config\") pod \"dnsmasq-dns-848cf88cfc-ssc9w\" (UID: \"c0bb3232-022f-4f19-89e7-374ae90d4dd3\") " pod="openstack/dnsmasq-dns-848cf88cfc-ssc9w" Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.140885 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c0bb3232-022f-4f19-89e7-374ae90d4dd3-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-ssc9w\" (UID: \"c0bb3232-022f-4f19-89e7-374ae90d4dd3\") " pod="openstack/dnsmasq-dns-848cf88cfc-ssc9w" Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.140991 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0bb3232-022f-4f19-89e7-374ae90d4dd3-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-ssc9w\" (UID: \"c0bb3232-022f-4f19-89e7-374ae90d4dd3\") " pod="openstack/dnsmasq-dns-848cf88cfc-ssc9w" Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.141263 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91ed7d1b-612b-46e3-b99f-cb66cfc9e003-logs\") pod \"barbican-api-758b9f6874-2lcsg\" (UID: \"91ed7d1b-612b-46e3-b99f-cb66cfc9e003\") " pod="openstack/barbican-api-758b9f6874-2lcsg" Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.141329 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91ed7d1b-612b-46e3-b99f-cb66cfc9e003-config-data\") pod \"barbican-api-758b9f6874-2lcsg\" (UID: \"91ed7d1b-612b-46e3-b99f-cb66cfc9e003\") " pod="openstack/barbican-api-758b9f6874-2lcsg" Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.141369 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnvgv\" (UniqueName: \"kubernetes.io/projected/c0bb3232-022f-4f19-89e7-374ae90d4dd3-kube-api-access-mnvgv\") pod \"dnsmasq-dns-848cf88cfc-ssc9w\" (UID: \"c0bb3232-022f-4f19-89e7-374ae90d4dd3\") " pod="openstack/dnsmasq-dns-848cf88cfc-ssc9w" 
Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.141397 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0bb3232-022f-4f19-89e7-374ae90d4dd3-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-ssc9w\" (UID: \"c0bb3232-022f-4f19-89e7-374ae90d4dd3\") " pod="openstack/dnsmasq-dns-848cf88cfc-ssc9w" Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.141505 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0bb3232-022f-4f19-89e7-374ae90d4dd3-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-ssc9w\" (UID: \"c0bb3232-022f-4f19-89e7-374ae90d4dd3\") " pod="openstack/dnsmasq-dns-848cf88cfc-ssc9w" Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.141580 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpcrp\" (UniqueName: \"kubernetes.io/projected/91ed7d1b-612b-46e3-b99f-cb66cfc9e003-kube-api-access-cpcrp\") pod \"barbican-api-758b9f6874-2lcsg\" (UID: \"91ed7d1b-612b-46e3-b99f-cb66cfc9e003\") " pod="openstack/barbican-api-758b9f6874-2lcsg" Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.141602 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91ed7d1b-612b-46e3-b99f-cb66cfc9e003-combined-ca-bundle\") pod \"barbican-api-758b9f6874-2lcsg\" (UID: \"91ed7d1b-612b-46e3-b99f-cb66cfc9e003\") " pod="openstack/barbican-api-758b9f6874-2lcsg" Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.144447 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91ed7d1b-612b-46e3-b99f-cb66cfc9e003-config-data-custom\") pod \"barbican-api-758b9f6874-2lcsg\" (UID: \"91ed7d1b-612b-46e3-b99f-cb66cfc9e003\") " pod="openstack/barbican-api-758b9f6874-2lcsg" Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.145214 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91ed7d1b-612b-46e3-b99f-cb66cfc9e003-logs\") pod \"barbican-api-758b9f6874-2lcsg\" (UID: \"91ed7d1b-612b-46e3-b99f-cb66cfc9e003\") " pod="openstack/barbican-api-758b9f6874-2lcsg" Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.148553 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91ed7d1b-612b-46e3-b99f-cb66cfc9e003-combined-ca-bundle\") pod \"barbican-api-758b9f6874-2lcsg\" (UID: \"91ed7d1b-612b-46e3-b99f-cb66cfc9e003\") " pod="openstack/barbican-api-758b9f6874-2lcsg" Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.148700 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91ed7d1b-612b-46e3-b99f-cb66cfc9e003-config-data\") pod \"barbican-api-758b9f6874-2lcsg\" (UID: \"91ed7d1b-612b-46e3-b99f-cb66cfc9e003\") " pod="openstack/barbican-api-758b9f6874-2lcsg" Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.161866 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpcrp\" (UniqueName: \"kubernetes.io/projected/91ed7d1b-612b-46e3-b99f-cb66cfc9e003-kube-api-access-cpcrp\") pod \"barbican-api-758b9f6874-2lcsg\" (UID: \"91ed7d1b-612b-46e3-b99f-cb66cfc9e003\") " pod="openstack/barbican-api-758b9f6874-2lcsg" Dec 05 12:10:34 crc kubenswrapper[4763]: 
I1205 12:10:34.243948 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c0bb3232-022f-4f19-89e7-374ae90d4dd3-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-ssc9w\" (UID: \"c0bb3232-022f-4f19-89e7-374ae90d4dd3\") " pod="openstack/dnsmasq-dns-848cf88cfc-ssc9w" Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.244043 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0bb3232-022f-4f19-89e7-374ae90d4dd3-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-ssc9w\" (UID: \"c0bb3232-022f-4f19-89e7-374ae90d4dd3\") " pod="openstack/dnsmasq-dns-848cf88cfc-ssc9w" Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.244114 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0bb3232-022f-4f19-89e7-374ae90d4dd3-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-ssc9w\" (UID: \"c0bb3232-022f-4f19-89e7-374ae90d4dd3\") " pod="openstack/dnsmasq-dns-848cf88cfc-ssc9w" Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.244138 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnvgv\" (UniqueName: \"kubernetes.io/projected/c0bb3232-022f-4f19-89e7-374ae90d4dd3-kube-api-access-mnvgv\") pod \"dnsmasq-dns-848cf88cfc-ssc9w\" (UID: \"c0bb3232-022f-4f19-89e7-374ae90d4dd3\") " pod="openstack/dnsmasq-dns-848cf88cfc-ssc9w" Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.244190 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0bb3232-022f-4f19-89e7-374ae90d4dd3-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-ssc9w\" (UID: \"c0bb3232-022f-4f19-89e7-374ae90d4dd3\") " pod="openstack/dnsmasq-dns-848cf88cfc-ssc9w" Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.244305 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0bb3232-022f-4f19-89e7-374ae90d4dd3-config\") pod \"dnsmasq-dns-848cf88cfc-ssc9w\" (UID: \"c0bb3232-022f-4f19-89e7-374ae90d4dd3\") " pod="openstack/dnsmasq-dns-848cf88cfc-ssc9w" Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.245199 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0bb3232-022f-4f19-89e7-374ae90d4dd3-config\") pod \"dnsmasq-dns-848cf88cfc-ssc9w\" (UID: \"c0bb3232-022f-4f19-89e7-374ae90d4dd3\") " pod="openstack/dnsmasq-dns-848cf88cfc-ssc9w" Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.245748 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c0bb3232-022f-4f19-89e7-374ae90d4dd3-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-ssc9w\" (UID: \"c0bb3232-022f-4f19-89e7-374ae90d4dd3\") " pod="openstack/dnsmasq-dns-848cf88cfc-ssc9w" Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.246629 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0bb3232-022f-4f19-89e7-374ae90d4dd3-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-ssc9w\" (UID: \"c0bb3232-022f-4f19-89e7-374ae90d4dd3\") " pod="openstack/dnsmasq-dns-848cf88cfc-ssc9w" Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.249630 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0bb3232-022f-4f19-89e7-374ae90d4dd3-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-ssc9w\" (UID: \"c0bb3232-022f-4f19-89e7-374ae90d4dd3\") " pod="openstack/dnsmasq-dns-848cf88cfc-ssc9w" Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.253282 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0bb3232-022f-4f19-89e7-374ae90d4dd3-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-ssc9w\" (UID: \"c0bb3232-022f-4f19-89e7-374ae90d4dd3\") " pod="openstack/dnsmasq-dns-848cf88cfc-ssc9w" Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.263291 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnvgv\" (UniqueName: \"kubernetes.io/projected/c0bb3232-022f-4f19-89e7-374ae90d4dd3-kube-api-access-mnvgv\") pod \"dnsmasq-dns-848cf88cfc-ssc9w\" (UID: \"c0bb3232-022f-4f19-89e7-374ae90d4dd3\") " pod="openstack/dnsmasq-dns-848cf88cfc-ssc9w" Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.397088 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-758b9f6874-2lcsg" Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.399021 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.409812 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-ssc9w" Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.442885 4763 generic.go:334] "Generic (PLEG): container finished" podID="c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2" containerID="ac9a0ca04d43a2b4e83535243429f1e5e917c180a3768e368cbb113ac1b89f13" exitCode=0 Dec 05 12:10:34 crc kubenswrapper[4763]: I1205 12:10:34.442963 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b6c948c7-7tdkh" event={"ID":"c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2","Type":"ContainerDied","Data":"ac9a0ca04d43a2b4e83535243429f1e5e917c180a3768e368cbb113ac1b89f13"} Dec 05 12:10:35 crc kubenswrapper[4763]: I1205 12:10:35.217125 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Dec 05 12:10:35 crc kubenswrapper[4763]: I1205 12:10:35.218227 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Dec 05 12:10:35 crc kubenswrapper[4763]: I1205 12:10:35.228009 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Dec 05 12:10:35 crc kubenswrapper[4763]: I1205 12:10:35.293047 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Dec 05 12:10:35 crc kubenswrapper[4763]: I1205 12:10:35.336947 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Dec 05 12:10:35 crc kubenswrapper[4763]: I1205 12:10:35.337005 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Dec 05 12:10:35 crc kubenswrapper[4763]: I1205 12:10:35.339832 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Dec 05 12:10:35 crc kubenswrapper[4763]: I1205 12:10:35.385612 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Dec 05 12:10:35 crc kubenswrapper[4763]: I1205 12:10:35.455560 
4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Dec 05 12:10:35 crc kubenswrapper[4763]: I1205 12:10:35.459893 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Dec 05 12:10:35 crc kubenswrapper[4763]: I1205 12:10:35.500420 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Dec 05 12:10:35 crc kubenswrapper[4763]: I1205 12:10:35.501504 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Dec 05 12:10:35 crc kubenswrapper[4763]: I1205 12:10:35.638442 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 12:10:35 crc kubenswrapper[4763]: I1205 12:10:35.638487 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 12:10:35 crc kubenswrapper[4763]: I1205 12:10:35.687378 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 12:10:35 crc kubenswrapper[4763]: I1205 12:10:35.717984 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 12:10:35 crc kubenswrapper[4763]: I1205 12:10:35.897939 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b6c948c7-7tdkh" podUID="c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.159:5353: connect: connection refused" Dec 05 12:10:36 crc kubenswrapper[4763]: I1205 12:10:36.406067 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 12:10:36 crc kubenswrapper[4763]: I1205 12:10:36.406128 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 12:10:36 crc kubenswrapper[4763]: I1205 12:10:36.473010 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 12:10:36 crc kubenswrapper[4763]: I1205 12:10:36.473085 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 12:10:36 crc kubenswrapper[4763]: I1205 12:10:36.498538 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 12:10:36 crc kubenswrapper[4763]: I1205 12:10:36.498880 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 12:10:36 crc kubenswrapper[4763]: I1205 12:10:36.536009 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 12:10:36 crc kubenswrapper[4763]: I1205 12:10:36.894670 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7f846788f8-4gznp"] Dec 05 12:10:36 crc kubenswrapper[4763]: I1205 12:10:36.896424 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7f846788f8-4gznp" Dec 05 12:10:36 crc kubenswrapper[4763]: I1205 12:10:36.902165 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 05 12:10:36 crc kubenswrapper[4763]: I1205 12:10:36.902497 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 05 12:10:36 crc kubenswrapper[4763]: I1205 12:10:36.997060 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7f846788f8-4gznp"] Dec 05 12:10:37 crc kubenswrapper[4763]: I1205 12:10:37.043096 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/80c384c8-1d13-47af-b978-f724e40e99af-public-tls-certs\") pod \"barbican-api-7f846788f8-4gznp\" (UID: \"80c384c8-1d13-47af-b978-f724e40e99af\") " pod="openstack/barbican-api-7f846788f8-4gznp" Dec 05 12:10:37 crc kubenswrapper[4763]: I1205 12:10:37.043195 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80c384c8-1d13-47af-b978-f724e40e99af-logs\") pod \"barbican-api-7f846788f8-4gznp\" (UID: \"80c384c8-1d13-47af-b978-f724e40e99af\") " pod="openstack/barbican-api-7f846788f8-4gznp" Dec 05 12:10:37 crc kubenswrapper[4763]: I1205 12:10:37.043226 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqp9m\" (UniqueName: \"kubernetes.io/projected/80c384c8-1d13-47af-b978-f724e40e99af-kube-api-access-jqp9m\") pod \"barbican-api-7f846788f8-4gznp\" (UID: \"80c384c8-1d13-47af-b978-f724e40e99af\") " pod="openstack/barbican-api-7f846788f8-4gznp" Dec 05 12:10:37 crc kubenswrapper[4763]: I1205 12:10:37.043254 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/80c384c8-1d13-47af-b978-f724e40e99af-config-data-custom\") pod \"barbican-api-7f846788f8-4gznp\" (UID: \"80c384c8-1d13-47af-b978-f724e40e99af\") " pod="openstack/barbican-api-7f846788f8-4gznp" Dec 05 12:10:37 crc kubenswrapper[4763]: I1205 12:10:37.043270 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/80c384c8-1d13-47af-b978-f724e40e99af-internal-tls-certs\") pod \"barbican-api-7f846788f8-4gznp\" (UID: \"80c384c8-1d13-47af-b978-f724e40e99af\") " pod="openstack/barbican-api-7f846788f8-4gznp" Dec 05 12:10:37 crc kubenswrapper[4763]: I1205 12:10:37.043340 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80c384c8-1d13-47af-b978-f724e40e99af-combined-ca-bundle\") pod \"barbican-api-7f846788f8-4gznp\" (UID: \"80c384c8-1d13-47af-b978-f724e40e99af\") " pod="openstack/barbican-api-7f846788f8-4gznp" Dec 05 12:10:37 crc kubenswrapper[4763]: I1205 12:10:37.043361 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80c384c8-1d13-47af-b978-f724e40e99af-config-data\") pod \"barbican-api-7f846788f8-4gznp\" (UID: \"80c384c8-1d13-47af-b978-f724e40e99af\") " pod="openstack/barbican-api-7f846788f8-4gznp" Dec 05 12:10:37 crc kubenswrapper[4763]: I1205 12:10:37.144551 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80c384c8-1d13-47af-b978-f724e40e99af-logs\") pod \"barbican-api-7f846788f8-4gznp\" (UID: \"80c384c8-1d13-47af-b978-f724e40e99af\") " pod="openstack/barbican-api-7f846788f8-4gznp" Dec 05 12:10:37 crc kubenswrapper[4763]: I1205 12:10:37.144607 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqp9m\" (UniqueName: \"kubernetes.io/projected/80c384c8-1d13-47af-b978-f724e40e99af-kube-api-access-jqp9m\") pod \"barbican-api-7f846788f8-4gznp\" (UID: \"80c384c8-1d13-47af-b978-f724e40e99af\") " pod="openstack/barbican-api-7f846788f8-4gznp" Dec 05 12:10:37 crc kubenswrapper[4763]: I1205 12:10:37.144625 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/80c384c8-1d13-47af-b978-f724e40e99af-internal-tls-certs\") pod \"barbican-api-7f846788f8-4gznp\" (UID: \"80c384c8-1d13-47af-b978-f724e40e99af\") " pod="openstack/barbican-api-7f846788f8-4gznp" Dec 05 12:10:37 crc kubenswrapper[4763]: I1205 12:10:37.144644 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/80c384c8-1d13-47af-b978-f724e40e99af-config-data-custom\") pod \"barbican-api-7f846788f8-4gznp\" (UID: \"80c384c8-1d13-47af-b978-f724e40e99af\") " pod="openstack/barbican-api-7f846788f8-4gznp" Dec 05 12:10:37 crc kubenswrapper[4763]: I1205 12:10:37.144712 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80c384c8-1d13-47af-b978-f724e40e99af-combined-ca-bundle\") pod \"barbican-api-7f846788f8-4gznp\" (UID: \"80c384c8-1d13-47af-b978-f724e40e99af\") " pod="openstack/barbican-api-7f846788f8-4gznp" Dec 05 12:10:37 crc kubenswrapper[4763]: I1205 12:10:37.144735 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80c384c8-1d13-47af-b978-f724e40e99af-config-data\") pod \"barbican-api-7f846788f8-4gznp\" (UID: \"80c384c8-1d13-47af-b978-f724e40e99af\") " pod="openstack/barbican-api-7f846788f8-4gznp" Dec 05 12:10:37 crc kubenswrapper[4763]: I1205 12:10:37.144792 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/80c384c8-1d13-47af-b978-f724e40e99af-public-tls-certs\") pod \"barbican-api-7f846788f8-4gznp\" (UID: \"80c384c8-1d13-47af-b978-f724e40e99af\") " pod="openstack/barbican-api-7f846788f8-4gznp" Dec 05 12:10:37 crc kubenswrapper[4763]: I1205 12:10:37.153454 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/80c384c8-1d13-47af-b978-f724e40e99af-internal-tls-certs\") pod \"barbican-api-7f846788f8-4gznp\" (UID: \"80c384c8-1d13-47af-b978-f724e40e99af\") " pod="openstack/barbican-api-7f846788f8-4gznp" Dec 05 12:10:37 crc kubenswrapper[4763]: I1205 12:10:37.153749 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80c384c8-1d13-47af-b978-f724e40e99af-logs\") pod \"barbican-api-7f846788f8-4gznp\" (UID: \"80c384c8-1d13-47af-b978-f724e40e99af\") " pod="openstack/barbican-api-7f846788f8-4gznp" Dec 05 12:10:37 crc kubenswrapper[4763]: I1205 12:10:37.171392 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/80c384c8-1d13-47af-b978-f724e40e99af-combined-ca-bundle\") pod \"barbican-api-7f846788f8-4gznp\" (UID: \"80c384c8-1d13-47af-b978-f724e40e99af\") " pod="openstack/barbican-api-7f846788f8-4gznp" Dec 05 12:10:37 crc kubenswrapper[4763]: I1205 12:10:37.175453 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/80c384c8-1d13-47af-b978-f724e40e99af-public-tls-certs\") pod \"barbican-api-7f846788f8-4gznp\" (UID: \"80c384c8-1d13-47af-b978-f724e40e99af\") " pod="openstack/barbican-api-7f846788f8-4gznp" Dec 05 12:10:37 crc kubenswrapper[4763]: I1205 12:10:37.177431 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/80c384c8-1d13-47af-b978-f724e40e99af-config-data-custom\") pod \"barbican-api-7f846788f8-4gznp\" (UID: \"80c384c8-1d13-47af-b978-f724e40e99af\") " pod="openstack/barbican-api-7f846788f8-4gznp" Dec 05 12:10:37 crc kubenswrapper[4763]: I1205 12:10:37.177709 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80c384c8-1d13-47af-b978-f724e40e99af-config-data\") pod \"barbican-api-7f846788f8-4gznp\" (UID: \"80c384c8-1d13-47af-b978-f724e40e99af\") " pod="openstack/barbican-api-7f846788f8-4gznp" Dec 05 12:10:37 crc kubenswrapper[4763]: I1205 12:10:37.187594 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqp9m\" (UniqueName: \"kubernetes.io/projected/80c384c8-1d13-47af-b978-f724e40e99af-kube-api-access-jqp9m\") pod \"barbican-api-7f846788f8-4gznp\" (UID: \"80c384c8-1d13-47af-b978-f724e40e99af\") " pod="openstack/barbican-api-7f846788f8-4gznp" Dec 05 12:10:37 crc kubenswrapper[4763]: I1205 12:10:37.294787 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7f846788f8-4gznp" Dec 05 12:10:37 crc kubenswrapper[4763]: I1205 12:10:37.489033 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 12:10:37 crc kubenswrapper[4763]: I1205 12:10:37.694392 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7dc58bc884-khdbv" podUID="4ba0cbf0-3e4e-4cb0-82b0-179d11937330" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused" Dec 05 12:10:38 crc kubenswrapper[4763]: I1205 12:10:38.498394 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 12:10:38 crc kubenswrapper[4763]: I1205 12:10:38.498682 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 12:10:38 crc kubenswrapper[4763]: I1205 12:10:38.498394 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 12:10:38 crc kubenswrapper[4763]: I1205 12:10:38.930596 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 05 12:10:38 crc kubenswrapper[4763]: I1205 12:10:38.934609 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 05 12:10:39 crc kubenswrapper[4763]: I1205 12:10:39.418898 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 05 12:10:39 crc kubenswrapper[4763]: I1205 12:10:39.432088 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 05 12:10:39 crc kubenswrapper[4763]: I1205 12:10:39.924223 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-77dcd5c496-hs7bj" Dec 05 12:10:40 crc kubenswrapper[4763]: I1205 12:10:40.635800 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Dec 05 12:10:40 crc kubenswrapper[4763]: I1205 12:10:40.636030 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="aff85a7c-5f5f-4c46-97c7-083ff89eb14c" containerName="watcher-api-log" containerID="cri-o://680b4f4c2767a73f635d3acd2d356b5fe51f3b2d53b72a75bb6a529a2e2bdc7f" gracePeriod=30 Dec 05 12:10:40 crc kubenswrapper[4763]: I1205 12:10:40.636161 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="aff85a7c-5f5f-4c46-97c7-083ff89eb14c" containerName="watcher-api" containerID="cri-o://94f830dd6b75d75c984906bebde78ae47e3ba92d38471b016cc49f49593b7848" gracePeriod=30 Dec 05 12:10:40 crc kubenswrapper[4763]: I1205 12:10:40.893917 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b6c948c7-7tdkh" podUID="c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.159:5353: connect: connection refused" Dec 05 12:10:41 crc kubenswrapper[4763]: I1205 12:10:41.805261 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-77dcd5c496-hs7bj" Dec 05 12:10:41 crc kubenswrapper[4763]: I1205 12:10:41.884947 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7dc58bc884-khdbv"] Dec 05 12:10:41 crc kubenswrapper[4763]: I1205 12:10:41.885213 4763 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7dc58bc884-khdbv" podUID="4ba0cbf0-3e4e-4cb0-82b0-179d11937330" containerName="horizon-log" containerID="cri-o://0f9fce0cd10290e14fb000bbba92b5ee8865c914d6ce43d7be912c918930bcb6" gracePeriod=30 Dec 05 12:10:41 crc kubenswrapper[4763]: I1205 12:10:41.885839 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7dc58bc884-khdbv" podUID="4ba0cbf0-3e4e-4cb0-82b0-179d11937330" containerName="horizon" containerID="cri-o://db7f9809d3f9be8f6f6d0b3eccb2f6c7b682e2f5b570dcc420bac86ce64e99f5" gracePeriod=30 Dec 05 12:10:42 crc kubenswrapper[4763]: I1205 12:10:42.549476 4763 generic.go:334] "Generic (PLEG): container finished" podID="aff85a7c-5f5f-4c46-97c7-083ff89eb14c" containerID="680b4f4c2767a73f635d3acd2d356b5fe51f3b2d53b72a75bb6a529a2e2bdc7f" exitCode=143 Dec 05 12:10:42 crc kubenswrapper[4763]: I1205 12:10:42.549487 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"aff85a7c-5f5f-4c46-97c7-083ff89eb14c","Type":"ContainerDied","Data":"680b4f4c2767a73f635d3acd2d356b5fe51f3b2d53b72a75bb6a529a2e2bdc7f"} Dec 05 12:10:42 crc kubenswrapper[4763]: I1205 12:10:42.560644 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b6c948c7-7tdkh" event={"ID":"c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2","Type":"ContainerDied","Data":"45c8940fcd5d1101e4828a4aae4b850cdf67eab7999ef457e7171fc8a020af40"} Dec 05 12:10:42 crc kubenswrapper[4763]: I1205 12:10:42.560686 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45c8940fcd5d1101e4828a4aae4b850cdf67eab7999ef457e7171fc8a020af40" Dec 05 12:10:42 crc kubenswrapper[4763]: I1205 12:10:42.579564 4763 generic.go:334] "Generic (PLEG): container finished" podID="4ba0cbf0-3e4e-4cb0-82b0-179d11937330" containerID="db7f9809d3f9be8f6f6d0b3eccb2f6c7b682e2f5b570dcc420bac86ce64e99f5" exitCode=0 Dec 05 12:10:42 crc kubenswrapper[4763]: I1205 12:10:42.580060 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7dc58bc884-khdbv" event={"ID":"4ba0cbf0-3e4e-4cb0-82b0-179d11937330","Type":"ContainerDied","Data":"db7f9809d3f9be8f6f6d0b3eccb2f6c7b682e2f5b570dcc420bac86ce64e99f5"} Dec 05 12:10:42 crc kubenswrapper[4763]: I1205 12:10:42.858620 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b6c948c7-7tdkh" Dec 05 12:10:43 crc kubenswrapper[4763]: I1205 12:10:43.004436 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2-config\") pod \"c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2\" (UID: \"c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2\") " Dec 05 12:10:43 crc kubenswrapper[4763]: I1205 12:10:43.004498 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2-ovsdbserver-nb\") pod \"c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2\" (UID: \"c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2\") " Dec 05 12:10:43 crc kubenswrapper[4763]: I1205 12:10:43.004693 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2-ovsdbserver-sb\") pod \"c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2\" (UID: \"c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2\") " Dec 05 12:10:43 crc kubenswrapper[4763]: I1205 12:10:43.004733 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2-dns-svc\") pod \"c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2\" (UID: \"c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2\") " Dec 05 12:10:43 crc kubenswrapper[4763]: I1205 12:10:43.004772 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mml2z\" (UniqueName: \"kubernetes.io/projected/c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2-kube-api-access-mml2z\") pod \"c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2\" (UID: \"c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2\") " Dec 05 12:10:43 crc kubenswrapper[4763]: I1205 12:10:43.017426 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2-kube-api-access-mml2z" (OuterVolumeSpecName: "kube-api-access-mml2z") pod "c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2" (UID: "c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2"). InnerVolumeSpecName "kube-api-access-mml2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:10:43 crc kubenswrapper[4763]: I1205 12:10:43.104676 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2-config" (OuterVolumeSpecName: "config") pod "c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2" (UID: "c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:10:43 crc kubenswrapper[4763]: I1205 12:10:43.105542 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2" (UID: "c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:10:43 crc kubenswrapper[4763]: I1205 12:10:43.106266 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:43 crc kubenswrapper[4763]: I1205 12:10:43.106299 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mml2z\" (UniqueName: \"kubernetes.io/projected/c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2-kube-api-access-mml2z\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:43 crc kubenswrapper[4763]: I1205 12:10:43.106315 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2-config\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:43 crc kubenswrapper[4763]: I1205 12:10:43.117698 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2" (UID: "c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:10:43 crc kubenswrapper[4763]: I1205 12:10:43.129542 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2" (UID: "c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:10:43 crc kubenswrapper[4763]: I1205 12:10:43.208952 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:43 crc kubenswrapper[4763]: I1205 12:10:43.208986 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:43 crc kubenswrapper[4763]: E1205 12:10:43.217326 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="bb8f26ac-3463-4d42-936d-420cfdbd81eb" Dec 05 12:10:43 crc kubenswrapper[4763]: I1205 12:10:43.305594 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-758b9f6874-2lcsg"] Dec 05 12:10:43 crc kubenswrapper[4763]: I1205 12:10:43.333817 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-ssc9w"] Dec 05 12:10:43 crc kubenswrapper[4763]: W1205 12:10:43.341018 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0bb3232_022f_4f19_89e7_374ae90d4dd3.slice/crio-c8dd3b9278d68e7e36e46f19835d9da557919f85e6e0bec979c36bbc5f19758c WatchSource:0}: Error finding container c8dd3b9278d68e7e36e46f19835d9da557919f85e6e0bec979c36bbc5f19758c: Status 404 returned error can't find the container with id c8dd3b9278d68e7e36e46f19835d9da557919f85e6e0bec979c36bbc5f19758c Dec 05 12:10:43 crc kubenswrapper[4763]: I1205 12:10:43.341642 4763 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-798d46d59c-2cbs9"] Dec 05 12:10:43 crc kubenswrapper[4763]: W1205 12:10:43.347695 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3337c2a_1707_4b41_89e9_563b51024eed.slice/crio-0c0890a5e49de23ee3ed2bf0df58f726c4722d8fc901c7fb6b57d036c4232830 WatchSource:0}: Error finding container 0c0890a5e49de23ee3ed2bf0df58f726c4722d8fc901c7fb6b57d036c4232830: Status 404 returned error can't find the container with id 0c0890a5e49de23ee3ed2bf0df58f726c4722d8fc901c7fb6b57d036c4232830 Dec 05 12:10:43 crc kubenswrapper[4763]: I1205 12:10:43.350888 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5446b6d8dc-p784q"] Dec 05 12:10:43 crc kubenswrapper[4763]: W1205 12:10:43.351925 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd23af5c6_295f_4c65_90a1_02e66a41f325.slice/crio-190630eb9128f04f7cb98572686b0a21d163542081c94dd9f924a766b1caee5c WatchSource:0}: Error finding container 190630eb9128f04f7cb98572686b0a21d163542081c94dd9f924a766b1caee5c: Status 404 returned error can't find the container with id 190630eb9128f04f7cb98572686b0a21d163542081c94dd9f924a766b1caee5c Dec 05 12:10:43 crc kubenswrapper[4763]: I1205 12:10:43.431577 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-94d865894-tqt5m"] Dec 05 12:10:43 crc kubenswrapper[4763]: I1205 12:10:43.442843 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7f846788f8-4gznp"] Dec 05 12:10:43 crc kubenswrapper[4763]: I1205 12:10:43.595534 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb8f26ac-3463-4d42-936d-420cfdbd81eb","Type":"ContainerStarted","Data":"2708c02aa6c94c8ebbc4f34ea05d11156b8b1f234aaabb64b75566bbd315c4f8"} Dec 05 12:10:43 crc kubenswrapper[4763]: I1205 12:10:43.595651 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 12:10:43 crc kubenswrapper[4763]: I1205 12:10:43.595594 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bb8f26ac-3463-4d42-936d-420cfdbd81eb" containerName="sg-core" containerID="cri-o://cb18a1decab54a7f41cb3430a7386394631bc0be22d0f9fada331709b8bf6e27" gracePeriod=30 Dec 05 12:10:43 crc kubenswrapper[4763]: I1205 12:10:43.595559 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bb8f26ac-3463-4d42-936d-420cfdbd81eb" containerName="ceilometer-notification-agent" containerID="cri-o://6311e623b93230006c2447f0f3b5835a403e0ccf7dfd01be4e2487d59d728444" gracePeriod=30 Dec 05 12:10:43 crc kubenswrapper[4763]: I1205 12:10:43.595604 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bb8f26ac-3463-4d42-936d-420cfdbd81eb" containerName="proxy-httpd" containerID="cri-o://2708c02aa6c94c8ebbc4f34ea05d11156b8b1f234aaabb64b75566bbd315c4f8" gracePeriod=30 Dec 05 12:10:43 crc kubenswrapper[4763]: I1205 12:10:43.602618 4763 generic.go:334] "Generic (PLEG): container finished" podID="10d49525-ec1b-4c52-8221-f3f0bb57e574" containerID="9eb76be718c1b0dadda12f48e28116e1573a11c11d9ec171fe421d6538f25f86" exitCode=0 Dec 05 12:10:43 crc kubenswrapper[4763]: I1205 12:10:43.602901 4763 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/cinder-db-sync-cprq9" event={"ID":"10d49525-ec1b-4c52-8221-f3f0bb57e574","Type":"ContainerDied","Data":"9eb76be718c1b0dadda12f48e28116e1573a11c11d9ec171fe421d6538f25f86"} Dec 05 12:10:43 crc kubenswrapper[4763]: I1205 12:10:43.609002 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-798d46d59c-2cbs9" event={"ID":"f3337c2a-1707-4b41-89e9-563b51024eed","Type":"ContainerStarted","Data":"0c0890a5e49de23ee3ed2bf0df58f726c4722d8fc901c7fb6b57d036c4232830"} Dec 05 12:10:43 crc kubenswrapper[4763]: I1205 12:10:43.610141 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5446b6d8dc-p784q" event={"ID":"d23af5c6-295f-4c65-90a1-02e66a41f325","Type":"ContainerStarted","Data":"190630eb9128f04f7cb98572686b0a21d163542081c94dd9f924a766b1caee5c"} Dec 05 12:10:43 crc kubenswrapper[4763]: I1205 12:10:43.618711 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-758b9f6874-2lcsg" event={"ID":"91ed7d1b-612b-46e3-b99f-cb66cfc9e003","Type":"ContainerStarted","Data":"6e8f58885bc169f9772af5c4bc947eef4345bfb60959ae68f4f25616b6735a1d"} Dec 05 12:10:43 crc kubenswrapper[4763]: I1205 12:10:43.618781 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-758b9f6874-2lcsg" event={"ID":"91ed7d1b-612b-46e3-b99f-cb66cfc9e003","Type":"ContainerStarted","Data":"bd2a91405b594db5d486f8d5427ee117b21192361eff7d5d0e1d0e232255dae8"} Dec 05 12:10:43 crc kubenswrapper[4763]: I1205 12:10:43.623254 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f846788f8-4gznp" event={"ID":"80c384c8-1d13-47af-b978-f724e40e99af","Type":"ContainerStarted","Data":"3a795754af0a7db9f8f8024d0b1ade4444add85986e88ce573d975d537da0d74"} Dec 05 12:10:43 crc kubenswrapper[4763]: I1205 12:10:43.625222 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-94d865894-tqt5m" event={"ID":"76cf4acb-9763-4dac-9a2f-eba4a98314f0","Type":"ContainerStarted","Data":"55da1925dc9581e7f9b1409d2f78f8d2e726e6fec2fe3aec6c06e406f7f08558"} Dec 05 12:10:43 crc kubenswrapper[4763]: I1205 12:10:43.631742 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b6c948c7-7tdkh" Dec 05 12:10:43 crc kubenswrapper[4763]: I1205 12:10:43.634683 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-ssc9w" event={"ID":"c0bb3232-022f-4f19-89e7-374ae90d4dd3","Type":"ContainerStarted","Data":"c8dd3b9278d68e7e36e46f19835d9da557919f85e6e0bec979c36bbc5f19758c"} Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.194613 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.330804 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aff85a7c-5f5f-4c46-97c7-083ff89eb14c-combined-ca-bundle\") pod \"aff85a7c-5f5f-4c46-97c7-083ff89eb14c\" (UID: \"aff85a7c-5f5f-4c46-97c7-083ff89eb14c\") " Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.330910 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/aff85a7c-5f5f-4c46-97c7-083ff89eb14c-custom-prometheus-ca\") pod \"aff85a7c-5f5f-4c46-97c7-083ff89eb14c\" (UID: \"aff85a7c-5f5f-4c46-97c7-083ff89eb14c\") " Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.330984 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aff85a7c-5f5f-4c46-97c7-083ff89eb14c-logs\") pod \"aff85a7c-5f5f-4c46-97c7-083ff89eb14c\" (UID: \"aff85a7c-5f5f-4c46-97c7-083ff89eb14c\") " Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.331023 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76thp\" (UniqueName: \"kubernetes.io/projected/aff85a7c-5f5f-4c46-97c7-083ff89eb14c-kube-api-access-76thp\") pod \"aff85a7c-5f5f-4c46-97c7-083ff89eb14c\" (UID: \"aff85a7c-5f5f-4c46-97c7-083ff89eb14c\") " Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.331090 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aff85a7c-5f5f-4c46-97c7-083ff89eb14c-config-data\") pod \"aff85a7c-5f5f-4c46-97c7-083ff89eb14c\" (UID: \"aff85a7c-5f5f-4c46-97c7-083ff89eb14c\") " Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.332220 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aff85a7c-5f5f-4c46-97c7-083ff89eb14c-logs" (OuterVolumeSpecName: "logs") pod "aff85a7c-5f5f-4c46-97c7-083ff89eb14c" (UID: "aff85a7c-5f5f-4c46-97c7-083ff89eb14c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.336803 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aff85a7c-5f5f-4c46-97c7-083ff89eb14c-kube-api-access-76thp" (OuterVolumeSpecName: "kube-api-access-76thp") pod "aff85a7c-5f5f-4c46-97c7-083ff89eb14c" (UID: "aff85a7c-5f5f-4c46-97c7-083ff89eb14c"). InnerVolumeSpecName "kube-api-access-76thp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.375975 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aff85a7c-5f5f-4c46-97c7-083ff89eb14c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aff85a7c-5f5f-4c46-97c7-083ff89eb14c" (UID: "aff85a7c-5f5f-4c46-97c7-083ff89eb14c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.379306 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aff85a7c-5f5f-4c46-97c7-083ff89eb14c-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "aff85a7c-5f5f-4c46-97c7-083ff89eb14c" (UID: "aff85a7c-5f5f-4c46-97c7-083ff89eb14c"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.404913 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aff85a7c-5f5f-4c46-97c7-083ff89eb14c-config-data" (OuterVolumeSpecName: "config-data") pod "aff85a7c-5f5f-4c46-97c7-083ff89eb14c" (UID: "aff85a7c-5f5f-4c46-97c7-083ff89eb14c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.433322 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aff85a7c-5f5f-4c46-97c7-083ff89eb14c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.433369 4763 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/aff85a7c-5f5f-4c46-97c7-083ff89eb14c-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.433382 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aff85a7c-5f5f-4c46-97c7-083ff89eb14c-logs\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.433393 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76thp\" (UniqueName: \"kubernetes.io/projected/aff85a7c-5f5f-4c46-97c7-083ff89eb14c-kube-api-access-76thp\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.433407 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aff85a7c-5f5f-4c46-97c7-083ff89eb14c-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.644712 4763 generic.go:334] "Generic (PLEG): container finished" podID="f3337c2a-1707-4b41-89e9-563b51024eed" containerID="09c8a182a6da642916653c1fc6883d8217bfa23bb44563453c43788ae8363682" exitCode=0 Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.644811 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-798d46d59c-2cbs9" event={"ID":"f3337c2a-1707-4b41-89e9-563b51024eed","Type":"ContainerDied","Data":"09c8a182a6da642916653c1fc6883d8217bfa23bb44563453c43788ae8363682"} Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.648248 4763 generic.go:334] "Generic (PLEG): container finished" podID="bb8f26ac-3463-4d42-936d-420cfdbd81eb" containerID="2708c02aa6c94c8ebbc4f34ea05d11156b8b1f234aaabb64b75566bbd315c4f8" exitCode=0 Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.648282 4763 generic.go:334] "Generic (PLEG): container finished" podID="bb8f26ac-3463-4d42-936d-420cfdbd81eb" containerID="cb18a1decab54a7f41cb3430a7386394631bc0be22d0f9fada331709b8bf6e27" exitCode=2 Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.648288 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb8f26ac-3463-4d42-936d-420cfdbd81eb","Type":"ContainerDied","Data":"2708c02aa6c94c8ebbc4f34ea05d11156b8b1f234aaabb64b75566bbd315c4f8"} Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.648336 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb8f26ac-3463-4d42-936d-420cfdbd81eb","Type":"ContainerDied","Data":"cb18a1decab54a7f41cb3430a7386394631bc0be22d0f9fada331709b8bf6e27"} Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.650511 
4763 generic.go:334] "Generic (PLEG): container finished" podID="aff85a7c-5f5f-4c46-97c7-083ff89eb14c" containerID="94f830dd6b75d75c984906bebde78ae47e3ba92d38471b016cc49f49593b7848" exitCode=0 Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.650554 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.650585 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"aff85a7c-5f5f-4c46-97c7-083ff89eb14c","Type":"ContainerDied","Data":"94f830dd6b75d75c984906bebde78ae47e3ba92d38471b016cc49f49593b7848"} Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.650620 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"aff85a7c-5f5f-4c46-97c7-083ff89eb14c","Type":"ContainerDied","Data":"639742d20b2102a81a29d21d5069354fa9336368aa0571d3e4943fa99c3c8ac1"} Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.650639 4763 scope.go:117] "RemoveContainer" containerID="94f830dd6b75d75c984906bebde78ae47e3ba92d38471b016cc49f49593b7848" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.654262 4763 generic.go:334] "Generic (PLEG): container finished" podID="c0bb3232-022f-4f19-89e7-374ae90d4dd3" containerID="29965db590b623057df986dbdec1f58d92427771f1b8c9748fe4d09396c955f1" exitCode=0 Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.654339 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-ssc9w" event={"ID":"c0bb3232-022f-4f19-89e7-374ae90d4dd3","Type":"ContainerDied","Data":"29965db590b623057df986dbdec1f58d92427771f1b8c9748fe4d09396c955f1"} Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.668465 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-758b9f6874-2lcsg" event={"ID":"91ed7d1b-612b-46e3-b99f-cb66cfc9e003","Type":"ContainerStarted","Data":"230cb231d868037183ea0f6909557c737de8a2c4c2c301798612f96ca1075a0e"} Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.669337 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-758b9f6874-2lcsg" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.669363 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-758b9f6874-2lcsg" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.672038 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7f846788f8-4gznp" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.672078 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7f846788f8-4gznp" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.674699 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f846788f8-4gznp" event={"ID":"80c384c8-1d13-47af-b978-f724e40e99af","Type":"ContainerStarted","Data":"408e37568e17a96f08aac1128913f883089d04f02cfdcd9f3c38843989e644ff"} Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.674875 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f846788f8-4gznp" event={"ID":"80c384c8-1d13-47af-b978-f724e40e99af","Type":"ContainerStarted","Data":"94270b80e14b04932e8985eb1df984c42c4b0efd432870b901b2582a14b17d79"} Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.715503 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7f846788f8-4gznp" 
podStartSLOduration=8.715484997 podStartE2EDuration="8.715484997s" podCreationTimestamp="2025-12-05 12:10:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:10:44.700843414 +0000 UTC m=+1329.193558147" watchObservedRunningTime="2025-12-05 12:10:44.715484997 +0000 UTC m=+1329.208199730" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.726516 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.747256 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.761950 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Dec 05 12:10:44 crc kubenswrapper[4763]: E1205 12:10:44.762476 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aff85a7c-5f5f-4c46-97c7-083ff89eb14c" containerName="watcher-api-log" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.762510 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="aff85a7c-5f5f-4c46-97c7-083ff89eb14c" containerName="watcher-api-log" Dec 05 12:10:44 crc kubenswrapper[4763]: E1205 12:10:44.762544 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2" containerName="dnsmasq-dns" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.762552 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2" containerName="dnsmasq-dns" Dec 05 12:10:44 crc kubenswrapper[4763]: E1205 12:10:44.762576 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aff85a7c-5f5f-4c46-97c7-083ff89eb14c" containerName="watcher-api" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.762583 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="aff85a7c-5f5f-4c46-97c7-083ff89eb14c" containerName="watcher-api" Dec 05 12:10:44 crc kubenswrapper[4763]: E1205 12:10:44.762603 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2" containerName="init" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.762620 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2" containerName="init" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.762847 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="aff85a7c-5f5f-4c46-97c7-083ff89eb14c" containerName="watcher-api-log" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.762874 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2" containerName="dnsmasq-dns" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.762888 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="aff85a7c-5f5f-4c46-97c7-083ff89eb14c" containerName="watcher-api" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.764296 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.768057 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-758b9f6874-2lcsg" podStartSLOduration=11.768016427 podStartE2EDuration="11.768016427s" podCreationTimestamp="2025-12-05 12:10:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:10:44.73638238 +0000 UTC m=+1329.229097103" watchObservedRunningTime="2025-12-05 12:10:44.768016427 +0000 UTC m=+1329.260731150" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.769450 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.769532 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.769638 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.822028 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.845299 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dftn\" (UniqueName: \"kubernetes.io/projected/37a4b06b-53bd-4f53-89b7-4d5a53554510-kube-api-access-8dftn\") pod \"watcher-api-0\" (UID: \"37a4b06b-53bd-4f53-89b7-4d5a53554510\") " pod="openstack/watcher-api-0" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.845737 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/37a4b06b-53bd-4f53-89b7-4d5a53554510-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"37a4b06b-53bd-4f53-89b7-4d5a53554510\") " pod="openstack/watcher-api-0" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.845963 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37a4b06b-53bd-4f53-89b7-4d5a53554510-public-tls-certs\") pod \"watcher-api-0\" (UID: \"37a4b06b-53bd-4f53-89b7-4d5a53554510\") " pod="openstack/watcher-api-0" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.846129 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37a4b06b-53bd-4f53-89b7-4d5a53554510-config-data\") pod \"watcher-api-0\" (UID: \"37a4b06b-53bd-4f53-89b7-4d5a53554510\") " pod="openstack/watcher-api-0" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.846267 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37a4b06b-53bd-4f53-89b7-4d5a53554510-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"37a4b06b-53bd-4f53-89b7-4d5a53554510\") " pod="openstack/watcher-api-0" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.846393 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37a4b06b-53bd-4f53-89b7-4d5a53554510-logs\") pod \"watcher-api-0\" (UID: \"37a4b06b-53bd-4f53-89b7-4d5a53554510\") " pod="openstack/watcher-api-0" Dec 
05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.846535 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37a4b06b-53bd-4f53-89b7-4d5a53554510-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"37a4b06b-53bd-4f53-89b7-4d5a53554510\") " pod="openstack/watcher-api-0" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.948545 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/37a4b06b-53bd-4f53-89b7-4d5a53554510-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"37a4b06b-53bd-4f53-89b7-4d5a53554510\") " pod="openstack/watcher-api-0" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.948633 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37a4b06b-53bd-4f53-89b7-4d5a53554510-public-tls-certs\") pod \"watcher-api-0\" (UID: \"37a4b06b-53bd-4f53-89b7-4d5a53554510\") " pod="openstack/watcher-api-0" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.948672 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37a4b06b-53bd-4f53-89b7-4d5a53554510-config-data\") pod \"watcher-api-0\" (UID: \"37a4b06b-53bd-4f53-89b7-4d5a53554510\") " pod="openstack/watcher-api-0" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.948710 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37a4b06b-53bd-4f53-89b7-4d5a53554510-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"37a4b06b-53bd-4f53-89b7-4d5a53554510\") " pod="openstack/watcher-api-0" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.948742 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37a4b06b-53bd-4f53-89b7-4d5a53554510-logs\") pod \"watcher-api-0\" (UID: \"37a4b06b-53bd-4f53-89b7-4d5a53554510\") " pod="openstack/watcher-api-0" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.948798 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37a4b06b-53bd-4f53-89b7-4d5a53554510-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"37a4b06b-53bd-4f53-89b7-4d5a53554510\") " pod="openstack/watcher-api-0" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.948890 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dftn\" (UniqueName: \"kubernetes.io/projected/37a4b06b-53bd-4f53-89b7-4d5a53554510-kube-api-access-8dftn\") pod \"watcher-api-0\" (UID: \"37a4b06b-53bd-4f53-89b7-4d5a53554510\") " pod="openstack/watcher-api-0" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.953051 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/37a4b06b-53bd-4f53-89b7-4d5a53554510-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"37a4b06b-53bd-4f53-89b7-4d5a53554510\") " pod="openstack/watcher-api-0" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.953243 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37a4b06b-53bd-4f53-89b7-4d5a53554510-logs\") pod \"watcher-api-0\" (UID: 
\"37a4b06b-53bd-4f53-89b7-4d5a53554510\") " pod="openstack/watcher-api-0" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.965081 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37a4b06b-53bd-4f53-89b7-4d5a53554510-config-data\") pod \"watcher-api-0\" (UID: \"37a4b06b-53bd-4f53-89b7-4d5a53554510\") " pod="openstack/watcher-api-0" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.966842 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37a4b06b-53bd-4f53-89b7-4d5a53554510-public-tls-certs\") pod \"watcher-api-0\" (UID: \"37a4b06b-53bd-4f53-89b7-4d5a53554510\") " pod="openstack/watcher-api-0" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.967184 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37a4b06b-53bd-4f53-89b7-4d5a53554510-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"37a4b06b-53bd-4f53-89b7-4d5a53554510\") " pod="openstack/watcher-api-0" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.967959 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37a4b06b-53bd-4f53-89b7-4d5a53554510-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"37a4b06b-53bd-4f53-89b7-4d5a53554510\") " pod="openstack/watcher-api-0" Dec 05 12:10:44 crc kubenswrapper[4763]: I1205 12:10:44.968590 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dftn\" (UniqueName: \"kubernetes.io/projected/37a4b06b-53bd-4f53-89b7-4d5a53554510-kube-api-access-8dftn\") pod \"watcher-api-0\" (UID: \"37a4b06b-53bd-4f53-89b7-4d5a53554510\") " pod="openstack/watcher-api-0" Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.083291 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.277854 4763 scope.go:117] "RemoveContainer" containerID="680b4f4c2767a73f635d3acd2d356b5fe51f3b2d53b72a75bb6a529a2e2bdc7f" Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.290916 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-798d46d59c-2cbs9" Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.297117 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-cprq9" Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.354010 4763 scope.go:117] "RemoveContainer" containerID="94f830dd6b75d75c984906bebde78ae47e3ba92d38471b016cc49f49593b7848" Dec 05 12:10:45 crc kubenswrapper[4763]: E1205 12:10:45.355717 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94f830dd6b75d75c984906bebde78ae47e3ba92d38471b016cc49f49593b7848\": container with ID starting with 94f830dd6b75d75c984906bebde78ae47e3ba92d38471b016cc49f49593b7848 not found: ID does not exist" containerID="94f830dd6b75d75c984906bebde78ae47e3ba92d38471b016cc49f49593b7848" Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.355774 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94f830dd6b75d75c984906bebde78ae47e3ba92d38471b016cc49f49593b7848"} err="failed to get container status \"94f830dd6b75d75c984906bebde78ae47e3ba92d38471b016cc49f49593b7848\": rpc error: code = NotFound desc = could not find container \"94f830dd6b75d75c984906bebde78ae47e3ba92d38471b016cc49f49593b7848\": container with ID starting with 94f830dd6b75d75c984906bebde78ae47e3ba92d38471b016cc49f49593b7848 not found: ID does not exist" Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.355803 4763 scope.go:117] "RemoveContainer" containerID="680b4f4c2767a73f635d3acd2d356b5fe51f3b2d53b72a75bb6a529a2e2bdc7f" Dec 05 12:10:45 crc kubenswrapper[4763]: E1205 12:10:45.356145 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"680b4f4c2767a73f635d3acd2d356b5fe51f3b2d53b72a75bb6a529a2e2bdc7f\": container with ID starting with 680b4f4c2767a73f635d3acd2d356b5fe51f3b2d53b72a75bb6a529a2e2bdc7f not found: ID does not exist" containerID="680b4f4c2767a73f635d3acd2d356b5fe51f3b2d53b72a75bb6a529a2e2bdc7f" Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.356205 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"680b4f4c2767a73f635d3acd2d356b5fe51f3b2d53b72a75bb6a529a2e2bdc7f"} err="failed to get container status \"680b4f4c2767a73f635d3acd2d356b5fe51f3b2d53b72a75bb6a529a2e2bdc7f\": rpc error: code = NotFound desc = could not find container \"680b4f4c2767a73f635d3acd2d356b5fe51f3b2d53b72a75bb6a529a2e2bdc7f\": container with ID starting with 680b4f4c2767a73f635d3acd2d356b5fe51f3b2d53b72a75bb6a529a2e2bdc7f not found: ID does not exist" Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.357743 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-492vf\" (UniqueName: \"kubernetes.io/projected/10d49525-ec1b-4c52-8221-f3f0bb57e574-kube-api-access-492vf\") pod \"10d49525-ec1b-4c52-8221-f3f0bb57e574\" (UID: \"10d49525-ec1b-4c52-8221-f3f0bb57e574\") " Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.357798 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3337c2a-1707-4b41-89e9-563b51024eed-config\") pod \"f3337c2a-1707-4b41-89e9-563b51024eed\" (UID: \"f3337c2a-1707-4b41-89e9-563b51024eed\") " Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.357839 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/10d49525-ec1b-4c52-8221-f3f0bb57e574-etc-machine-id\") pod \"10d49525-ec1b-4c52-8221-f3f0bb57e574\" (UID: 
\"10d49525-ec1b-4c52-8221-f3f0bb57e574\") " Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.357874 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10d49525-ec1b-4c52-8221-f3f0bb57e574-config-data\") pod \"10d49525-ec1b-4c52-8221-f3f0bb57e574\" (UID: \"10d49525-ec1b-4c52-8221-f3f0bb57e574\") " Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.357916 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f3337c2a-1707-4b41-89e9-563b51024eed-dns-svc\") pod \"f3337c2a-1707-4b41-89e9-563b51024eed\" (UID: \"f3337c2a-1707-4b41-89e9-563b51024eed\") " Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.357942 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wskq\" (UniqueName: \"kubernetes.io/projected/f3337c2a-1707-4b41-89e9-563b51024eed-kube-api-access-5wskq\") pod \"f3337c2a-1707-4b41-89e9-563b51024eed\" (UID: \"f3337c2a-1707-4b41-89e9-563b51024eed\") " Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.357949 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/10d49525-ec1b-4c52-8221-f3f0bb57e574-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "10d49525-ec1b-4c52-8221-f3f0bb57e574" (UID: "10d49525-ec1b-4c52-8221-f3f0bb57e574"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.357981 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f3337c2a-1707-4b41-89e9-563b51024eed-ovsdbserver-sb\") pod \"f3337c2a-1707-4b41-89e9-563b51024eed\" (UID: \"f3337c2a-1707-4b41-89e9-563b51024eed\") " Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.358024 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10d49525-ec1b-4c52-8221-f3f0bb57e574-scripts\") pod \"10d49525-ec1b-4c52-8221-f3f0bb57e574\" (UID: \"10d49525-ec1b-4c52-8221-f3f0bb57e574\") " Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.358093 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10d49525-ec1b-4c52-8221-f3f0bb57e574-combined-ca-bundle\") pod \"10d49525-ec1b-4c52-8221-f3f0bb57e574\" (UID: \"10d49525-ec1b-4c52-8221-f3f0bb57e574\") " Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.358127 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/10d49525-ec1b-4c52-8221-f3f0bb57e574-db-sync-config-data\") pod \"10d49525-ec1b-4c52-8221-f3f0bb57e574\" (UID: \"10d49525-ec1b-4c52-8221-f3f0bb57e574\") " Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.358183 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f3337c2a-1707-4b41-89e9-563b51024eed-ovsdbserver-nb\") pod \"f3337c2a-1707-4b41-89e9-563b51024eed\" (UID: \"f3337c2a-1707-4b41-89e9-563b51024eed\") " Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.358568 4763 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/10d49525-ec1b-4c52-8221-f3f0bb57e574-etc-machine-id\") on node \"crc\" 
DevicePath \"\"" Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.367941 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10d49525-ec1b-4c52-8221-f3f0bb57e574-kube-api-access-492vf" (OuterVolumeSpecName: "kube-api-access-492vf") pod "10d49525-ec1b-4c52-8221-f3f0bb57e574" (UID: "10d49525-ec1b-4c52-8221-f3f0bb57e574"). InnerVolumeSpecName "kube-api-access-492vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.372067 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10d49525-ec1b-4c52-8221-f3f0bb57e574-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "10d49525-ec1b-4c52-8221-f3f0bb57e574" (UID: "10d49525-ec1b-4c52-8221-f3f0bb57e574"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.372250 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10d49525-ec1b-4c52-8221-f3f0bb57e574-scripts" (OuterVolumeSpecName: "scripts") pod "10d49525-ec1b-4c52-8221-f3f0bb57e574" (UID: "10d49525-ec1b-4c52-8221-f3f0bb57e574"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.376608 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3337c2a-1707-4b41-89e9-563b51024eed-kube-api-access-5wskq" (OuterVolumeSpecName: "kube-api-access-5wskq") pod "f3337c2a-1707-4b41-89e9-563b51024eed" (UID: "f3337c2a-1707-4b41-89e9-563b51024eed"). InnerVolumeSpecName "kube-api-access-5wskq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.396519 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3337c2a-1707-4b41-89e9-563b51024eed-config" (OuterVolumeSpecName: "config") pod "f3337c2a-1707-4b41-89e9-563b51024eed" (UID: "f3337c2a-1707-4b41-89e9-563b51024eed"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.408618 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3337c2a-1707-4b41-89e9-563b51024eed-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f3337c2a-1707-4b41-89e9-563b51024eed" (UID: "f3337c2a-1707-4b41-89e9-563b51024eed"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.413132 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10d49525-ec1b-4c52-8221-f3f0bb57e574-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10d49525-ec1b-4c52-8221-f3f0bb57e574" (UID: "10d49525-ec1b-4c52-8221-f3f0bb57e574"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.413989 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3337c2a-1707-4b41-89e9-563b51024eed-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f3337c2a-1707-4b41-89e9-563b51024eed" (UID: "f3337c2a-1707-4b41-89e9-563b51024eed"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.422493 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3337c2a-1707-4b41-89e9-563b51024eed-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f3337c2a-1707-4b41-89e9-563b51024eed" (UID: "f3337c2a-1707-4b41-89e9-563b51024eed"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.459582 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10d49525-ec1b-4c52-8221-f3f0bb57e574-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.459911 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10d49525-ec1b-4c52-8221-f3f0bb57e574-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.459928 4763 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/10d49525-ec1b-4c52-8221-f3f0bb57e574-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.459941 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f3337c2a-1707-4b41-89e9-563b51024eed-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.459952 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-492vf\" (UniqueName: \"kubernetes.io/projected/10d49525-ec1b-4c52-8221-f3f0bb57e574-kube-api-access-492vf\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.459962 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3337c2a-1707-4b41-89e9-563b51024eed-config\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.459969 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f3337c2a-1707-4b41-89e9-563b51024eed-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.459978 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wskq\" (UniqueName: \"kubernetes.io/projected/f3337c2a-1707-4b41-89e9-563b51024eed-kube-api-access-5wskq\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.459985 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f3337c2a-1707-4b41-89e9-563b51024eed-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.476143 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10d49525-ec1b-4c52-8221-f3f0bb57e574-config-data" (OuterVolumeSpecName: "config-data") pod "10d49525-ec1b-4c52-8221-f3f0bb57e574" (UID: "10d49525-ec1b-4c52-8221-f3f0bb57e574"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.561367 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10d49525-ec1b-4c52-8221-f3f0bb57e574-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.696942 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-94d865894-tqt5m" event={"ID":"76cf4acb-9763-4dac-9a2f-eba4a98314f0","Type":"ContainerStarted","Data":"98b9af316e0c035e2e44df831b61ee43802934b5d38c21ae821f591b0b5fd59c"} Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.699051 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-798d46d59c-2cbs9" event={"ID":"f3337c2a-1707-4b41-89e9-563b51024eed","Type":"ContainerDied","Data":"0c0890a5e49de23ee3ed2bf0df58f726c4722d8fc901c7fb6b57d036c4232830"} Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.699086 4763 scope.go:117] "RemoveContainer" containerID="09c8a182a6da642916653c1fc6883d8217bfa23bb44563453c43788ae8363682" Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.699283 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-798d46d59c-2cbs9" Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.701152 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5446b6d8dc-p784q" event={"ID":"d23af5c6-295f-4c65-90a1-02e66a41f325","Type":"ContainerStarted","Data":"0d9843fcba4e4f38b770cf94692d0c58cbb0dc95c4f2a90a7517c61471d434e3"} Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.704164 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-cprq9" event={"ID":"10d49525-ec1b-4c52-8221-f3f0bb57e574","Type":"ContainerDied","Data":"ace80f83bd0bec90e206ccf07a9f4b0ac704e7dfebde227d2c080f8f560d9288"} Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.704213 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ace80f83bd0bec90e206ccf07a9f4b0ac704e7dfebde227d2c080f8f560d9288" Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.704184 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-cprq9" Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.708754 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-ssc9w" event={"ID":"c0bb3232-022f-4f19-89e7-374ae90d4dd3","Type":"ContainerStarted","Data":"b3c88c8e38fddbe7f43cbad2277a175718fff5866ff93369053aa2a4c73d2430"} Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.709263 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-848cf88cfc-ssc9w" Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.729345 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-848cf88cfc-ssc9w" podStartSLOduration=12.729324858 podStartE2EDuration="12.729324858s" podCreationTimestamp="2025-12-05 12:10:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:10:45.724837576 +0000 UTC m=+1330.217552329" watchObservedRunningTime="2025-12-05 12:10:45.729324858 +0000 UTC m=+1330.222039581" Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.801879 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aff85a7c-5f5f-4c46-97c7-083ff89eb14c" path="/var/lib/kubelet/pods/aff85a7c-5f5f-4c46-97c7-083ff89eb14c/volumes" Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.859679 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Dec 05 12:10:45 crc kubenswrapper[4763]: I1205 12:10:45.992137 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-798d46d59c-2cbs9"] Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.008963 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-dc5f79d94-t8x4q" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.012853 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 12:10:46 crc kubenswrapper[4763]: E1205 12:10:46.013264 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10d49525-ec1b-4c52-8221-f3f0bb57e574" containerName="cinder-db-sync" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.013282 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="10d49525-ec1b-4c52-8221-f3f0bb57e574" containerName="cinder-db-sync" Dec 05 12:10:46 crc kubenswrapper[4763]: E1205 12:10:46.013316 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3337c2a-1707-4b41-89e9-563b51024eed" containerName="init" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.013322 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3337c2a-1707-4b41-89e9-563b51024eed" containerName="init" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.013536 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3337c2a-1707-4b41-89e9-563b51024eed" containerName="init" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.013555 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="10d49525-ec1b-4c52-8221-f3f0bb57e574" containerName="cinder-db-sync" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.014523 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.021446 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.021729 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-64ztw" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.021920 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.022140 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.037687 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-798d46d59c-2cbs9"] Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.077836 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-ssc9w"] Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.085907 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e387c1cd-02b0-40cc-a958-2443971ae373-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e387c1cd-02b0-40cc-a958-2443971ae373\") " pod="openstack/cinder-scheduler-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.085995 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e387c1cd-02b0-40cc-a958-2443971ae373-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e387c1cd-02b0-40cc-a958-2443971ae373\") " pod="openstack/cinder-scheduler-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.086086 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57xbf\" (UniqueName: \"kubernetes.io/projected/e387c1cd-02b0-40cc-a958-2443971ae373-kube-api-access-57xbf\") pod \"cinder-scheduler-0\" (UID: \"e387c1cd-02b0-40cc-a958-2443971ae373\") " pod="openstack/cinder-scheduler-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.086231 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e387c1cd-02b0-40cc-a958-2443971ae373-config-data\") pod \"cinder-scheduler-0\" (UID: \"e387c1cd-02b0-40cc-a958-2443971ae373\") " pod="openstack/cinder-scheduler-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.086337 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e387c1cd-02b0-40cc-a958-2443971ae373-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e387c1cd-02b0-40cc-a958-2443971ae373\") " pod="openstack/cinder-scheduler-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.086492 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e387c1cd-02b0-40cc-a958-2443971ae373-scripts\") pod \"cinder-scheduler-0\" (UID: \"e387c1cd-02b0-40cc-a958-2443971ae373\") " pod="openstack/cinder-scheduler-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.108591 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 12:10:46 
crc kubenswrapper[4763]: I1205 12:10:46.126250 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-zhrbr"] Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.130337 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-zhrbr" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.149859 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-zhrbr"] Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.192958 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a2c6ddd-063a-4531-9458-9de82a61d9ed-config\") pod \"dnsmasq-dns-6578955fd5-zhrbr\" (UID: \"3a2c6ddd-063a-4531-9458-9de82a61d9ed\") " pod="openstack/dnsmasq-dns-6578955fd5-zhrbr" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.193000 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e387c1cd-02b0-40cc-a958-2443971ae373-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e387c1cd-02b0-40cc-a958-2443971ae373\") " pod="openstack/cinder-scheduler-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.193027 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a2c6ddd-063a-4531-9458-9de82a61d9ed-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-zhrbr\" (UID: \"3a2c6ddd-063a-4531-9458-9de82a61d9ed\") " pod="openstack/dnsmasq-dns-6578955fd5-zhrbr" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.193065 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57xbf\" (UniqueName: \"kubernetes.io/projected/e387c1cd-02b0-40cc-a958-2443971ae373-kube-api-access-57xbf\") pod \"cinder-scheduler-0\" (UID: \"e387c1cd-02b0-40cc-a958-2443971ae373\") " pod="openstack/cinder-scheduler-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.193098 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3a2c6ddd-063a-4531-9458-9de82a61d9ed-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-zhrbr\" (UID: \"3a2c6ddd-063a-4531-9458-9de82a61d9ed\") " pod="openstack/dnsmasq-dns-6578955fd5-zhrbr" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.193163 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a2c6ddd-063a-4531-9458-9de82a61d9ed-dns-svc\") pod \"dnsmasq-dns-6578955fd5-zhrbr\" (UID: \"3a2c6ddd-063a-4531-9458-9de82a61d9ed\") " pod="openstack/dnsmasq-dns-6578955fd5-zhrbr" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.193193 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a2c6ddd-063a-4531-9458-9de82a61d9ed-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-zhrbr\" (UID: \"3a2c6ddd-063a-4531-9458-9de82a61d9ed\") " pod="openstack/dnsmasq-dns-6578955fd5-zhrbr" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.193216 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e387c1cd-02b0-40cc-a958-2443971ae373-config-data\") pod \"cinder-scheduler-0\" (UID: 
\"e387c1cd-02b0-40cc-a958-2443971ae373\") " pod="openstack/cinder-scheduler-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.193248 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e387c1cd-02b0-40cc-a958-2443971ae373-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e387c1cd-02b0-40cc-a958-2443971ae373\") " pod="openstack/cinder-scheduler-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.193271 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gcmc\" (UniqueName: \"kubernetes.io/projected/3a2c6ddd-063a-4531-9458-9de82a61d9ed-kube-api-access-9gcmc\") pod \"dnsmasq-dns-6578955fd5-zhrbr\" (UID: \"3a2c6ddd-063a-4531-9458-9de82a61d9ed\") " pod="openstack/dnsmasq-dns-6578955fd5-zhrbr" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.193332 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e387c1cd-02b0-40cc-a958-2443971ae373-scripts\") pod \"cinder-scheduler-0\" (UID: \"e387c1cd-02b0-40cc-a958-2443971ae373\") " pod="openstack/cinder-scheduler-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.193364 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e387c1cd-02b0-40cc-a958-2443971ae373-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e387c1cd-02b0-40cc-a958-2443971ae373\") " pod="openstack/cinder-scheduler-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.195001 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e387c1cd-02b0-40cc-a958-2443971ae373-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e387c1cd-02b0-40cc-a958-2443971ae373\") " pod="openstack/cinder-scheduler-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.197750 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e387c1cd-02b0-40cc-a958-2443971ae373-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e387c1cd-02b0-40cc-a958-2443971ae373\") " pod="openstack/cinder-scheduler-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.202012 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e387c1cd-02b0-40cc-a958-2443971ae373-scripts\") pod \"cinder-scheduler-0\" (UID: \"e387c1cd-02b0-40cc-a958-2443971ae373\") " pod="openstack/cinder-scheduler-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.208172 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e387c1cd-02b0-40cc-a958-2443971ae373-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e387c1cd-02b0-40cc-a958-2443971ae373\") " pod="openstack/cinder-scheduler-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.214308 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e387c1cd-02b0-40cc-a958-2443971ae373-config-data\") pod \"cinder-scheduler-0\" (UID: \"e387c1cd-02b0-40cc-a958-2443971ae373\") " pod="openstack/cinder-scheduler-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.219911 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57xbf\" 
(UniqueName: \"kubernetes.io/projected/e387c1cd-02b0-40cc-a958-2443971ae373-kube-api-access-57xbf\") pod \"cinder-scheduler-0\" (UID: \"e387c1cd-02b0-40cc-a958-2443971ae373\") " pod="openstack/cinder-scheduler-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.295605 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a2c6ddd-063a-4531-9458-9de82a61d9ed-dns-svc\") pod \"dnsmasq-dns-6578955fd5-zhrbr\" (UID: \"3a2c6ddd-063a-4531-9458-9de82a61d9ed\") " pod="openstack/dnsmasq-dns-6578955fd5-zhrbr" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.296856 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a2c6ddd-063a-4531-9458-9de82a61d9ed-dns-svc\") pod \"dnsmasq-dns-6578955fd5-zhrbr\" (UID: \"3a2c6ddd-063a-4531-9458-9de82a61d9ed\") " pod="openstack/dnsmasq-dns-6578955fd5-zhrbr" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.297330 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a2c6ddd-063a-4531-9458-9de82a61d9ed-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-zhrbr\" (UID: \"3a2c6ddd-063a-4531-9458-9de82a61d9ed\") " pod="openstack/dnsmasq-dns-6578955fd5-zhrbr" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.297499 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gcmc\" (UniqueName: \"kubernetes.io/projected/3a2c6ddd-063a-4531-9458-9de82a61d9ed-kube-api-access-9gcmc\") pod \"dnsmasq-dns-6578955fd5-zhrbr\" (UID: \"3a2c6ddd-063a-4531-9458-9de82a61d9ed\") " pod="openstack/dnsmasq-dns-6578955fd5-zhrbr" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.297989 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a2c6ddd-063a-4531-9458-9de82a61d9ed-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-zhrbr\" (UID: \"3a2c6ddd-063a-4531-9458-9de82a61d9ed\") " pod="openstack/dnsmasq-dns-6578955fd5-zhrbr" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.308074 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a2c6ddd-063a-4531-9458-9de82a61d9ed-config\") pod \"dnsmasq-dns-6578955fd5-zhrbr\" (UID: \"3a2c6ddd-063a-4531-9458-9de82a61d9ed\") " pod="openstack/dnsmasq-dns-6578955fd5-zhrbr" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.301756 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a2c6ddd-063a-4531-9458-9de82a61d9ed-config\") pod \"dnsmasq-dns-6578955fd5-zhrbr\" (UID: \"3a2c6ddd-063a-4531-9458-9de82a61d9ed\") " pod="openstack/dnsmasq-dns-6578955fd5-zhrbr" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.308218 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a2c6ddd-063a-4531-9458-9de82a61d9ed-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-zhrbr\" (UID: \"3a2c6ddd-063a-4531-9458-9de82a61d9ed\") " pod="openstack/dnsmasq-dns-6578955fd5-zhrbr" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.308331 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3a2c6ddd-063a-4531-9458-9de82a61d9ed-dns-swift-storage-0\") pod 
\"dnsmasq-dns-6578955fd5-zhrbr\" (UID: \"3a2c6ddd-063a-4531-9458-9de82a61d9ed\") " pod="openstack/dnsmasq-dns-6578955fd5-zhrbr" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.309137 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3a2c6ddd-063a-4531-9458-9de82a61d9ed-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-zhrbr\" (UID: \"3a2c6ddd-063a-4531-9458-9de82a61d9ed\") " pod="openstack/dnsmasq-dns-6578955fd5-zhrbr" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.309397 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a2c6ddd-063a-4531-9458-9de82a61d9ed-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-zhrbr\" (UID: \"3a2c6ddd-063a-4531-9458-9de82a61d9ed\") " pod="openstack/dnsmasq-dns-6578955fd5-zhrbr" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.316469 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gcmc\" (UniqueName: \"kubernetes.io/projected/3a2c6ddd-063a-4531-9458-9de82a61d9ed-kube-api-access-9gcmc\") pod \"dnsmasq-dns-6578955fd5-zhrbr\" (UID: \"3a2c6ddd-063a-4531-9458-9de82a61d9ed\") " pod="openstack/dnsmasq-dns-6578955fd5-zhrbr" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.318801 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.322232 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.331679 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.339387 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.363601 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.411318 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b12efc00-7700-4732-a28c-23ade5c440f0-config-data\") pod \"cinder-api-0\" (UID: \"b12efc00-7700-4732-a28c-23ade5c440f0\") " pod="openstack/cinder-api-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.411380 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b12efc00-7700-4732-a28c-23ade5c440f0-config-data-custom\") pod \"cinder-api-0\" (UID: \"b12efc00-7700-4732-a28c-23ade5c440f0\") " pod="openstack/cinder-api-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.411485 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b12efc00-7700-4732-a28c-23ade5c440f0-logs\") pod \"cinder-api-0\" (UID: \"b12efc00-7700-4732-a28c-23ade5c440f0\") " pod="openstack/cinder-api-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.411522 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b12efc00-7700-4732-a28c-23ade5c440f0-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b12efc00-7700-4732-a28c-23ade5c440f0\") " pod="openstack/cinder-api-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.411747 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b12efc00-7700-4732-a28c-23ade5c440f0-scripts\") pod \"cinder-api-0\" (UID: \"b12efc00-7700-4732-a28c-23ade5c440f0\") " pod="openstack/cinder-api-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.411872 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5q64\" (UniqueName: \"kubernetes.io/projected/b12efc00-7700-4732-a28c-23ade5c440f0-kube-api-access-r5q64\") pod \"cinder-api-0\" (UID: \"b12efc00-7700-4732-a28c-23ade5c440f0\") " pod="openstack/cinder-api-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.411911 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b12efc00-7700-4732-a28c-23ade5c440f0-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b12efc00-7700-4732-a28c-23ade5c440f0\") " pod="openstack/cinder-api-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.483478 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-zhrbr" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.513597 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b12efc00-7700-4732-a28c-23ade5c440f0-scripts\") pod \"cinder-api-0\" (UID: \"b12efc00-7700-4732-a28c-23ade5c440f0\") " pod="openstack/cinder-api-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.513697 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5q64\" (UniqueName: \"kubernetes.io/projected/b12efc00-7700-4732-a28c-23ade5c440f0-kube-api-access-r5q64\") pod \"cinder-api-0\" (UID: \"b12efc00-7700-4732-a28c-23ade5c440f0\") " pod="openstack/cinder-api-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.513742 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b12efc00-7700-4732-a28c-23ade5c440f0-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b12efc00-7700-4732-a28c-23ade5c440f0\") " pod="openstack/cinder-api-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.513841 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b12efc00-7700-4732-a28c-23ade5c440f0-config-data\") pod \"cinder-api-0\" (UID: \"b12efc00-7700-4732-a28c-23ade5c440f0\") " pod="openstack/cinder-api-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.513882 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b12efc00-7700-4732-a28c-23ade5c440f0-config-data-custom\") pod \"cinder-api-0\" (UID: \"b12efc00-7700-4732-a28c-23ade5c440f0\") " pod="openstack/cinder-api-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.513933 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b12efc00-7700-4732-a28c-23ade5c440f0-logs\") pod \"cinder-api-0\" (UID: \"b12efc00-7700-4732-a28c-23ade5c440f0\") " pod="openstack/cinder-api-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.513961 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b12efc00-7700-4732-a28c-23ade5c440f0-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b12efc00-7700-4732-a28c-23ade5c440f0\") " pod="openstack/cinder-api-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.514115 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b12efc00-7700-4732-a28c-23ade5c440f0-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b12efc00-7700-4732-a28c-23ade5c440f0\") " pod="openstack/cinder-api-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.515742 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b12efc00-7700-4732-a28c-23ade5c440f0-logs\") pod \"cinder-api-0\" (UID: \"b12efc00-7700-4732-a28c-23ade5c440f0\") " pod="openstack/cinder-api-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.518311 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b12efc00-7700-4732-a28c-23ade5c440f0-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b12efc00-7700-4732-a28c-23ade5c440f0\") " 
pod="openstack/cinder-api-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.518393 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b12efc00-7700-4732-a28c-23ade5c440f0-scripts\") pod \"cinder-api-0\" (UID: \"b12efc00-7700-4732-a28c-23ade5c440f0\") " pod="openstack/cinder-api-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.525144 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b12efc00-7700-4732-a28c-23ade5c440f0-config-data\") pod \"cinder-api-0\" (UID: \"b12efc00-7700-4732-a28c-23ade5c440f0\") " pod="openstack/cinder-api-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.525280 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b12efc00-7700-4732-a28c-23ade5c440f0-config-data-custom\") pod \"cinder-api-0\" (UID: \"b12efc00-7700-4732-a28c-23ade5c440f0\") " pod="openstack/cinder-api-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.540155 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5q64\" (UniqueName: \"kubernetes.io/projected/b12efc00-7700-4732-a28c-23ade5c440f0-kube-api-access-r5q64\") pod \"cinder-api-0\" (UID: \"b12efc00-7700-4732-a28c-23ade5c440f0\") " pod="openstack/cinder-api-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.627412 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.676240 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.734883 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-94d865894-tqt5m" event={"ID":"76cf4acb-9763-4dac-9a2f-eba4a98314f0","Type":"ContainerStarted","Data":"393a56e90f299fbc3f7324d200e9e3d401719bd5ed61dc4df2db47a38ac91e5d"} Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.763575 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5446b6d8dc-p784q" event={"ID":"d23af5c6-295f-4c65-90a1-02e66a41f325","Type":"ContainerStarted","Data":"9a8dbab818206952335e5bf1c88516148fb52963df3e9da62ed0570626fc3ef3"} Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.769260 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-94d865894-tqt5m" podStartSLOduration=11.907811212 podStartE2EDuration="13.769239663s" podCreationTimestamp="2025-12-05 12:10:33 +0000 UTC" firstStartedPulling="2025-12-05 12:10:43.460532731 +0000 UTC m=+1327.953247454" lastFinishedPulling="2025-12-05 12:10:45.321961182 +0000 UTC m=+1329.814675905" observedRunningTime="2025-12-05 12:10:46.759565961 +0000 UTC m=+1331.252280694" watchObservedRunningTime="2025-12-05 12:10:46.769239663 +0000 UTC m=+1331.261954386" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.783157 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5446b6d8dc-p784q" podStartSLOduration=11.790415278 podStartE2EDuration="13.783138918s" podCreationTimestamp="2025-12-05 12:10:33 +0000 UTC" firstStartedPulling="2025-12-05 12:10:43.36117429 +0000 UTC m=+1327.853889013" lastFinishedPulling="2025-12-05 12:10:45.35389793 +0000 UTC m=+1329.846612653" observedRunningTime="2025-12-05 
12:10:46.781417722 +0000 UTC m=+1331.274132465" watchObservedRunningTime="2025-12-05 12:10:46.783138918 +0000 UTC m=+1331.275853641" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.783456 4763 generic.go:334] "Generic (PLEG): container finished" podID="bb8f26ac-3463-4d42-936d-420cfdbd81eb" containerID="6311e623b93230006c2447f0f3b5835a403e0ccf7dfd01be4e2487d59d728444" exitCode=0 Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.783512 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb8f26ac-3463-4d42-936d-420cfdbd81eb","Type":"ContainerDied","Data":"6311e623b93230006c2447f0f3b5835a403e0ccf7dfd01be4e2487d59d728444"} Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.783539 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb8f26ac-3463-4d42-936d-420cfdbd81eb","Type":"ContainerDied","Data":"4a61f0da24f297ec4f874174ec449d5d0a457807bfcf25f4e852279f2dd871a4"} Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.783555 4763 scope.go:117] "RemoveContainer" containerID="2708c02aa6c94c8ebbc4f34ea05d11156b8b1f234aaabb64b75566bbd315c4f8" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.783694 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.796570 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"37a4b06b-53bd-4f53-89b7-4d5a53554510","Type":"ContainerStarted","Data":"69b49f91a70367285afd2dce14cf495efd3f2fb15420d063f8b332c6c52b4e0a"} Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.796610 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"37a4b06b-53bd-4f53-89b7-4d5a53554510","Type":"ContainerStarted","Data":"8163d0e97e388d5775db2bb318d72c9c19b200c36456e5c5a2dc995c2ba81f9d"} Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.823563 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8j52\" (UniqueName: \"kubernetes.io/projected/bb8f26ac-3463-4d42-936d-420cfdbd81eb-kube-api-access-p8j52\") pod \"bb8f26ac-3463-4d42-936d-420cfdbd81eb\" (UID: \"bb8f26ac-3463-4d42-936d-420cfdbd81eb\") " Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.823721 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb8f26ac-3463-4d42-936d-420cfdbd81eb-scripts\") pod \"bb8f26ac-3463-4d42-936d-420cfdbd81eb\" (UID: \"bb8f26ac-3463-4d42-936d-420cfdbd81eb\") " Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.823744 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb8f26ac-3463-4d42-936d-420cfdbd81eb-combined-ca-bundle\") pod \"bb8f26ac-3463-4d42-936d-420cfdbd81eb\" (UID: \"bb8f26ac-3463-4d42-936d-420cfdbd81eb\") " Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.823803 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb8f26ac-3463-4d42-936d-420cfdbd81eb-sg-core-conf-yaml\") pod \"bb8f26ac-3463-4d42-936d-420cfdbd81eb\" (UID: \"bb8f26ac-3463-4d42-936d-420cfdbd81eb\") " Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.823879 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/bb8f26ac-3463-4d42-936d-420cfdbd81eb-run-httpd\") pod \"bb8f26ac-3463-4d42-936d-420cfdbd81eb\" (UID: \"bb8f26ac-3463-4d42-936d-420cfdbd81eb\") " Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.823915 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb8f26ac-3463-4d42-936d-420cfdbd81eb-log-httpd\") pod \"bb8f26ac-3463-4d42-936d-420cfdbd81eb\" (UID: \"bb8f26ac-3463-4d42-936d-420cfdbd81eb\") " Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.824014 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb8f26ac-3463-4d42-936d-420cfdbd81eb-config-data\") pod \"bb8f26ac-3463-4d42-936d-420cfdbd81eb\" (UID: \"bb8f26ac-3463-4d42-936d-420cfdbd81eb\") " Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.833849 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb8f26ac-3463-4d42-936d-420cfdbd81eb-kube-api-access-p8j52" (OuterVolumeSpecName: "kube-api-access-p8j52") pod "bb8f26ac-3463-4d42-936d-420cfdbd81eb" (UID: "bb8f26ac-3463-4d42-936d-420cfdbd81eb"). InnerVolumeSpecName "kube-api-access-p8j52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.839023 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb8f26ac-3463-4d42-936d-420cfdbd81eb-scripts" (OuterVolumeSpecName: "scripts") pod "bb8f26ac-3463-4d42-936d-420cfdbd81eb" (UID: "bb8f26ac-3463-4d42-936d-420cfdbd81eb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.841495 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb8f26ac-3463-4d42-936d-420cfdbd81eb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bb8f26ac-3463-4d42-936d-420cfdbd81eb" (UID: "bb8f26ac-3463-4d42-936d-420cfdbd81eb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.841645 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb8f26ac-3463-4d42-936d-420cfdbd81eb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bb8f26ac-3463-4d42-936d-420cfdbd81eb" (UID: "bb8f26ac-3463-4d42-936d-420cfdbd81eb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.918949 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb8f26ac-3463-4d42-936d-420cfdbd81eb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bb8f26ac-3463-4d42-936d-420cfdbd81eb" (UID: "bb8f26ac-3463-4d42-936d-420cfdbd81eb"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.925408 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb8f26ac-3463-4d42-936d-420cfdbd81eb-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.925443 4763 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb8f26ac-3463-4d42-936d-420cfdbd81eb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.925452 4763 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb8f26ac-3463-4d42-936d-420cfdbd81eb-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.925461 4763 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb8f26ac-3463-4d42-936d-420cfdbd81eb-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.925470 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8j52\" (UniqueName: \"kubernetes.io/projected/bb8f26ac-3463-4d42-936d-420cfdbd81eb-kube-api-access-p8j52\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:46 crc kubenswrapper[4763]: I1205 12:10:46.964904 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb8f26ac-3463-4d42-936d-420cfdbd81eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb8f26ac-3463-4d42-936d-420cfdbd81eb" (UID: "bb8f26ac-3463-4d42-936d-420cfdbd81eb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.015742 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb8f26ac-3463-4d42-936d-420cfdbd81eb-config-data" (OuterVolumeSpecName: "config-data") pod "bb8f26ac-3463-4d42-936d-420cfdbd81eb" (UID: "bb8f26ac-3463-4d42-936d-420cfdbd81eb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.028653 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb8f26ac-3463-4d42-936d-420cfdbd81eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.028683 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb8f26ac-3463-4d42-936d-420cfdbd81eb-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.028911 4763 scope.go:117] "RemoveContainer" containerID="cb18a1decab54a7f41cb3430a7386394631bc0be22d0f9fada331709b8bf6e27" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.039568 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.092481 4763 scope.go:117] "RemoveContainer" containerID="6311e623b93230006c2447f0f3b5835a403e0ccf7dfd01be4e2487d59d728444" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.198403 4763 scope.go:117] "RemoveContainer" containerID="2708c02aa6c94c8ebbc4f34ea05d11156b8b1f234aaabb64b75566bbd315c4f8" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.200045 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 12:10:47 crc kubenswrapper[4763]: E1205 12:10:47.214866 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2708c02aa6c94c8ebbc4f34ea05d11156b8b1f234aaabb64b75566bbd315c4f8\": container with ID starting with 2708c02aa6c94c8ebbc4f34ea05d11156b8b1f234aaabb64b75566bbd315c4f8 not found: ID does not exist" containerID="2708c02aa6c94c8ebbc4f34ea05d11156b8b1f234aaabb64b75566bbd315c4f8" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.214948 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2708c02aa6c94c8ebbc4f34ea05d11156b8b1f234aaabb64b75566bbd315c4f8"} err="failed to get container status \"2708c02aa6c94c8ebbc4f34ea05d11156b8b1f234aaabb64b75566bbd315c4f8\": rpc error: code = NotFound desc = could not find container \"2708c02aa6c94c8ebbc4f34ea05d11156b8b1f234aaabb64b75566bbd315c4f8\": container with ID starting with 2708c02aa6c94c8ebbc4f34ea05d11156b8b1f234aaabb64b75566bbd315c4f8 not found: ID does not exist" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.215025 4763 scope.go:117] "RemoveContainer" containerID="cb18a1decab54a7f41cb3430a7386394631bc0be22d0f9fada331709b8bf6e27" Dec 05 12:10:47 crc kubenswrapper[4763]: E1205 12:10:47.216120 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb18a1decab54a7f41cb3430a7386394631bc0be22d0f9fada331709b8bf6e27\": container with ID starting with cb18a1decab54a7f41cb3430a7386394631bc0be22d0f9fada331709b8bf6e27 not found: ID does not exist" containerID="cb18a1decab54a7f41cb3430a7386394631bc0be22d0f9fada331709b8bf6e27" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.216146 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb18a1decab54a7f41cb3430a7386394631bc0be22d0f9fada331709b8bf6e27"} err="failed to get container status \"cb18a1decab54a7f41cb3430a7386394631bc0be22d0f9fada331709b8bf6e27\": rpc error: code = NotFound desc = could not find container 
\"cb18a1decab54a7f41cb3430a7386394631bc0be22d0f9fada331709b8bf6e27\": container with ID starting with cb18a1decab54a7f41cb3430a7386394631bc0be22d0f9fada331709b8bf6e27 not found: ID does not exist" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.216173 4763 scope.go:117] "RemoveContainer" containerID="6311e623b93230006c2447f0f3b5835a403e0ccf7dfd01be4e2487d59d728444" Dec 05 12:10:47 crc kubenswrapper[4763]: W1205 12:10:47.216232 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a2c6ddd_063a_4531_9458_9de82a61d9ed.slice/crio-fdba6952113c6c55be16e4919e6e7ca3dcc7b9d95e1e783c3e151e7634c2c508 WatchSource:0}: Error finding container fdba6952113c6c55be16e4919e6e7ca3dcc7b9d95e1e783c3e151e7634c2c508: Status 404 returned error can't find the container with id fdba6952113c6c55be16e4919e6e7ca3dcc7b9d95e1e783c3e151e7634c2c508 Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.223597 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-zhrbr"] Dec 05 12:10:47 crc kubenswrapper[4763]: E1205 12:10:47.224728 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6311e623b93230006c2447f0f3b5835a403e0ccf7dfd01be4e2487d59d728444\": container with ID starting with 6311e623b93230006c2447f0f3b5835a403e0ccf7dfd01be4e2487d59d728444 not found: ID does not exist" containerID="6311e623b93230006c2447f0f3b5835a403e0ccf7dfd01be4e2487d59d728444" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.224783 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6311e623b93230006c2447f0f3b5835a403e0ccf7dfd01be4e2487d59d728444"} err="failed to get container status \"6311e623b93230006c2447f0f3b5835a403e0ccf7dfd01be4e2487d59d728444\": rpc error: code = NotFound desc = could not find container \"6311e623b93230006c2447f0f3b5835a403e0ccf7dfd01be4e2487d59d728444\": container with ID starting with 6311e623b93230006c2447f0f3b5835a403e0ccf7dfd01be4e2487d59d728444 not found: ID does not exist" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.235445 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.265468 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 12:10:47 crc kubenswrapper[4763]: E1205 12:10:47.265904 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb8f26ac-3463-4d42-936d-420cfdbd81eb" containerName="sg-core" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.265925 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb8f26ac-3463-4d42-936d-420cfdbd81eb" containerName="sg-core" Dec 05 12:10:47 crc kubenswrapper[4763]: E1205 12:10:47.265959 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb8f26ac-3463-4d42-936d-420cfdbd81eb" containerName="ceilometer-notification-agent" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.265966 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb8f26ac-3463-4d42-936d-420cfdbd81eb" containerName="ceilometer-notification-agent" Dec 05 12:10:47 crc kubenswrapper[4763]: E1205 12:10:47.265980 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb8f26ac-3463-4d42-936d-420cfdbd81eb" containerName="proxy-httpd" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.266008 4763 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="bb8f26ac-3463-4d42-936d-420cfdbd81eb" containerName="proxy-httpd" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.266219 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb8f26ac-3463-4d42-936d-420cfdbd81eb" containerName="ceilometer-notification-agent" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.266238 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb8f26ac-3463-4d42-936d-420cfdbd81eb" containerName="sg-core" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.266262 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb8f26ac-3463-4d42-936d-420cfdbd81eb" containerName="proxy-httpd" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.285455 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.285664 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.291087 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.291132 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.335590 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-config-data\") pod \"ceilometer-0\" (UID: \"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024\") " pod="openstack/ceilometer-0" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.335705 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j89x\" (UniqueName: \"kubernetes.io/projected/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-kube-api-access-8j89x\") pod \"ceilometer-0\" (UID: \"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024\") " pod="openstack/ceilometer-0" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.335745 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024\") " pod="openstack/ceilometer-0" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.335823 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024\") " pod="openstack/ceilometer-0" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.336122 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-log-httpd\") pod \"ceilometer-0\" (UID: \"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024\") " pod="openstack/ceilometer-0" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.336151 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-run-httpd\") pod \"ceilometer-0\" (UID: \"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024\") " 
pod="openstack/ceilometer-0" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.336377 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-scripts\") pod \"ceilometer-0\" (UID: \"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024\") " pod="openstack/ceilometer-0" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.440828 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-scripts\") pod \"ceilometer-0\" (UID: \"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024\") " pod="openstack/ceilometer-0" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.440889 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-config-data\") pod \"ceilometer-0\" (UID: \"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024\") " pod="openstack/ceilometer-0" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.440929 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8j89x\" (UniqueName: \"kubernetes.io/projected/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-kube-api-access-8j89x\") pod \"ceilometer-0\" (UID: \"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024\") " pod="openstack/ceilometer-0" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.440978 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024\") " pod="openstack/ceilometer-0" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.441038 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024\") " pod="openstack/ceilometer-0" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.441062 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-log-httpd\") pod \"ceilometer-0\" (UID: \"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024\") " pod="openstack/ceilometer-0" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.441088 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-run-httpd\") pod \"ceilometer-0\" (UID: \"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024\") " pod="openstack/ceilometer-0" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.441583 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-run-httpd\") pod \"ceilometer-0\" (UID: \"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024\") " pod="openstack/ceilometer-0" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.449401 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-scripts\") pod \"ceilometer-0\" (UID: \"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024\") " pod="openstack/ceilometer-0" Dec 05 12:10:47 crc 
kubenswrapper[4763]: I1205 12:10:47.453195 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-log-httpd\") pod \"ceilometer-0\" (UID: \"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024\") " pod="openstack/ceilometer-0" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.462529 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.467381 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-config-data\") pod \"ceilometer-0\" (UID: \"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024\") " pod="openstack/ceilometer-0" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.468070 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024\") " pod="openstack/ceilometer-0" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.481566 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024\") " pod="openstack/ceilometer-0" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.486250 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j89x\" (UniqueName: \"kubernetes.io/projected/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-kube-api-access-8j89x\") pod \"ceilometer-0\" (UID: \"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024\") " pod="openstack/ceilometer-0" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.605260 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.862662 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb8f26ac-3463-4d42-936d-420cfdbd81eb" path="/var/lib/kubelet/pods/bb8f26ac-3463-4d42-936d-420cfdbd81eb/volumes" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.864089 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3337c2a-1707-4b41-89e9-563b51024eed" path="/var/lib/kubelet/pods/f3337c2a-1707-4b41-89e9-563b51024eed/volumes" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.893514 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e387c1cd-02b0-40cc-a958-2443971ae373","Type":"ContainerStarted","Data":"a4fd7a6e66018d9d90f720d3264d350af946c22eeb8471f2208d33fcfb269ab2"} Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.899688 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b12efc00-7700-4732-a28c-23ade5c440f0","Type":"ContainerStarted","Data":"1099720c70560833064165d03be2a2554bfcc0eab853f67ac6f0cb306c08b4f8"} Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.906886 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"37a4b06b-53bd-4f53-89b7-4d5a53554510","Type":"ContainerStarted","Data":"33f24acaff3de98ed107cb2926bd34ad5d3e0f7a7970096fe932302f00533700"} Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.907319 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.921390 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-zhrbr" event={"ID":"3a2c6ddd-063a-4531-9458-9de82a61d9ed","Type":"ContainerStarted","Data":"039aa352c03484d5653a10b36bb34e31a038004cd48c8c1420d430a25a4a4d42"} Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.921704 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-zhrbr" event={"ID":"3a2c6ddd-063a-4531-9458-9de82a61d9ed","Type":"ContainerStarted","Data":"fdba6952113c6c55be16e4919e6e7ca3dcc7b9d95e1e783c3e151e7634c2c508"} Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.923061 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-848cf88cfc-ssc9w" podUID="c0bb3232-022f-4f19-89e7-374ae90d4dd3" containerName="dnsmasq-dns" containerID="cri-o://b3c88c8e38fddbe7f43cbad2277a175718fff5866ff93369053aa2a4c73d2430" gracePeriod=10 Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.934481 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=3.934459698 podStartE2EDuration="3.934459698s" podCreationTimestamp="2025-12-05 12:10:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:10:47.930548617 +0000 UTC m=+1332.423263350" watchObservedRunningTime="2025-12-05 12:10:47.934459698 +0000 UTC m=+1332.427174421" Dec 05 12:10:47 crc kubenswrapper[4763]: I1205 12:10:47.977196 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-b55c974d9-brgnw" Dec 05 12:10:48 crc kubenswrapper[4763]: I1205 12:10:48.096072 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-dc5f79d94-t8x4q"] Dec 05 12:10:48 crc kubenswrapper[4763]: I1205 
12:10:48.096284 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-dc5f79d94-t8x4q" podUID="fcd7679a-a9b6-4721-8cb1-07f7c801bb3f" containerName="neutron-api" containerID="cri-o://567a23c307a2d18bf986c0390a9b54938c20c27b0172b71e72ea68aab161a268" gracePeriod=30 Dec 05 12:10:48 crc kubenswrapper[4763]: I1205 12:10:48.096639 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-dc5f79d94-t8x4q" podUID="fcd7679a-a9b6-4721-8cb1-07f7c801bb3f" containerName="neutron-httpd" containerID="cri-o://209c9293f638e841d39c533fc219d880b52fb2265a1403645f5bb95bcaaf6e0f" gracePeriod=30 Dec 05 12:10:48 crc kubenswrapper[4763]: I1205 12:10:48.192575 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 12:10:48 crc kubenswrapper[4763]: I1205 12:10:48.257525 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 05 12:10:48 crc kubenswrapper[4763]: I1205 12:10:48.587754 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-76d66464d-r24j6" Dec 05 12:10:48 crc kubenswrapper[4763]: I1205 12:10:48.836883 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-76d66464d-r24j6" Dec 05 12:10:48 crc kubenswrapper[4763]: I1205 12:10:48.953440 4763 generic.go:334] "Generic (PLEG): container finished" podID="c0bb3232-022f-4f19-89e7-374ae90d4dd3" containerID="b3c88c8e38fddbe7f43cbad2277a175718fff5866ff93369053aa2a4c73d2430" exitCode=0 Dec 05 12:10:48 crc kubenswrapper[4763]: I1205 12:10:48.953496 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-ssc9w" event={"ID":"c0bb3232-022f-4f19-89e7-374ae90d4dd3","Type":"ContainerDied","Data":"b3c88c8e38fddbe7f43cbad2277a175718fff5866ff93369053aa2a4c73d2430"} Dec 05 12:10:48 crc kubenswrapper[4763]: I1205 12:10:48.955296 4763 generic.go:334] "Generic (PLEG): container finished" podID="fcd7679a-a9b6-4721-8cb1-07f7c801bb3f" containerID="209c9293f638e841d39c533fc219d880b52fb2265a1403645f5bb95bcaaf6e0f" exitCode=0 Dec 05 12:10:48 crc kubenswrapper[4763]: I1205 12:10:48.955336 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dc5f79d94-t8x4q" event={"ID":"fcd7679a-a9b6-4721-8cb1-07f7c801bb3f","Type":"ContainerDied","Data":"209c9293f638e841d39c533fc219d880b52fb2265a1403645f5bb95bcaaf6e0f"} Dec 05 12:10:48 crc kubenswrapper[4763]: I1205 12:10:48.956315 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b12efc00-7700-4732-a28c-23ade5c440f0","Type":"ContainerStarted","Data":"112ea07c04d8ad541b899f0e03c1a1353b0eb2b32d4118fc8f4fac5b0ea6d3f5"} Dec 05 12:10:48 crc kubenswrapper[4763]: I1205 12:10:48.957978 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024","Type":"ContainerStarted","Data":"8b60a8f3eef0de9d1e54359599955cbc899aa128ab2f5f5227fb383fe151f46f"} Dec 05 12:10:48 crc kubenswrapper[4763]: I1205 12:10:48.959380 4763 generic.go:334] "Generic (PLEG): container finished" podID="3a2c6ddd-063a-4531-9458-9de82a61d9ed" containerID="039aa352c03484d5653a10b36bb34e31a038004cd48c8c1420d430a25a4a4d42" exitCode=0 Dec 05 12:10:48 crc kubenswrapper[4763]: I1205 12:10:48.960563 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-zhrbr" 
event={"ID":"3a2c6ddd-063a-4531-9458-9de82a61d9ed","Type":"ContainerDied","Data":"039aa352c03484d5653a10b36bb34e31a038004cd48c8c1420d430a25a4a4d42"} Dec 05 12:10:49 crc kubenswrapper[4763]: I1205 12:10:49.163854 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-ssc9w" Dec 05 12:10:49 crc kubenswrapper[4763]: I1205 12:10:49.211731 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnvgv\" (UniqueName: \"kubernetes.io/projected/c0bb3232-022f-4f19-89e7-374ae90d4dd3-kube-api-access-mnvgv\") pod \"c0bb3232-022f-4f19-89e7-374ae90d4dd3\" (UID: \"c0bb3232-022f-4f19-89e7-374ae90d4dd3\") " Dec 05 12:10:49 crc kubenswrapper[4763]: I1205 12:10:49.218484 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0bb3232-022f-4f19-89e7-374ae90d4dd3-ovsdbserver-nb\") pod \"c0bb3232-022f-4f19-89e7-374ae90d4dd3\" (UID: \"c0bb3232-022f-4f19-89e7-374ae90d4dd3\") " Dec 05 12:10:49 crc kubenswrapper[4763]: I1205 12:10:49.218645 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0bb3232-022f-4f19-89e7-374ae90d4dd3-dns-svc\") pod \"c0bb3232-022f-4f19-89e7-374ae90d4dd3\" (UID: \"c0bb3232-022f-4f19-89e7-374ae90d4dd3\") " Dec 05 12:10:49 crc kubenswrapper[4763]: I1205 12:10:49.218729 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c0bb3232-022f-4f19-89e7-374ae90d4dd3-dns-swift-storage-0\") pod \"c0bb3232-022f-4f19-89e7-374ae90d4dd3\" (UID: \"c0bb3232-022f-4f19-89e7-374ae90d4dd3\") " Dec 05 12:10:49 crc kubenswrapper[4763]: I1205 12:10:49.218793 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0bb3232-022f-4f19-89e7-374ae90d4dd3-config\") pod \"c0bb3232-022f-4f19-89e7-374ae90d4dd3\" (UID: \"c0bb3232-022f-4f19-89e7-374ae90d4dd3\") " Dec 05 12:10:49 crc kubenswrapper[4763]: I1205 12:10:49.218876 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0bb3232-022f-4f19-89e7-374ae90d4dd3-ovsdbserver-sb\") pod \"c0bb3232-022f-4f19-89e7-374ae90d4dd3\" (UID: \"c0bb3232-022f-4f19-89e7-374ae90d4dd3\") " Dec 05 12:10:49 crc kubenswrapper[4763]: I1205 12:10:49.231662 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0bb3232-022f-4f19-89e7-374ae90d4dd3-kube-api-access-mnvgv" (OuterVolumeSpecName: "kube-api-access-mnvgv") pod "c0bb3232-022f-4f19-89e7-374ae90d4dd3" (UID: "c0bb3232-022f-4f19-89e7-374ae90d4dd3"). InnerVolumeSpecName "kube-api-access-mnvgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:10:49 crc kubenswrapper[4763]: I1205 12:10:49.311542 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0bb3232-022f-4f19-89e7-374ae90d4dd3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c0bb3232-022f-4f19-89e7-374ae90d4dd3" (UID: "c0bb3232-022f-4f19-89e7-374ae90d4dd3"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:10:49 crc kubenswrapper[4763]: I1205 12:10:49.321161 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0bb3232-022f-4f19-89e7-374ae90d4dd3-config" (OuterVolumeSpecName: "config") pod "c0bb3232-022f-4f19-89e7-374ae90d4dd3" (UID: "c0bb3232-022f-4f19-89e7-374ae90d4dd3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:10:49 crc kubenswrapper[4763]: I1205 12:10:49.321550 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0bb3232-022f-4f19-89e7-374ae90d4dd3-config\") pod \"c0bb3232-022f-4f19-89e7-374ae90d4dd3\" (UID: \"c0bb3232-022f-4f19-89e7-374ae90d4dd3\") " Dec 05 12:10:49 crc kubenswrapper[4763]: W1205 12:10:49.321754 4763 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/c0bb3232-022f-4f19-89e7-374ae90d4dd3/volumes/kubernetes.io~configmap/config Dec 05 12:10:49 crc kubenswrapper[4763]: I1205 12:10:49.321872 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0bb3232-022f-4f19-89e7-374ae90d4dd3-config" (OuterVolumeSpecName: "config") pod "c0bb3232-022f-4f19-89e7-374ae90d4dd3" (UID: "c0bb3232-022f-4f19-89e7-374ae90d4dd3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:10:49 crc kubenswrapper[4763]: I1205 12:10:49.322268 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0bb3232-022f-4f19-89e7-374ae90d4dd3-config\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:49 crc kubenswrapper[4763]: I1205 12:10:49.322292 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnvgv\" (UniqueName: \"kubernetes.io/projected/c0bb3232-022f-4f19-89e7-374ae90d4dd3-kube-api-access-mnvgv\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:49 crc kubenswrapper[4763]: I1205 12:10:49.322302 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0bb3232-022f-4f19-89e7-374ae90d4dd3-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:49 crc kubenswrapper[4763]: I1205 12:10:49.328383 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0bb3232-022f-4f19-89e7-374ae90d4dd3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c0bb3232-022f-4f19-89e7-374ae90d4dd3" (UID: "c0bb3232-022f-4f19-89e7-374ae90d4dd3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:10:49 crc kubenswrapper[4763]: I1205 12:10:49.357097 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0bb3232-022f-4f19-89e7-374ae90d4dd3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c0bb3232-022f-4f19-89e7-374ae90d4dd3" (UID: "c0bb3232-022f-4f19-89e7-374ae90d4dd3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:10:49 crc kubenswrapper[4763]: I1205 12:10:49.364303 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0bb3232-022f-4f19-89e7-374ae90d4dd3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c0bb3232-022f-4f19-89e7-374ae90d4dd3" (UID: "c0bb3232-022f-4f19-89e7-374ae90d4dd3"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:10:49 crc kubenswrapper[4763]: I1205 12:10:49.453891 4763 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c0bb3232-022f-4f19-89e7-374ae90d4dd3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:49 crc kubenswrapper[4763]: I1205 12:10:49.453927 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0bb3232-022f-4f19-89e7-374ae90d4dd3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:49 crc kubenswrapper[4763]: I1205 12:10:49.453937 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0bb3232-022f-4f19-89e7-374ae90d4dd3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:49 crc kubenswrapper[4763]: I1205 12:10:49.979173 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-ssc9w" event={"ID":"c0bb3232-022f-4f19-89e7-374ae90d4dd3","Type":"ContainerDied","Data":"c8dd3b9278d68e7e36e46f19835d9da557919f85e6e0bec979c36bbc5f19758c"} Dec 05 12:10:49 crc kubenswrapper[4763]: I1205 12:10:49.979462 4763 scope.go:117] "RemoveContainer" containerID="b3c88c8e38fddbe7f43cbad2277a175718fff5866ff93369053aa2a4c73d2430" Dec 05 12:10:49 crc kubenswrapper[4763]: I1205 12:10:49.979578 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-ssc9w" Dec 05 12:10:50 crc kubenswrapper[4763]: I1205 12:10:50.002191 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-zhrbr" event={"ID":"3a2c6ddd-063a-4531-9458-9de82a61d9ed","Type":"ContainerStarted","Data":"ee3c7990638dea49d81fd6e670f80b32d98bd160a63cb2edf36972cdb79ec714"} Dec 05 12:10:50 crc kubenswrapper[4763]: I1205 12:10:50.002966 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-zhrbr" Dec 05 12:10:50 crc kubenswrapper[4763]: I1205 12:10:50.033619 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-ssc9w"] Dec 05 12:10:50 crc kubenswrapper[4763]: I1205 12:10:50.040727 4763 scope.go:117] "RemoveContainer" containerID="29965db590b623057df986dbdec1f58d92427771f1b8c9748fe4d09396c955f1" Dec 05 12:10:50 crc kubenswrapper[4763]: I1205 12:10:50.054945 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-ssc9w"] Dec 05 12:10:50 crc kubenswrapper[4763]: I1205 12:10:50.060676 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-zhrbr" podStartSLOduration=4.06065809 podStartE2EDuration="4.06065809s" podCreationTimestamp="2025-12-05 12:10:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:10:50.025341312 +0000 UTC m=+1334.518056035" watchObservedRunningTime="2025-12-05 12:10:50.06065809 +0000 UTC m=+1334.553372813" Dec 05 12:10:50 crc kubenswrapper[4763]: I1205 12:10:50.083801 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Dec 05 12:10:50 crc kubenswrapper[4763]: I1205 12:10:50.083892 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 12:10:51 crc kubenswrapper[4763]: I1205 12:10:51.106403 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-scheduler-0" event={"ID":"e387c1cd-02b0-40cc-a958-2443971ae373","Type":"ContainerStarted","Data":"fc45c0c995101a2b2c673c31de378857293cbacfe513c30d5d4749e3fe22eeb4"} Dec 05 12:10:51 crc kubenswrapper[4763]: I1205 12:10:51.135912 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b12efc00-7700-4732-a28c-23ade5c440f0","Type":"ContainerStarted","Data":"7d7e4d0d4d39087ecc61c8fdf5bcb002a06c700aed6e7e76351bf0fb0bfe1a14"} Dec 05 12:10:51 crc kubenswrapper[4763]: I1205 12:10:51.136079 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="b12efc00-7700-4732-a28c-23ade5c440f0" containerName="cinder-api-log" containerID="cri-o://112ea07c04d8ad541b899f0e03c1a1353b0eb2b32d4118fc8f4fac5b0ea6d3f5" gracePeriod=30 Dec 05 12:10:51 crc kubenswrapper[4763]: I1205 12:10:51.136363 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 05 12:10:51 crc kubenswrapper[4763]: I1205 12:10:51.136616 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="b12efc00-7700-4732-a28c-23ade5c440f0" containerName="cinder-api" containerID="cri-o://7d7e4d0d4d39087ecc61c8fdf5bcb002a06c700aed6e7e76351bf0fb0bfe1a14" gracePeriod=30 Dec 05 12:10:51 crc kubenswrapper[4763]: I1205 12:10:51.174574 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.174556421 podStartE2EDuration="5.174556421s" podCreationTimestamp="2025-12-05 12:10:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:10:51.169959024 +0000 UTC m=+1335.662673757" watchObservedRunningTime="2025-12-05 12:10:51.174556421 +0000 UTC m=+1335.667271144" Dec 05 12:10:51 crc kubenswrapper[4763]: I1205 12:10:51.184862 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024","Type":"ContainerStarted","Data":"2a493d78b9a0e23dc0ad9607673518f834848814c3799e79e9615799e013155f"} Dec 05 12:10:51 crc kubenswrapper[4763]: I1205 12:10:51.425872 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Dec 05 12:10:51 crc kubenswrapper[4763]: I1205 12:10:51.819550 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0bb3232-022f-4f19-89e7-374ae90d4dd3" path="/var/lib/kubelet/pods/c0bb3232-022f-4f19-89e7-374ae90d4dd3/volumes" Dec 05 12:10:51 crc kubenswrapper[4763]: I1205 12:10:51.976517 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dc5f79d94-t8x4q" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.072534 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcd7679a-a9b6-4721-8cb1-07f7c801bb3f-combined-ca-bundle\") pod \"fcd7679a-a9b6-4721-8cb1-07f7c801bb3f\" (UID: \"fcd7679a-a9b6-4721-8cb1-07f7c801bb3f\") " Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.072606 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zklxb\" (UniqueName: \"kubernetes.io/projected/fcd7679a-a9b6-4721-8cb1-07f7c801bb3f-kube-api-access-zklxb\") pod \"fcd7679a-a9b6-4721-8cb1-07f7c801bb3f\" (UID: \"fcd7679a-a9b6-4721-8cb1-07f7c801bb3f\") " Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.072635 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcd7679a-a9b6-4721-8cb1-07f7c801bb3f-ovndb-tls-certs\") pod \"fcd7679a-a9b6-4721-8cb1-07f7c801bb3f\" (UID: \"fcd7679a-a9b6-4721-8cb1-07f7c801bb3f\") " Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.072890 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fcd7679a-a9b6-4721-8cb1-07f7c801bb3f-httpd-config\") pod \"fcd7679a-a9b6-4721-8cb1-07f7c801bb3f\" (UID: \"fcd7679a-a9b6-4721-8cb1-07f7c801bb3f\") " Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.072925 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fcd7679a-a9b6-4721-8cb1-07f7c801bb3f-config\") pod \"fcd7679a-a9b6-4721-8cb1-07f7c801bb3f\" (UID: \"fcd7679a-a9b6-4721-8cb1-07f7c801bb3f\") " Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.097829 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcd7679a-a9b6-4721-8cb1-07f7c801bb3f-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "fcd7679a-a9b6-4721-8cb1-07f7c801bb3f" (UID: "fcd7679a-a9b6-4721-8cb1-07f7c801bb3f"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.101022 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcd7679a-a9b6-4721-8cb1-07f7c801bb3f-kube-api-access-zklxb" (OuterVolumeSpecName: "kube-api-access-zklxb") pod "fcd7679a-a9b6-4721-8cb1-07f7c801bb3f" (UID: "fcd7679a-a9b6-4721-8cb1-07f7c801bb3f"). InnerVolumeSpecName "kube-api-access-zklxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.154969 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcd7679a-a9b6-4721-8cb1-07f7c801bb3f-config" (OuterVolumeSpecName: "config") pod "fcd7679a-a9b6-4721-8cb1-07f7c801bb3f" (UID: "fcd7679a-a9b6-4721-8cb1-07f7c801bb3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.171197 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcd7679a-a9b6-4721-8cb1-07f7c801bb3f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fcd7679a-a9b6-4721-8cb1-07f7c801bb3f" (UID: "fcd7679a-a9b6-4721-8cb1-07f7c801bb3f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.174888 4763 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fcd7679a-a9b6-4721-8cb1-07f7c801bb3f-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.174951 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/fcd7679a-a9b6-4721-8cb1-07f7c801bb3f-config\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.174968 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcd7679a-a9b6-4721-8cb1-07f7c801bb3f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.175041 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zklxb\" (UniqueName: \"kubernetes.io/projected/fcd7679a-a9b6-4721-8cb1-07f7c801bb3f-kube-api-access-zklxb\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.193440 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.203593 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e387c1cd-02b0-40cc-a958-2443971ae373","Type":"ContainerStarted","Data":"3481ca6aba7c67025553d2c18bf1c91ec9bda5e34b1d058c0826eee35a353bd4"} Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.207855 4763 generic.go:334] "Generic (PLEG): container finished" podID="fcd7679a-a9b6-4721-8cb1-07f7c801bb3f" containerID="567a23c307a2d18bf986c0390a9b54938c20c27b0172b71e72ea68aab161a268" exitCode=0 Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.207961 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dc5f79d94-t8x4q" event={"ID":"fcd7679a-a9b6-4721-8cb1-07f7c801bb3f","Type":"ContainerDied","Data":"567a23c307a2d18bf986c0390a9b54938c20c27b0172b71e72ea68aab161a268"} Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.207996 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dc5f79d94-t8x4q" event={"ID":"fcd7679a-a9b6-4721-8cb1-07f7c801bb3f","Type":"ContainerDied","Data":"7fcd4abc441e2d440af48727aa27282f7a556dcfa12259cafce378f50de07249"} Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.208020 4763 scope.go:117] "RemoveContainer" containerID="209c9293f638e841d39c533fc219d880b52fb2265a1403645f5bb95bcaaf6e0f" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.215117 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dc5f79d94-t8x4q" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.223247 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcd7679a-a9b6-4721-8cb1-07f7c801bb3f-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "fcd7679a-a9b6-4721-8cb1-07f7c801bb3f" (UID: "fcd7679a-a9b6-4721-8cb1-07f7c801bb3f"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.241075 4763 generic.go:334] "Generic (PLEG): container finished" podID="b12efc00-7700-4732-a28c-23ade5c440f0" containerID="7d7e4d0d4d39087ecc61c8fdf5bcb002a06c700aed6e7e76351bf0fb0bfe1a14" exitCode=0 Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.241108 4763 generic.go:334] "Generic (PLEG): container finished" podID="b12efc00-7700-4732-a28c-23ade5c440f0" containerID="112ea07c04d8ad541b899f0e03c1a1353b0eb2b32d4118fc8f4fac5b0ea6d3f5" exitCode=143 Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.241147 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b12efc00-7700-4732-a28c-23ade5c440f0","Type":"ContainerDied","Data":"7d7e4d0d4d39087ecc61c8fdf5bcb002a06c700aed6e7e76351bf0fb0bfe1a14"} Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.241172 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b12efc00-7700-4732-a28c-23ade5c440f0","Type":"ContainerDied","Data":"112ea07c04d8ad541b899f0e03c1a1353b0eb2b32d4118fc8f4fac5b0ea6d3f5"} Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.241182 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b12efc00-7700-4732-a28c-23ade5c440f0","Type":"ContainerDied","Data":"1099720c70560833064165d03be2a2554bfcc0eab853f67ac6f0cb306c08b4f8"} Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.241239 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.248588 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024","Type":"ContainerStarted","Data":"3544d9156ebffb971fcc4e7d83f5f8fa0df0a4c1c8f6f691ce005c667e3e2467"} Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.259875 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.694550909 podStartE2EDuration="7.259854251s" podCreationTimestamp="2025-12-05 12:10:45 +0000 UTC" firstStartedPulling="2025-12-05 12:10:47.092501656 +0000 UTC m=+1331.585216379" lastFinishedPulling="2025-12-05 12:10:48.657804998 +0000 UTC m=+1333.150519721" observedRunningTime="2025-12-05 12:10:52.233881652 +0000 UTC m=+1336.726596366" watchObservedRunningTime="2025-12-05 12:10:52.259854251 +0000 UTC m=+1336.752568974" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.268860 4763 scope.go:117] "RemoveContainer" containerID="567a23c307a2d18bf986c0390a9b54938c20c27b0172b71e72ea68aab161a268" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.278229 4763 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcd7679a-a9b6-4721-8cb1-07f7c801bb3f-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.298996 4763 scope.go:117] "RemoveContainer" containerID="209c9293f638e841d39c533fc219d880b52fb2265a1403645f5bb95bcaaf6e0f" Dec 05 12:10:52 crc kubenswrapper[4763]: E1205 12:10:52.304909 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"209c9293f638e841d39c533fc219d880b52fb2265a1403645f5bb95bcaaf6e0f\": container with ID starting with 209c9293f638e841d39c533fc219d880b52fb2265a1403645f5bb95bcaaf6e0f not found: ID does not exist" 
containerID="209c9293f638e841d39c533fc219d880b52fb2265a1403645f5bb95bcaaf6e0f" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.304955 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"209c9293f638e841d39c533fc219d880b52fb2265a1403645f5bb95bcaaf6e0f"} err="failed to get container status \"209c9293f638e841d39c533fc219d880b52fb2265a1403645f5bb95bcaaf6e0f\": rpc error: code = NotFound desc = could not find container \"209c9293f638e841d39c533fc219d880b52fb2265a1403645f5bb95bcaaf6e0f\": container with ID starting with 209c9293f638e841d39c533fc219d880b52fb2265a1403645f5bb95bcaaf6e0f not found: ID does not exist" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.304991 4763 scope.go:117] "RemoveContainer" containerID="567a23c307a2d18bf986c0390a9b54938c20c27b0172b71e72ea68aab161a268" Dec 05 12:10:52 crc kubenswrapper[4763]: E1205 12:10:52.308944 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"567a23c307a2d18bf986c0390a9b54938c20c27b0172b71e72ea68aab161a268\": container with ID starting with 567a23c307a2d18bf986c0390a9b54938c20c27b0172b71e72ea68aab161a268 not found: ID does not exist" containerID="567a23c307a2d18bf986c0390a9b54938c20c27b0172b71e72ea68aab161a268" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.308991 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"567a23c307a2d18bf986c0390a9b54938c20c27b0172b71e72ea68aab161a268"} err="failed to get container status \"567a23c307a2d18bf986c0390a9b54938c20c27b0172b71e72ea68aab161a268\": rpc error: code = NotFound desc = could not find container \"567a23c307a2d18bf986c0390a9b54938c20c27b0172b71e72ea68aab161a268\": container with ID starting with 567a23c307a2d18bf986c0390a9b54938c20c27b0172b71e72ea68aab161a268 not found: ID does not exist" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.309062 4763 scope.go:117] "RemoveContainer" containerID="7d7e4d0d4d39087ecc61c8fdf5bcb002a06c700aed6e7e76351bf0fb0bfe1a14" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.351074 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-758b9f6874-2lcsg" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.367019 4763 scope.go:117] "RemoveContainer" containerID="112ea07c04d8ad541b899f0e03c1a1353b0eb2b32d4118fc8f4fac5b0ea6d3f5" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.379609 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b12efc00-7700-4732-a28c-23ade5c440f0-etc-machine-id\") pod \"b12efc00-7700-4732-a28c-23ade5c440f0\" (UID: \"b12efc00-7700-4732-a28c-23ade5c440f0\") " Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.379669 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b12efc00-7700-4732-a28c-23ade5c440f0-logs\") pod \"b12efc00-7700-4732-a28c-23ade5c440f0\" (UID: \"b12efc00-7700-4732-a28c-23ade5c440f0\") " Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.379689 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b12efc00-7700-4732-a28c-23ade5c440f0-config-data-custom\") pod \"b12efc00-7700-4732-a28c-23ade5c440f0\" (UID: \"b12efc00-7700-4732-a28c-23ade5c440f0\") " Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 
12:10:52.379731 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b12efc00-7700-4732-a28c-23ade5c440f0-config-data\") pod \"b12efc00-7700-4732-a28c-23ade5c440f0\" (UID: \"b12efc00-7700-4732-a28c-23ade5c440f0\") " Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.379843 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b12efc00-7700-4732-a28c-23ade5c440f0-scripts\") pod \"b12efc00-7700-4732-a28c-23ade5c440f0\" (UID: \"b12efc00-7700-4732-a28c-23ade5c440f0\") " Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.379884 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b12efc00-7700-4732-a28c-23ade5c440f0-combined-ca-bundle\") pod \"b12efc00-7700-4732-a28c-23ade5c440f0\" (UID: \"b12efc00-7700-4732-a28c-23ade5c440f0\") " Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.379975 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5q64\" (UniqueName: \"kubernetes.io/projected/b12efc00-7700-4732-a28c-23ade5c440f0-kube-api-access-r5q64\") pod \"b12efc00-7700-4732-a28c-23ade5c440f0\" (UID: \"b12efc00-7700-4732-a28c-23ade5c440f0\") " Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.382221 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b12efc00-7700-4732-a28c-23ade5c440f0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b12efc00-7700-4732-a28c-23ade5c440f0" (UID: "b12efc00-7700-4732-a28c-23ade5c440f0"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.385857 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b12efc00-7700-4732-a28c-23ade5c440f0-logs" (OuterVolumeSpecName: "logs") pod "b12efc00-7700-4732-a28c-23ade5c440f0" (UID: "b12efc00-7700-4732-a28c-23ade5c440f0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.388952 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b12efc00-7700-4732-a28c-23ade5c440f0-scripts" (OuterVolumeSpecName: "scripts") pod "b12efc00-7700-4732-a28c-23ade5c440f0" (UID: "b12efc00-7700-4732-a28c-23ade5c440f0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.391933 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b12efc00-7700-4732-a28c-23ade5c440f0-kube-api-access-r5q64" (OuterVolumeSpecName: "kube-api-access-r5q64") pod "b12efc00-7700-4732-a28c-23ade5c440f0" (UID: "b12efc00-7700-4732-a28c-23ade5c440f0"). InnerVolumeSpecName "kube-api-access-r5q64". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.396564 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b12efc00-7700-4732-a28c-23ade5c440f0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b12efc00-7700-4732-a28c-23ade5c440f0" (UID: "b12efc00-7700-4732-a28c-23ade5c440f0"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.447931 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b12efc00-7700-4732-a28c-23ade5c440f0-config-data" (OuterVolumeSpecName: "config-data") pod "b12efc00-7700-4732-a28c-23ade5c440f0" (UID: "b12efc00-7700-4732-a28c-23ade5c440f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.455627 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b12efc00-7700-4732-a28c-23ade5c440f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b12efc00-7700-4732-a28c-23ade5c440f0" (UID: "b12efc00-7700-4732-a28c-23ade5c440f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.471398 4763 scope.go:117] "RemoveContainer" containerID="7d7e4d0d4d39087ecc61c8fdf5bcb002a06c700aed6e7e76351bf0fb0bfe1a14" Dec 05 12:10:52 crc kubenswrapper[4763]: E1205 12:10:52.471787 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d7e4d0d4d39087ecc61c8fdf5bcb002a06c700aed6e7e76351bf0fb0bfe1a14\": container with ID starting with 7d7e4d0d4d39087ecc61c8fdf5bcb002a06c700aed6e7e76351bf0fb0bfe1a14 not found: ID does not exist" containerID="7d7e4d0d4d39087ecc61c8fdf5bcb002a06c700aed6e7e76351bf0fb0bfe1a14" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.471816 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d7e4d0d4d39087ecc61c8fdf5bcb002a06c700aed6e7e76351bf0fb0bfe1a14"} err="failed to get container status \"7d7e4d0d4d39087ecc61c8fdf5bcb002a06c700aed6e7e76351bf0fb0bfe1a14\": rpc error: code = NotFound desc = could not find container \"7d7e4d0d4d39087ecc61c8fdf5bcb002a06c700aed6e7e76351bf0fb0bfe1a14\": container with ID starting with 7d7e4d0d4d39087ecc61c8fdf5bcb002a06c700aed6e7e76351bf0fb0bfe1a14 not found: ID does not exist" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.471837 4763 scope.go:117] "RemoveContainer" containerID="112ea07c04d8ad541b899f0e03c1a1353b0eb2b32d4118fc8f4fac5b0ea6d3f5" Dec 05 12:10:52 crc kubenswrapper[4763]: E1205 12:10:52.472004 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"112ea07c04d8ad541b899f0e03c1a1353b0eb2b32d4118fc8f4fac5b0ea6d3f5\": container with ID starting with 112ea07c04d8ad541b899f0e03c1a1353b0eb2b32d4118fc8f4fac5b0ea6d3f5 not found: ID does not exist" containerID="112ea07c04d8ad541b899f0e03c1a1353b0eb2b32d4118fc8f4fac5b0ea6d3f5" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.472024 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"112ea07c04d8ad541b899f0e03c1a1353b0eb2b32d4118fc8f4fac5b0ea6d3f5"} err="failed to get container status \"112ea07c04d8ad541b899f0e03c1a1353b0eb2b32d4118fc8f4fac5b0ea6d3f5\": rpc error: code = NotFound desc = could not find container \"112ea07c04d8ad541b899f0e03c1a1353b0eb2b32d4118fc8f4fac5b0ea6d3f5\": container with ID starting with 112ea07c04d8ad541b899f0e03c1a1353b0eb2b32d4118fc8f4fac5b0ea6d3f5 not found: ID does not exist" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.472037 4763 scope.go:117] "RemoveContainer" 
containerID="7d7e4d0d4d39087ecc61c8fdf5bcb002a06c700aed6e7e76351bf0fb0bfe1a14" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.472199 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d7e4d0d4d39087ecc61c8fdf5bcb002a06c700aed6e7e76351bf0fb0bfe1a14"} err="failed to get container status \"7d7e4d0d4d39087ecc61c8fdf5bcb002a06c700aed6e7e76351bf0fb0bfe1a14\": rpc error: code = NotFound desc = could not find container \"7d7e4d0d4d39087ecc61c8fdf5bcb002a06c700aed6e7e76351bf0fb0bfe1a14\": container with ID starting with 7d7e4d0d4d39087ecc61c8fdf5bcb002a06c700aed6e7e76351bf0fb0bfe1a14 not found: ID does not exist" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.472223 4763 scope.go:117] "RemoveContainer" containerID="112ea07c04d8ad541b899f0e03c1a1353b0eb2b32d4118fc8f4fac5b0ea6d3f5" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.472379 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"112ea07c04d8ad541b899f0e03c1a1353b0eb2b32d4118fc8f4fac5b0ea6d3f5"} err="failed to get container status \"112ea07c04d8ad541b899f0e03c1a1353b0eb2b32d4118fc8f4fac5b0ea6d3f5\": rpc error: code = NotFound desc = could not find container \"112ea07c04d8ad541b899f0e03c1a1353b0eb2b32d4118fc8f4fac5b0ea6d3f5\": container with ID starting with 112ea07c04d8ad541b899f0e03c1a1353b0eb2b32d4118fc8f4fac5b0ea6d3f5 not found: ID does not exist" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.482111 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b12efc00-7700-4732-a28c-23ade5c440f0-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.482144 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b12efc00-7700-4732-a28c-23ade5c440f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.482154 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5q64\" (UniqueName: \"kubernetes.io/projected/b12efc00-7700-4732-a28c-23ade5c440f0-kube-api-access-r5q64\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.482163 4763 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b12efc00-7700-4732-a28c-23ade5c440f0-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.482171 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b12efc00-7700-4732-a28c-23ade5c440f0-logs\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.482180 4763 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b12efc00-7700-4732-a28c-23ade5c440f0-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.482189 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b12efc00-7700-4732-a28c-23ade5c440f0-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.562323 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-dc5f79d94-t8x4q"] Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.588616 4763 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/neutron-dc5f79d94-t8x4q"] Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.595895 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.606888 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.620493 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 05 12:10:52 crc kubenswrapper[4763]: E1205 12:10:52.620956 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcd7679a-a9b6-4721-8cb1-07f7c801bb3f" containerName="neutron-api" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.620977 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcd7679a-a9b6-4721-8cb1-07f7c801bb3f" containerName="neutron-api" Dec 05 12:10:52 crc kubenswrapper[4763]: E1205 12:10:52.620988 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0bb3232-022f-4f19-89e7-374ae90d4dd3" containerName="init" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.620995 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0bb3232-022f-4f19-89e7-374ae90d4dd3" containerName="init" Dec 05 12:10:52 crc kubenswrapper[4763]: E1205 12:10:52.621003 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0bb3232-022f-4f19-89e7-374ae90d4dd3" containerName="dnsmasq-dns" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.621010 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0bb3232-022f-4f19-89e7-374ae90d4dd3" containerName="dnsmasq-dns" Dec 05 12:10:52 crc kubenswrapper[4763]: E1205 12:10:52.621034 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcd7679a-a9b6-4721-8cb1-07f7c801bb3f" containerName="neutron-httpd" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.621042 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcd7679a-a9b6-4721-8cb1-07f7c801bb3f" containerName="neutron-httpd" Dec 05 12:10:52 crc kubenswrapper[4763]: E1205 12:10:52.621052 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b12efc00-7700-4732-a28c-23ade5c440f0" containerName="cinder-api-log" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.621059 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b12efc00-7700-4732-a28c-23ade5c440f0" containerName="cinder-api-log" Dec 05 12:10:52 crc kubenswrapper[4763]: E1205 12:10:52.621079 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b12efc00-7700-4732-a28c-23ade5c440f0" containerName="cinder-api" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.621086 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b12efc00-7700-4732-a28c-23ade5c440f0" containerName="cinder-api" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.621290 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0bb3232-022f-4f19-89e7-374ae90d4dd3" containerName="dnsmasq-dns" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.621306 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcd7679a-a9b6-4721-8cb1-07f7c801bb3f" containerName="neutron-api" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.621317 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b12efc00-7700-4732-a28c-23ade5c440f0" containerName="cinder-api-log" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.621338 4763 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="fcd7679a-a9b6-4721-8cb1-07f7c801bb3f" containerName="neutron-httpd" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.621348 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b12efc00-7700-4732-a28c-23ade5c440f0" containerName="cinder-api" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.622528 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.622642 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.630807 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.631225 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.631795 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.677561 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-758b9f6874-2lcsg" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.791108 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4027bf13-4c83-4281-8a0d-d18c6032e0af-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4027bf13-4c83-4281-8a0d-d18c6032e0af\") " pod="openstack/cinder-api-0" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.791389 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzfrm\" (UniqueName: \"kubernetes.io/projected/4027bf13-4c83-4281-8a0d-d18c6032e0af-kube-api-access-hzfrm\") pod \"cinder-api-0\" (UID: \"4027bf13-4c83-4281-8a0d-d18c6032e0af\") " pod="openstack/cinder-api-0" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.791499 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4027bf13-4c83-4281-8a0d-d18c6032e0af-scripts\") pod \"cinder-api-0\" (UID: \"4027bf13-4c83-4281-8a0d-d18c6032e0af\") " pod="openstack/cinder-api-0" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.791597 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4027bf13-4c83-4281-8a0d-d18c6032e0af-config-data\") pod \"cinder-api-0\" (UID: \"4027bf13-4c83-4281-8a0d-d18c6032e0af\") " pod="openstack/cinder-api-0" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.791694 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4027bf13-4c83-4281-8a0d-d18c6032e0af-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"4027bf13-4c83-4281-8a0d-d18c6032e0af\") " pod="openstack/cinder-api-0" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.791793 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4027bf13-4c83-4281-8a0d-d18c6032e0af-public-tls-certs\") pod \"cinder-api-0\" (UID: \"4027bf13-4c83-4281-8a0d-d18c6032e0af\") " pod="openstack/cinder-api-0" Dec 05 12:10:52 crc 
kubenswrapper[4763]: I1205 12:10:52.791861 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4027bf13-4c83-4281-8a0d-d18c6032e0af-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4027bf13-4c83-4281-8a0d-d18c6032e0af\") " pod="openstack/cinder-api-0" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.791940 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4027bf13-4c83-4281-8a0d-d18c6032e0af-logs\") pod \"cinder-api-0\" (UID: \"4027bf13-4c83-4281-8a0d-d18c6032e0af\") " pod="openstack/cinder-api-0" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.792045 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4027bf13-4c83-4281-8a0d-d18c6032e0af-config-data-custom\") pod \"cinder-api-0\" (UID: \"4027bf13-4c83-4281-8a0d-d18c6032e0af\") " pod="openstack/cinder-api-0" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.893993 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4027bf13-4c83-4281-8a0d-d18c6032e0af-config-data-custom\") pod \"cinder-api-0\" (UID: \"4027bf13-4c83-4281-8a0d-d18c6032e0af\") " pod="openstack/cinder-api-0" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.894102 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4027bf13-4c83-4281-8a0d-d18c6032e0af-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4027bf13-4c83-4281-8a0d-d18c6032e0af\") " pod="openstack/cinder-api-0" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.894130 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzfrm\" (UniqueName: \"kubernetes.io/projected/4027bf13-4c83-4281-8a0d-d18c6032e0af-kube-api-access-hzfrm\") pod \"cinder-api-0\" (UID: \"4027bf13-4c83-4281-8a0d-d18c6032e0af\") " pod="openstack/cinder-api-0" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.894166 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4027bf13-4c83-4281-8a0d-d18c6032e0af-scripts\") pod \"cinder-api-0\" (UID: \"4027bf13-4c83-4281-8a0d-d18c6032e0af\") " pod="openstack/cinder-api-0" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.894196 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4027bf13-4c83-4281-8a0d-d18c6032e0af-config-data\") pod \"cinder-api-0\" (UID: \"4027bf13-4c83-4281-8a0d-d18c6032e0af\") " pod="openstack/cinder-api-0" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.894232 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4027bf13-4c83-4281-8a0d-d18c6032e0af-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"4027bf13-4c83-4281-8a0d-d18c6032e0af\") " pod="openstack/cinder-api-0" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.894268 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4027bf13-4c83-4281-8a0d-d18c6032e0af-combined-ca-bundle\") pod \"cinder-api-0\" (UID: 
\"4027bf13-4c83-4281-8a0d-d18c6032e0af\") " pod="openstack/cinder-api-0" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.894283 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4027bf13-4c83-4281-8a0d-d18c6032e0af-public-tls-certs\") pod \"cinder-api-0\" (UID: \"4027bf13-4c83-4281-8a0d-d18c6032e0af\") " pod="openstack/cinder-api-0" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.894303 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4027bf13-4c83-4281-8a0d-d18c6032e0af-logs\") pod \"cinder-api-0\" (UID: \"4027bf13-4c83-4281-8a0d-d18c6032e0af\") " pod="openstack/cinder-api-0" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.894824 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4027bf13-4c83-4281-8a0d-d18c6032e0af-logs\") pod \"cinder-api-0\" (UID: \"4027bf13-4c83-4281-8a0d-d18c6032e0af\") " pod="openstack/cinder-api-0" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.894832 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4027bf13-4c83-4281-8a0d-d18c6032e0af-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4027bf13-4c83-4281-8a0d-d18c6032e0af\") " pod="openstack/cinder-api-0" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.902380 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4027bf13-4c83-4281-8a0d-d18c6032e0af-scripts\") pod \"cinder-api-0\" (UID: \"4027bf13-4c83-4281-8a0d-d18c6032e0af\") " pod="openstack/cinder-api-0" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.903351 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4027bf13-4c83-4281-8a0d-d18c6032e0af-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4027bf13-4c83-4281-8a0d-d18c6032e0af\") " pod="openstack/cinder-api-0" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.903915 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4027bf13-4c83-4281-8a0d-d18c6032e0af-config-data-custom\") pod \"cinder-api-0\" (UID: \"4027bf13-4c83-4281-8a0d-d18c6032e0af\") " pod="openstack/cinder-api-0" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.905169 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4027bf13-4c83-4281-8a0d-d18c6032e0af-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"4027bf13-4c83-4281-8a0d-d18c6032e0af\") " pod="openstack/cinder-api-0" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.906528 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4027bf13-4c83-4281-8a0d-d18c6032e0af-config-data\") pod \"cinder-api-0\" (UID: \"4027bf13-4c83-4281-8a0d-d18c6032e0af\") " pod="openstack/cinder-api-0" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.908190 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4027bf13-4c83-4281-8a0d-d18c6032e0af-public-tls-certs\") pod \"cinder-api-0\" (UID: \"4027bf13-4c83-4281-8a0d-d18c6032e0af\") " pod="openstack/cinder-api-0" Dec 05 12:10:52 crc 
kubenswrapper[4763]: I1205 12:10:52.916345 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzfrm\" (UniqueName: \"kubernetes.io/projected/4027bf13-4c83-4281-8a0d-d18c6032e0af-kube-api-access-hzfrm\") pod \"cinder-api-0\" (UID: \"4027bf13-4c83-4281-8a0d-d18c6032e0af\") " pod="openstack/cinder-api-0" Dec 05 12:10:52 crc kubenswrapper[4763]: I1205 12:10:52.964020 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 12:10:53 crc kubenswrapper[4763]: I1205 12:10:53.330026 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024","Type":"ContainerStarted","Data":"7a5aa79d6ff40c4c2bd4586bfcb50c18f7922c73abe58623b83e4277778fb853"} Dec 05 12:10:53 crc kubenswrapper[4763]: I1205 12:10:53.571872 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 12:10:53 crc kubenswrapper[4763]: W1205 12:10:53.575142 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4027bf13_4c83_4281_8a0d_d18c6032e0af.slice/crio-865ebbf81be2d4719305febfbe61ff716105368ba293a89ac7317ea5695e5928 WatchSource:0}: Error finding container 865ebbf81be2d4719305febfbe61ff716105368ba293a89ac7317ea5695e5928: Status 404 returned error can't find the container with id 865ebbf81be2d4719305febfbe61ff716105368ba293a89ac7317ea5695e5928 Dec 05 12:10:53 crc kubenswrapper[4763]: I1205 12:10:53.797339 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b12efc00-7700-4732-a28c-23ade5c440f0" path="/var/lib/kubelet/pods/b12efc00-7700-4732-a28c-23ade5c440f0/volumes" Dec 05 12:10:53 crc kubenswrapper[4763]: I1205 12:10:53.798342 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcd7679a-a9b6-4721-8cb1-07f7c801bb3f" path="/var/lib/kubelet/pods/fcd7679a-a9b6-4721-8cb1-07f7c801bb3f/volumes" Dec 05 12:10:54 crc kubenswrapper[4763]: I1205 12:10:54.352991 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4027bf13-4c83-4281-8a0d-d18c6032e0af","Type":"ContainerStarted","Data":"865ebbf81be2d4719305febfbe61ff716105368ba293a89ac7317ea5695e5928"} Dec 05 12:10:54 crc kubenswrapper[4763]: I1205 12:10:54.471084 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7f846788f8-4gznp" Dec 05 12:10:54 crc kubenswrapper[4763]: I1205 12:10:54.549191 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7f846788f8-4gznp" Dec 05 12:10:54 crc kubenswrapper[4763]: I1205 12:10:54.627637 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-758b9f6874-2lcsg"] Dec 05 12:10:54 crc kubenswrapper[4763]: I1205 12:10:54.627953 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-758b9f6874-2lcsg" podUID="91ed7d1b-612b-46e3-b99f-cb66cfc9e003" containerName="barbican-api-log" containerID="cri-o://6e8f58885bc169f9772af5c4bc947eef4345bfb60959ae68f4f25616b6735a1d" gracePeriod=30 Dec 05 12:10:54 crc kubenswrapper[4763]: I1205 12:10:54.628128 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-758b9f6874-2lcsg" podUID="91ed7d1b-612b-46e3-b99f-cb66cfc9e003" containerName="barbican-api" containerID="cri-o://230cb231d868037183ea0f6909557c737de8a2c4c2c301798612f96ca1075a0e" gracePeriod=30 
Dec 05 12:10:54 crc kubenswrapper[4763]: I1205 12:10:54.638583 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-758b9f6874-2lcsg" podUID="91ed7d1b-612b-46e3-b99f-cb66cfc9e003" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.172:9311/healthcheck\": EOF"
Dec 05 12:10:54 crc kubenswrapper[4763]: I1205 12:10:54.638807 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-758b9f6874-2lcsg" podUID="91ed7d1b-612b-46e3-b99f-cb66cfc9e003" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.172:9311/healthcheck\": EOF"
Dec 05 12:10:54 crc kubenswrapper[4763]: I1205 12:10:54.638909 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-758b9f6874-2lcsg" podUID="91ed7d1b-612b-46e3-b99f-cb66cfc9e003" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.172:9311/healthcheck\": EOF"
Dec 05 12:10:55 crc kubenswrapper[4763]: I1205 12:10:55.082751 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0"
Dec 05 12:10:55 crc kubenswrapper[4763]: I1205 12:10:55.092050 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0"
Dec 05 12:10:55 crc kubenswrapper[4763]: I1205 12:10:55.394125 4763 generic.go:334] "Generic (PLEG): container finished" podID="91ed7d1b-612b-46e3-b99f-cb66cfc9e003" containerID="6e8f58885bc169f9772af5c4bc947eef4345bfb60959ae68f4f25616b6735a1d" exitCode=143
Dec 05 12:10:55 crc kubenswrapper[4763]: I1205 12:10:55.394178 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-758b9f6874-2lcsg" event={"ID":"91ed7d1b-612b-46e3-b99f-cb66cfc9e003","Type":"ContainerDied","Data":"6e8f58885bc169f9772af5c4bc947eef4345bfb60959ae68f4f25616b6735a1d"}
Dec 05 12:10:55 crc kubenswrapper[4763]: I1205 12:10:55.406390 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024","Type":"ContainerStarted","Data":"d20e07f05eb3671b031be2015a996cff0a7d84bd4ade390b40f6663b9fae8637"}
Dec 05 12:10:55 crc kubenswrapper[4763]: I1205 12:10:55.406504 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 05 12:10:55 crc kubenswrapper[4763]: I1205 12:10:55.417121 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4027bf13-4c83-4281-8a0d-d18c6032e0af","Type":"ContainerStarted","Data":"fa000e03b9704be65421bae2835c46d3b00e0b038eb97dd2de2ab141f0d0c4f6"}
Dec 05 12:10:55 crc kubenswrapper[4763]: I1205 12:10:55.433554 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.484280374 podStartE2EDuration="8.43353566s" podCreationTimestamp="2025-12-05 12:10:47 +0000 UTC" firstStartedPulling="2025-12-05 12:10:48.458018874 +0000 UTC m=+1332.950733597" lastFinishedPulling="2025-12-05 12:10:54.40727416 +0000 UTC m=+1338.899988883" observedRunningTime="2025-12-05 12:10:55.42598054 +0000 UTC m=+1339.918695263" watchObservedRunningTime="2025-12-05 12:10:55.43353566 +0000 UTC m=+1339.926250383"
Dec 05 12:10:55 crc kubenswrapper[4763]: I1205 12:10:55.434129 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0"
Dec 05 12:10:56 crc kubenswrapper[4763]: I1205 12:10:56.365076 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Dec 05 12:10:56 crc kubenswrapper[4763]: I1205 12:10:56.428913 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4027bf13-4c83-4281-8a0d-d18c6032e0af","Type":"ContainerStarted","Data":"ea66adf540326983d08442e7ee8fba95f7392761cb5e2eae265a758d963dd9f5"}
Dec 05 12:10:56 crc kubenswrapper[4763]: I1205 12:10:56.469859 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.4698360059999995 podStartE2EDuration="4.469836006s" podCreationTimestamp="2025-12-05 12:10:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:10:56.450876766 +0000 UTC m=+1340.943591529" watchObservedRunningTime="2025-12-05 12:10:56.469836006 +0000 UTC m=+1340.962550729"
Dec 05 12:10:56 crc kubenswrapper[4763]: I1205 12:10:56.486330 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6578955fd5-zhrbr"
Dec 05 12:10:56 crc kubenswrapper[4763]: I1205 12:10:56.569857 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6ffb94d8ff-ls64w"]
Dec 05 12:10:56 crc kubenswrapper[4763]: I1205 12:10:56.570088 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6ffb94d8ff-ls64w" podUID="7d2c0217-45ff-4c92-af09-ece49c97a9d4" containerName="dnsmasq-dns" containerID="cri-o://2b66e805ebbceef3dbf72b749e8693b9cb4b5b5fa94f85b043a6663092ac559a" gracePeriod=10
Dec 05 12:10:56 crc kubenswrapper[4763]: I1205 12:10:56.728108 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Dec 05 12:10:56 crc kubenswrapper[4763]: I1205 12:10:56.783371 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 05 12:10:57 crc kubenswrapper[4763]: I1205 12:10:57.217372 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ffb94d8ff-ls64w"
Dec 05 12:10:57 crc kubenswrapper[4763]: I1205 12:10:57.312969 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xr6b\" (UniqueName: \"kubernetes.io/projected/7d2c0217-45ff-4c92-af09-ece49c97a9d4-kube-api-access-4xr6b\") pod \"7d2c0217-45ff-4c92-af09-ece49c97a9d4\" (UID: \"7d2c0217-45ff-4c92-af09-ece49c97a9d4\") "
Dec 05 12:10:57 crc kubenswrapper[4763]: I1205 12:10:57.313030 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d2c0217-45ff-4c92-af09-ece49c97a9d4-dns-svc\") pod \"7d2c0217-45ff-4c92-af09-ece49c97a9d4\" (UID: \"7d2c0217-45ff-4c92-af09-ece49c97a9d4\") "
Dec 05 12:10:57 crc kubenswrapper[4763]: I1205 12:10:57.313067 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d2c0217-45ff-4c92-af09-ece49c97a9d4-ovsdbserver-sb\") pod \"7d2c0217-45ff-4c92-af09-ece49c97a9d4\" (UID: \"7d2c0217-45ff-4c92-af09-ece49c97a9d4\") "
Dec 05 12:10:57 crc kubenswrapper[4763]: I1205 12:10:57.313081 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d2c0217-45ff-4c92-af09-ece49c97a9d4-ovsdbserver-nb\") pod \"7d2c0217-45ff-4c92-af09-ece49c97a9d4\" (UID: \"7d2c0217-45ff-4c92-af09-ece49c97a9d4\") "
Dec 05 12:10:57 crc kubenswrapper[4763]: I1205 12:10:57.313201 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d2c0217-45ff-4c92-af09-ece49c97a9d4-config\") pod \"7d2c0217-45ff-4c92-af09-ece49c97a9d4\" (UID: \"7d2c0217-45ff-4c92-af09-ece49c97a9d4\") "
Dec 05 12:10:57 crc kubenswrapper[4763]: I1205 12:10:57.333976 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d2c0217-45ff-4c92-af09-ece49c97a9d4-kube-api-access-4xr6b" (OuterVolumeSpecName: "kube-api-access-4xr6b") pod "7d2c0217-45ff-4c92-af09-ece49c97a9d4" (UID: "7d2c0217-45ff-4c92-af09-ece49c97a9d4"). InnerVolumeSpecName "kube-api-access-4xr6b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 12:10:57 crc kubenswrapper[4763]: I1205 12:10:57.371513 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d2c0217-45ff-4c92-af09-ece49c97a9d4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7d2c0217-45ff-4c92-af09-ece49c97a9d4" (UID: "7d2c0217-45ff-4c92-af09-ece49c97a9d4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 12:10:57 crc kubenswrapper[4763]: I1205 12:10:57.372523 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d2c0217-45ff-4c92-af09-ece49c97a9d4-config" (OuterVolumeSpecName: "config") pod "7d2c0217-45ff-4c92-af09-ece49c97a9d4" (UID: "7d2c0217-45ff-4c92-af09-ece49c97a9d4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 12:10:57 crc kubenswrapper[4763]: I1205 12:10:57.388602 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d2c0217-45ff-4c92-af09-ece49c97a9d4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7d2c0217-45ff-4c92-af09-ece49c97a9d4" (UID: "7d2c0217-45ff-4c92-af09-ece49c97a9d4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 12:10:57 crc kubenswrapper[4763]: I1205 12:10:57.398427 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d2c0217-45ff-4c92-af09-ece49c97a9d4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7d2c0217-45ff-4c92-af09-ece49c97a9d4" (UID: "7d2c0217-45ff-4c92-af09-ece49c97a9d4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 12:10:57 crc kubenswrapper[4763]: I1205 12:10:57.415999 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xr6b\" (UniqueName: \"kubernetes.io/projected/7d2c0217-45ff-4c92-af09-ece49c97a9d4-kube-api-access-4xr6b\") on node \"crc\" DevicePath \"\""
Dec 05 12:10:57 crc kubenswrapper[4763]: I1205 12:10:57.416047 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d2c0217-45ff-4c92-af09-ece49c97a9d4-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 05 12:10:57 crc kubenswrapper[4763]: I1205 12:10:57.416065 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d2c0217-45ff-4c92-af09-ece49c97a9d4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 05 12:10:57 crc kubenswrapper[4763]: I1205 12:10:57.416078 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d2c0217-45ff-4c92-af09-ece49c97a9d4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 05 12:10:57 crc kubenswrapper[4763]: I1205 12:10:57.416089 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d2c0217-45ff-4c92-af09-ece49c97a9d4-config\") on node \"crc\" DevicePath \"\""
Dec 05 12:10:57 crc kubenswrapper[4763]: I1205 12:10:57.447269 4763 generic.go:334] "Generic (PLEG): container finished" podID="7d2c0217-45ff-4c92-af09-ece49c97a9d4" containerID="2b66e805ebbceef3dbf72b749e8693b9cb4b5b5fa94f85b043a6663092ac559a" exitCode=0
Dec 05 12:10:57 crc kubenswrapper[4763]: I1205 12:10:57.447419 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffb94d8ff-ls64w" event={"ID":"7d2c0217-45ff-4c92-af09-ece49c97a9d4","Type":"ContainerDied","Data":"2b66e805ebbceef3dbf72b749e8693b9cb4b5b5fa94f85b043a6663092ac559a"}
Dec 05 12:10:57 crc kubenswrapper[4763]: I1205 12:10:57.449629 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Dec 05 12:10:57 crc kubenswrapper[4763]: I1205 12:10:57.449694 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffb94d8ff-ls64w" event={"ID":"7d2c0217-45ff-4c92-af09-ece49c97a9d4","Type":"ContainerDied","Data":"ccc29b8081230c4b9c91f1aeb1da72b754b9f20ba80aae41cea0993c29c3f396"}
Dec 05 12:10:57 crc kubenswrapper[4763]: I1205 12:10:57.447523 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ffb94d8ff-ls64w"
Dec 05 12:10:57 crc kubenswrapper[4763]: I1205 12:10:57.449858 4763 scope.go:117] "RemoveContainer" containerID="2b66e805ebbceef3dbf72b749e8693b9cb4b5b5fa94f85b043a6663092ac559a"
Dec 05 12:10:57 crc kubenswrapper[4763]: I1205 12:10:57.449924 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e387c1cd-02b0-40cc-a958-2443971ae373" containerName="cinder-scheduler" containerID="cri-o://fc45c0c995101a2b2c673c31de378857293cbacfe513c30d5d4749e3fe22eeb4" gracePeriod=30
Dec 05 12:10:57 crc kubenswrapper[4763]: I1205 12:10:57.450095 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e387c1cd-02b0-40cc-a958-2443971ae373" containerName="probe" containerID="cri-o://3481ca6aba7c67025553d2c18bf1c91ec9bda5e34b1d058c0826eee35a353bd4" gracePeriod=30
Dec 05 12:10:57 crc kubenswrapper[4763]: I1205 12:10:57.480608 4763 scope.go:117] "RemoveContainer" containerID="cfe5c8df06903750fd05c4152f6d6bccd9da1375abf00329b72234dcbf0cac28"
Dec 05 12:10:57 crc kubenswrapper[4763]: I1205 12:10:57.493418 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6ffb94d8ff-ls64w"]
Dec 05 12:10:57 crc kubenswrapper[4763]: I1205 12:10:57.504422 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6ffb94d8ff-ls64w"]
Dec 05 12:10:57 crc kubenswrapper[4763]: I1205 12:10:57.518851 4763 scope.go:117] "RemoveContainer" containerID="2b66e805ebbceef3dbf72b749e8693b9cb4b5b5fa94f85b043a6663092ac559a"
Dec 05 12:10:57 crc kubenswrapper[4763]: E1205 12:10:57.523188 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b66e805ebbceef3dbf72b749e8693b9cb4b5b5fa94f85b043a6663092ac559a\": container with ID starting with 2b66e805ebbceef3dbf72b749e8693b9cb4b5b5fa94f85b043a6663092ac559a not found: ID does not exist" containerID="2b66e805ebbceef3dbf72b749e8693b9cb4b5b5fa94f85b043a6663092ac559a"
Dec 05 12:10:57 crc kubenswrapper[4763]: I1205 12:10:57.523330 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b66e805ebbceef3dbf72b749e8693b9cb4b5b5fa94f85b043a6663092ac559a"} err="failed to get container status \"2b66e805ebbceef3dbf72b749e8693b9cb4b5b5fa94f85b043a6663092ac559a\": rpc error: code = NotFound desc = could not find container \"2b66e805ebbceef3dbf72b749e8693b9cb4b5b5fa94f85b043a6663092ac559a\": container with ID starting with 2b66e805ebbceef3dbf72b749e8693b9cb4b5b5fa94f85b043a6663092ac559a not found: ID does not exist"
Dec 05 12:10:57 crc kubenswrapper[4763]: I1205 12:10:57.523480 4763 scope.go:117] "RemoveContainer" containerID="cfe5c8df06903750fd05c4152f6d6bccd9da1375abf00329b72234dcbf0cac28"
Dec 05 12:10:57 crc kubenswrapper[4763]: E1205 12:10:57.524109 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfe5c8df06903750fd05c4152f6d6bccd9da1375abf00329b72234dcbf0cac28\": container with ID starting with cfe5c8df06903750fd05c4152f6d6bccd9da1375abf00329b72234dcbf0cac28 not found: ID does not exist" containerID="cfe5c8df06903750fd05c4152f6d6bccd9da1375abf00329b72234dcbf0cac28"
Dec 05 12:10:57 crc kubenswrapper[4763]: I1205 12:10:57.524128 4763 pod_container_deletor.go:53] "DeleteContainer returned error"
containerID={"Type":"cri-o","ID":"cfe5c8df06903750fd05c4152f6d6bccd9da1375abf00329b72234dcbf0cac28"} err="failed to get container status \"cfe5c8df06903750fd05c4152f6d6bccd9da1375abf00329b72234dcbf0cac28\": rpc error: code = NotFound desc = could not find container \"cfe5c8df06903750fd05c4152f6d6bccd9da1375abf00329b72234dcbf0cac28\": container with ID starting with cfe5c8df06903750fd05c4152f6d6bccd9da1375abf00329b72234dcbf0cac28 not found: ID does not exist" Dec 05 12:10:57 crc kubenswrapper[4763]: I1205 12:10:57.680211 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7fc67b9475-mqldq" Dec 05 12:10:57 crc kubenswrapper[4763]: I1205 12:10:57.801571 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d2c0217-45ff-4c92-af09-ece49c97a9d4" path="/var/lib/kubelet/pods/7d2c0217-45ff-4c92-af09-ece49c97a9d4/volumes" Dec 05 12:10:58 crc kubenswrapper[4763]: I1205 12:10:58.831572 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 05 12:10:58 crc kubenswrapper[4763]: E1205 12:10:58.832380 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d2c0217-45ff-4c92-af09-ece49c97a9d4" containerName="init" Dec 05 12:10:58 crc kubenswrapper[4763]: I1205 12:10:58.832399 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d2c0217-45ff-4c92-af09-ece49c97a9d4" containerName="init" Dec 05 12:10:58 crc kubenswrapper[4763]: E1205 12:10:58.832419 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d2c0217-45ff-4c92-af09-ece49c97a9d4" containerName="dnsmasq-dns" Dec 05 12:10:58 crc kubenswrapper[4763]: I1205 12:10:58.832428 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d2c0217-45ff-4c92-af09-ece49c97a9d4" containerName="dnsmasq-dns" Dec 05 12:10:58 crc kubenswrapper[4763]: I1205 12:10:58.832652 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d2c0217-45ff-4c92-af09-ece49c97a9d4" containerName="dnsmasq-dns" Dec 05 12:10:58 crc kubenswrapper[4763]: I1205 12:10:58.833501 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 05 12:10:58 crc kubenswrapper[4763]: I1205 12:10:58.835562 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 05 12:10:58 crc kubenswrapper[4763]: I1205 12:10:58.841993 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 05 12:10:58 crc kubenswrapper[4763]: I1205 12:10:58.843008 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-9bdkd" Dec 05 12:10:58 crc kubenswrapper[4763]: I1205 12:10:58.867666 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 05 12:10:58 crc kubenswrapper[4763]: I1205 12:10:58.955492 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/60db4a8b-41c1-4ac3-be94-54eedb9fffc5-openstack-config\") pod \"openstackclient\" (UID: \"60db4a8b-41c1-4ac3-be94-54eedb9fffc5\") " pod="openstack/openstackclient" Dec 05 12:10:58 crc kubenswrapper[4763]: I1205 12:10:58.955588 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/60db4a8b-41c1-4ac3-be94-54eedb9fffc5-openstack-config-secret\") pod \"openstackclient\" (UID: \"60db4a8b-41c1-4ac3-be94-54eedb9fffc5\") " pod="openstack/openstackclient" Dec 05 12:10:58 crc kubenswrapper[4763]: I1205 12:10:58.955760 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjn4c\" (UniqueName: \"kubernetes.io/projected/60db4a8b-41c1-4ac3-be94-54eedb9fffc5-kube-api-access-zjn4c\") pod \"openstackclient\" (UID: \"60db4a8b-41c1-4ac3-be94-54eedb9fffc5\") " pod="openstack/openstackclient" Dec 05 12:10:58 crc kubenswrapper[4763]: I1205 12:10:58.955842 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60db4a8b-41c1-4ac3-be94-54eedb9fffc5-combined-ca-bundle\") pod \"openstackclient\" (UID: \"60db4a8b-41c1-4ac3-be94-54eedb9fffc5\") " pod="openstack/openstackclient" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.020846 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 05 12:10:59 crc kubenswrapper[4763]: E1205 12:10:59.021671 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-zjn4c openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/openstackclient" podUID="60db4a8b-41c1-4ac3-be94-54eedb9fffc5" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.029300 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.058730 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.060247 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.062867 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/60db4a8b-41c1-4ac3-be94-54eedb9fffc5-openstack-config\") pod \"openstackclient\" (UID: \"60db4a8b-41c1-4ac3-be94-54eedb9fffc5\") " pod="openstack/openstackclient" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.062998 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/60db4a8b-41c1-4ac3-be94-54eedb9fffc5-openstack-config-secret\") pod \"openstackclient\" (UID: \"60db4a8b-41c1-4ac3-be94-54eedb9fffc5\") " pod="openstack/openstackclient" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.063084 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjn4c\" (UniqueName: \"kubernetes.io/projected/60db4a8b-41c1-4ac3-be94-54eedb9fffc5-kube-api-access-zjn4c\") pod \"openstackclient\" (UID: \"60db4a8b-41c1-4ac3-be94-54eedb9fffc5\") " pod="openstack/openstackclient" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.063112 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60db4a8b-41c1-4ac3-be94-54eedb9fffc5-combined-ca-bundle\") pod \"openstackclient\" (UID: \"60db4a8b-41c1-4ac3-be94-54eedb9fffc5\") " pod="openstack/openstackclient" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.063800 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/60db4a8b-41c1-4ac3-be94-54eedb9fffc5-openstack-config\") pod \"openstackclient\" (UID: \"60db4a8b-41c1-4ac3-be94-54eedb9fffc5\") " pod="openstack/openstackclient" Dec 05 12:10:59 crc kubenswrapper[4763]: E1205 12:10:59.064879 4763 projected.go:194] Error preparing data for projected volume kube-api-access-zjn4c for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (60db4a8b-41c1-4ac3-be94-54eedb9fffc5) does not match the UID in record. The object might have been deleted and then recreated Dec 05 12:10:59 crc kubenswrapper[4763]: E1205 12:10:59.064954 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/60db4a8b-41c1-4ac3-be94-54eedb9fffc5-kube-api-access-zjn4c podName:60db4a8b-41c1-4ac3-be94-54eedb9fffc5 nodeName:}" failed. No retries permitted until 2025-12-05 12:10:59.564938453 +0000 UTC m=+1344.057653176 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-zjn4c" (UniqueName: "kubernetes.io/projected/60db4a8b-41c1-4ac3-be94-54eedb9fffc5-kube-api-access-zjn4c") pod "openstackclient" (UID: "60db4a8b-41c1-4ac3-be94-54eedb9fffc5") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (60db4a8b-41c1-4ac3-be94-54eedb9fffc5) does not match the UID in record. 
The object might have been deleted and then recreated Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.070967 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.072498 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60db4a8b-41c1-4ac3-be94-54eedb9fffc5-combined-ca-bundle\") pod \"openstackclient\" (UID: \"60db4a8b-41c1-4ac3-be94-54eedb9fffc5\") " pod="openstack/openstackclient" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.074378 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/60db4a8b-41c1-4ac3-be94-54eedb9fffc5-openstack-config-secret\") pod \"openstackclient\" (UID: \"60db4a8b-41c1-4ac3-be94-54eedb9fffc5\") " pod="openstack/openstackclient" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.164970 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9e894c53-51db-4ede-9730-b8c68ad6fc15-openstack-config\") pod \"openstackclient\" (UID: \"9e894c53-51db-4ede-9730-b8c68ad6fc15\") " pod="openstack/openstackclient" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.165155 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e894c53-51db-4ede-9730-b8c68ad6fc15-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9e894c53-51db-4ede-9730-b8c68ad6fc15\") " pod="openstack/openstackclient" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.165242 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkr7t\" (UniqueName: \"kubernetes.io/projected/9e894c53-51db-4ede-9730-b8c68ad6fc15-kube-api-access-xkr7t\") pod \"openstackclient\" (UID: \"9e894c53-51db-4ede-9730-b8c68ad6fc15\") " pod="openstack/openstackclient" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.165384 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9e894c53-51db-4ede-9730-b8c68ad6fc15-openstack-config-secret\") pod \"openstackclient\" (UID: \"9e894c53-51db-4ede-9730-b8c68ad6fc15\") " pod="openstack/openstackclient" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.267547 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9e894c53-51db-4ede-9730-b8c68ad6fc15-openstack-config\") pod \"openstackclient\" (UID: \"9e894c53-51db-4ede-9730-b8c68ad6fc15\") " pod="openstack/openstackclient" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.267889 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e894c53-51db-4ede-9730-b8c68ad6fc15-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9e894c53-51db-4ede-9730-b8c68ad6fc15\") " pod="openstack/openstackclient" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.267954 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkr7t\" (UniqueName: \"kubernetes.io/projected/9e894c53-51db-4ede-9730-b8c68ad6fc15-kube-api-access-xkr7t\") pod \"openstackclient\" (UID: 
\"9e894c53-51db-4ede-9730-b8c68ad6fc15\") " pod="openstack/openstackclient" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.268035 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9e894c53-51db-4ede-9730-b8c68ad6fc15-openstack-config-secret\") pod \"openstackclient\" (UID: \"9e894c53-51db-4ede-9730-b8c68ad6fc15\") " pod="openstack/openstackclient" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.268659 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9e894c53-51db-4ede-9730-b8c68ad6fc15-openstack-config\") pod \"openstackclient\" (UID: \"9e894c53-51db-4ede-9730-b8c68ad6fc15\") " pod="openstack/openstackclient" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.274726 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9e894c53-51db-4ede-9730-b8c68ad6fc15-openstack-config-secret\") pod \"openstackclient\" (UID: \"9e894c53-51db-4ede-9730-b8c68ad6fc15\") " pod="openstack/openstackclient" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.275036 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e894c53-51db-4ede-9730-b8c68ad6fc15-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9e894c53-51db-4ede-9730-b8c68ad6fc15\") " pod="openstack/openstackclient" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.296536 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkr7t\" (UniqueName: \"kubernetes.io/projected/9e894c53-51db-4ede-9730-b8c68ad6fc15-kube-api-access-xkr7t\") pod \"openstackclient\" (UID: \"9e894c53-51db-4ede-9730-b8c68ad6fc15\") " pod="openstack/openstackclient" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.468766 4763 generic.go:334] "Generic (PLEG): container finished" podID="e387c1cd-02b0-40cc-a958-2443971ae373" containerID="3481ca6aba7c67025553d2c18bf1c91ec9bda5e34b1d058c0826eee35a353bd4" exitCode=0 Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.468796 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e387c1cd-02b0-40cc-a958-2443971ae373","Type":"ContainerDied","Data":"3481ca6aba7c67025553d2c18bf1c91ec9bda5e34b1d058c0826eee35a353bd4"} Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.468842 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e387c1cd-02b0-40cc-a958-2443971ae373","Type":"ContainerDied","Data":"fc45c0c995101a2b2c673c31de378857293cbacfe513c30d5d4749e3fe22eeb4"} Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.468814 4763 generic.go:334] "Generic (PLEG): container finished" podID="e387c1cd-02b0-40cc-a958-2443971ae373" containerID="fc45c0c995101a2b2c673c31de378857293cbacfe513c30d5d4749e3fe22eeb4" exitCode=0 Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.468902 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.477714 4763 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="60db4a8b-41c1-4ac3-be94-54eedb9fffc5" podUID="9e894c53-51db-4ede-9730-b8c68ad6fc15" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.482303 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.518015 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.557526 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.574519 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e387c1cd-02b0-40cc-a958-2443971ae373-scripts\") pod \"e387c1cd-02b0-40cc-a958-2443971ae373\" (UID: \"e387c1cd-02b0-40cc-a958-2443971ae373\") " Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.574570 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57xbf\" (UniqueName: \"kubernetes.io/projected/e387c1cd-02b0-40cc-a958-2443971ae373-kube-api-access-57xbf\") pod \"e387c1cd-02b0-40cc-a958-2443971ae373\" (UID: \"e387c1cd-02b0-40cc-a958-2443971ae373\") " Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.574599 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e387c1cd-02b0-40cc-a958-2443971ae373-combined-ca-bundle\") pod \"e387c1cd-02b0-40cc-a958-2443971ae373\" (UID: \"e387c1cd-02b0-40cc-a958-2443971ae373\") " Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.574658 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/60db4a8b-41c1-4ac3-be94-54eedb9fffc5-openstack-config\") pod \"60db4a8b-41c1-4ac3-be94-54eedb9fffc5\" (UID: \"60db4a8b-41c1-4ac3-be94-54eedb9fffc5\") " Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.574746 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e387c1cd-02b0-40cc-a958-2443971ae373-config-data-custom\") pod \"e387c1cd-02b0-40cc-a958-2443971ae373\" (UID: \"e387c1cd-02b0-40cc-a958-2443971ae373\") " Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.574854 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e387c1cd-02b0-40cc-a958-2443971ae373-config-data\") pod \"e387c1cd-02b0-40cc-a958-2443971ae373\" (UID: \"e387c1cd-02b0-40cc-a958-2443971ae373\") " Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.574888 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e387c1cd-02b0-40cc-a958-2443971ae373-etc-machine-id\") pod \"e387c1cd-02b0-40cc-a958-2443971ae373\" (UID: \"e387c1cd-02b0-40cc-a958-2443971ae373\") " Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.574933 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/60db4a8b-41c1-4ac3-be94-54eedb9fffc5-combined-ca-bundle\") pod \"60db4a8b-41c1-4ac3-be94-54eedb9fffc5\" (UID: \"60db4a8b-41c1-4ac3-be94-54eedb9fffc5\") " Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.574957 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/60db4a8b-41c1-4ac3-be94-54eedb9fffc5-openstack-config-secret\") pod \"60db4a8b-41c1-4ac3-be94-54eedb9fffc5\" (UID: \"60db4a8b-41c1-4ac3-be94-54eedb9fffc5\") " Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.575431 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjn4c\" (UniqueName: \"kubernetes.io/projected/60db4a8b-41c1-4ac3-be94-54eedb9fffc5-kube-api-access-zjn4c\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.576666 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60db4a8b-41c1-4ac3-be94-54eedb9fffc5-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "60db4a8b-41c1-4ac3-be94-54eedb9fffc5" (UID: "60db4a8b-41c1-4ac3-be94-54eedb9fffc5"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.579575 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60db4a8b-41c1-4ac3-be94-54eedb9fffc5-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "60db4a8b-41c1-4ac3-be94-54eedb9fffc5" (UID: "60db4a8b-41c1-4ac3-be94-54eedb9fffc5"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.579627 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e387c1cd-02b0-40cc-a958-2443971ae373-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e387c1cd-02b0-40cc-a958-2443971ae373" (UID: "e387c1cd-02b0-40cc-a958-2443971ae373"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.582069 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e387c1cd-02b0-40cc-a958-2443971ae373-kube-api-access-57xbf" (OuterVolumeSpecName: "kube-api-access-57xbf") pod "e387c1cd-02b0-40cc-a958-2443971ae373" (UID: "e387c1cd-02b0-40cc-a958-2443971ae373"). InnerVolumeSpecName "kube-api-access-57xbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.582746 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e387c1cd-02b0-40cc-a958-2443971ae373-scripts" (OuterVolumeSpecName: "scripts") pod "e387c1cd-02b0-40cc-a958-2443971ae373" (UID: "e387c1cd-02b0-40cc-a958-2443971ae373"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.588898 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e387c1cd-02b0-40cc-a958-2443971ae373-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e387c1cd-02b0-40cc-a958-2443971ae373" (UID: "e387c1cd-02b0-40cc-a958-2443971ae373"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.589063 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60db4a8b-41c1-4ac3-be94-54eedb9fffc5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60db4a8b-41c1-4ac3-be94-54eedb9fffc5" (UID: "60db4a8b-41c1-4ac3-be94-54eedb9fffc5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.644353 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e387c1cd-02b0-40cc-a958-2443971ae373-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e387c1cd-02b0-40cc-a958-2443971ae373" (UID: "e387c1cd-02b0-40cc-a958-2443971ae373"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.677240 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e387c1cd-02b0-40cc-a958-2443971ae373-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.677621 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57xbf\" (UniqueName: \"kubernetes.io/projected/e387c1cd-02b0-40cc-a958-2443971ae373-kube-api-access-57xbf\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.677670 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e387c1cd-02b0-40cc-a958-2443971ae373-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.677680 4763 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/60db4a8b-41c1-4ac3-be94-54eedb9fffc5-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.677689 4763 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e387c1cd-02b0-40cc-a958-2443971ae373-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.677699 4763 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e387c1cd-02b0-40cc-a958-2443971ae373-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.677708 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60db4a8b-41c1-4ac3-be94-54eedb9fffc5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.677716 4763 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/60db4a8b-41c1-4ac3-be94-54eedb9fffc5-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.681826 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-758b9f6874-2lcsg" podUID="91ed7d1b-612b-46e3-b99f-cb66cfc9e003" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.172:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 12:10:59 crc 
kubenswrapper[4763]: I1205 12:10:59.696025 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e387c1cd-02b0-40cc-a958-2443971ae373-config-data" (OuterVolumeSpecName: "config-data") pod "e387c1cd-02b0-40cc-a958-2443971ae373" (UID: "e387c1cd-02b0-40cc-a958-2443971ae373"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.781077 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e387c1cd-02b0-40cc-a958-2443971ae373-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:10:59 crc kubenswrapper[4763]: I1205 12:10:59.797147 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60db4a8b-41c1-4ac3-be94-54eedb9fffc5" path="/var/lib/kubelet/pods/60db4a8b-41c1-4ac3-be94-54eedb9fffc5/volumes" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.008118 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.010184 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.062743 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-758b9f6874-2lcsg" podUID="91ed7d1b-612b-46e3-b99f-cb66cfc9e003" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.172:9311/healthcheck\": read tcp 10.217.0.2:42068->10.217.0.172:9311: read: connection reset by peer" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.062848 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-758b9f6874-2lcsg" podUID="91ed7d1b-612b-46e3-b99f-cb66cfc9e003" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.172:9311/healthcheck\": read tcp 10.217.0.2:42074->10.217.0.172:9311: read: connection reset by peer" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.062977 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-758b9f6874-2lcsg" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.486588 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"9e894c53-51db-4ede-9730-b8c68ad6fc15","Type":"ContainerStarted","Data":"e9f7304c123288cc3c313213ed5fb40a183bc324661074d1a7c56e67658b4cf4"} Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.491434 4763 generic.go:334] "Generic (PLEG): container finished" podID="91ed7d1b-612b-46e3-b99f-cb66cfc9e003" containerID="230cb231d868037183ea0f6909557c737de8a2c4c2c301798612f96ca1075a0e" exitCode=0 Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.491505 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-758b9f6874-2lcsg" event={"ID":"91ed7d1b-612b-46e3-b99f-cb66cfc9e003","Type":"ContainerDied","Data":"230cb231d868037183ea0f6909557c737de8a2c4c2c301798612f96ca1075a0e"} Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.495133 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.495563 4763 util.go:48] "No ready sandbox for pod can be found. 
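
The barbican-api readiness failures above show one endpoint dying in two stages: first the probe's client-side timeout fires ("context deadline exceeded (Client.Timeout exceeded while awaiting headers)"), then the TCP connection is reset as the container shuts down. That first error string is what Go's http.Client produces when its Timeout elapses. A minimal sketch of an equivalent readiness check, assuming the URL and a 1s timeout rather than reading them from the probe spec:

    package main

    import (
        "fmt"
        "net/http"
        "time"
    )

    // ready performs one HTTP readiness check; 2xx/3xx counts as ready,
    // mirroring kubelet HTTP probe semantics.
    func ready(url string, timeout time.Duration) error {
        client := &http.Client{Timeout: timeout}
        resp, err := client.Get(url)
        if err != nil {
            // e.g. "context deadline exceeded" or "connection reset by peer"
            return err
        }
        defer resp.Body.Close()
        if resp.StatusCode < 200 || resp.StatusCode >= 400 {
            return fmt.Errorf("unhealthy status: %s", resp.Status)
        }
        return nil
    }

    func main() {
        fmt.Println(ready("http://10.217.0.172:9311/healthcheck", time.Second))
    }
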
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.495737 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e387c1cd-02b0-40cc-a958-2443971ae373","Type":"ContainerDied","Data":"a4fd7a6e66018d9d90f720d3264d350af946c22eeb8471f2208d33fcfb269ab2"} Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.495805 4763 scope.go:117] "RemoveContainer" containerID="3481ca6aba7c67025553d2c18bf1c91ec9bda5e34b1d058c0826eee35a353bd4" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.505278 4763 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="60db4a8b-41c1-4ac3-be94-54eedb9fffc5" podUID="9e894c53-51db-4ede-9730-b8c68ad6fc15" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.539703 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.556238 4763 scope.go:117] "RemoveContainer" containerID="fc45c0c995101a2b2c673c31de378857293cbacfe513c30d5d4749e3fe22eeb4" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.611341 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.621851 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 12:11:00 crc kubenswrapper[4763]: E1205 12:11:00.622415 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e387c1cd-02b0-40cc-a958-2443971ae373" containerName="cinder-scheduler" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.622441 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e387c1cd-02b0-40cc-a958-2443971ae373" containerName="cinder-scheduler" Dec 05 12:11:00 crc kubenswrapper[4763]: E1205 12:11:00.622457 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e387c1cd-02b0-40cc-a958-2443971ae373" containerName="probe" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.622464 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e387c1cd-02b0-40cc-a958-2443971ae373" containerName="probe" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.622685 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e387c1cd-02b0-40cc-a958-2443971ae373" containerName="cinder-scheduler" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.622730 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e387c1cd-02b0-40cc-a958-2443971ae373" containerName="probe" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.624124 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.628299 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.628907 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.700336 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd14a5e3-8def-4bc7-b375-8ae87dd75838-config-data\") pod \"cinder-scheduler-0\" (UID: \"fd14a5e3-8def-4bc7-b375-8ae87dd75838\") " pod="openstack/cinder-scheduler-0" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.700600 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd14a5e3-8def-4bc7-b375-8ae87dd75838-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fd14a5e3-8def-4bc7-b375-8ae87dd75838\") " pod="openstack/cinder-scheduler-0" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.700891 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd14a5e3-8def-4bc7-b375-8ae87dd75838-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fd14a5e3-8def-4bc7-b375-8ae87dd75838\") " pod="openstack/cinder-scheduler-0" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.701090 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krbtm\" (UniqueName: \"kubernetes.io/projected/fd14a5e3-8def-4bc7-b375-8ae87dd75838-kube-api-access-krbtm\") pod \"cinder-scheduler-0\" (UID: \"fd14a5e3-8def-4bc7-b375-8ae87dd75838\") " pod="openstack/cinder-scheduler-0" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.701229 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd14a5e3-8def-4bc7-b375-8ae87dd75838-scripts\") pod \"cinder-scheduler-0\" (UID: \"fd14a5e3-8def-4bc7-b375-8ae87dd75838\") " pod="openstack/cinder-scheduler-0" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.701354 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fd14a5e3-8def-4bc7-b375-8ae87dd75838-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fd14a5e3-8def-4bc7-b375-8ae87dd75838\") " pod="openstack/cinder-scheduler-0" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.737988 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-758b9f6874-2lcsg" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.802393 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91ed7d1b-612b-46e3-b99f-cb66cfc9e003-combined-ca-bundle\") pod \"91ed7d1b-612b-46e3-b99f-cb66cfc9e003\" (UID: \"91ed7d1b-612b-46e3-b99f-cb66cfc9e003\") " Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.802758 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91ed7d1b-612b-46e3-b99f-cb66cfc9e003-logs\") pod \"91ed7d1b-612b-46e3-b99f-cb66cfc9e003\" (UID: \"91ed7d1b-612b-46e3-b99f-cb66cfc9e003\") " Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.802917 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpcrp\" (UniqueName: \"kubernetes.io/projected/91ed7d1b-612b-46e3-b99f-cb66cfc9e003-kube-api-access-cpcrp\") pod \"91ed7d1b-612b-46e3-b99f-cb66cfc9e003\" (UID: \"91ed7d1b-612b-46e3-b99f-cb66cfc9e003\") " Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.803034 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91ed7d1b-612b-46e3-b99f-cb66cfc9e003-config-data-custom\") pod \"91ed7d1b-612b-46e3-b99f-cb66cfc9e003\" (UID: \"91ed7d1b-612b-46e3-b99f-cb66cfc9e003\") " Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.803163 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91ed7d1b-612b-46e3-b99f-cb66cfc9e003-config-data\") pod \"91ed7d1b-612b-46e3-b99f-cb66cfc9e003\" (UID: \"91ed7d1b-612b-46e3-b99f-cb66cfc9e003\") " Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.803621 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krbtm\" (UniqueName: \"kubernetes.io/projected/fd14a5e3-8def-4bc7-b375-8ae87dd75838-kube-api-access-krbtm\") pod \"cinder-scheduler-0\" (UID: \"fd14a5e3-8def-4bc7-b375-8ae87dd75838\") " pod="openstack/cinder-scheduler-0" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.803762 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd14a5e3-8def-4bc7-b375-8ae87dd75838-scripts\") pod \"cinder-scheduler-0\" (UID: \"fd14a5e3-8def-4bc7-b375-8ae87dd75838\") " pod="openstack/cinder-scheduler-0" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.804029 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fd14a5e3-8def-4bc7-b375-8ae87dd75838-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fd14a5e3-8def-4bc7-b375-8ae87dd75838\") " pod="openstack/cinder-scheduler-0" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.804172 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd14a5e3-8def-4bc7-b375-8ae87dd75838-config-data\") pod \"cinder-scheduler-0\" (UID: \"fd14a5e3-8def-4bc7-b375-8ae87dd75838\") " pod="openstack/cinder-scheduler-0" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.804264 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/fd14a5e3-8def-4bc7-b375-8ae87dd75838-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fd14a5e3-8def-4bc7-b375-8ae87dd75838\") " pod="openstack/cinder-scheduler-0" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.804374 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd14a5e3-8def-4bc7-b375-8ae87dd75838-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fd14a5e3-8def-4bc7-b375-8ae87dd75838\") " pod="openstack/cinder-scheduler-0" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.804638 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91ed7d1b-612b-46e3-b99f-cb66cfc9e003-logs" (OuterVolumeSpecName: "logs") pod "91ed7d1b-612b-46e3-b99f-cb66cfc9e003" (UID: "91ed7d1b-612b-46e3-b99f-cb66cfc9e003"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.805537 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fd14a5e3-8def-4bc7-b375-8ae87dd75838-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fd14a5e3-8def-4bc7-b375-8ae87dd75838\") " pod="openstack/cinder-scheduler-0" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.818506 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91ed7d1b-612b-46e3-b99f-cb66cfc9e003-kube-api-access-cpcrp" (OuterVolumeSpecName: "kube-api-access-cpcrp") pod "91ed7d1b-612b-46e3-b99f-cb66cfc9e003" (UID: "91ed7d1b-612b-46e3-b99f-cb66cfc9e003"). InnerVolumeSpecName "kube-api-access-cpcrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.819064 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd14a5e3-8def-4bc7-b375-8ae87dd75838-scripts\") pod \"cinder-scheduler-0\" (UID: \"fd14a5e3-8def-4bc7-b375-8ae87dd75838\") " pod="openstack/cinder-scheduler-0" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.822701 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91ed7d1b-612b-46e3-b99f-cb66cfc9e003-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "91ed7d1b-612b-46e3-b99f-cb66cfc9e003" (UID: "91ed7d1b-612b-46e3-b99f-cb66cfc9e003"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.823376 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd14a5e3-8def-4bc7-b375-8ae87dd75838-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fd14a5e3-8def-4bc7-b375-8ae87dd75838\") " pod="openstack/cinder-scheduler-0" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.824116 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krbtm\" (UniqueName: \"kubernetes.io/projected/fd14a5e3-8def-4bc7-b375-8ae87dd75838-kube-api-access-krbtm\") pod \"cinder-scheduler-0\" (UID: \"fd14a5e3-8def-4bc7-b375-8ae87dd75838\") " pod="openstack/cinder-scheduler-0" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.827021 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd14a5e3-8def-4bc7-b375-8ae87dd75838-config-data\") pod \"cinder-scheduler-0\" (UID: \"fd14a5e3-8def-4bc7-b375-8ae87dd75838\") " pod="openstack/cinder-scheduler-0" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.838678 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd14a5e3-8def-4bc7-b375-8ae87dd75838-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fd14a5e3-8def-4bc7-b375-8ae87dd75838\") " pod="openstack/cinder-scheduler-0" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.851430 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91ed7d1b-612b-46e3-b99f-cb66cfc9e003-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91ed7d1b-612b-46e3-b99f-cb66cfc9e003" (UID: "91ed7d1b-612b-46e3-b99f-cb66cfc9e003"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.906071 4763 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91ed7d1b-612b-46e3-b99f-cb66cfc9e003-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.906105 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91ed7d1b-612b-46e3-b99f-cb66cfc9e003-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.906113 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91ed7d1b-612b-46e3-b99f-cb66cfc9e003-logs\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.906124 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpcrp\" (UniqueName: \"kubernetes.io/projected/91ed7d1b-612b-46e3-b99f-cb66cfc9e003-kube-api-access-cpcrp\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:00 crc kubenswrapper[4763]: I1205 12:11:00.910029 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91ed7d1b-612b-46e3-b99f-cb66cfc9e003-config-data" (OuterVolumeSpecName: "config-data") pod "91ed7d1b-612b-46e3-b99f-cb66cfc9e003" (UID: "91ed7d1b-612b-46e3-b99f-cb66cfc9e003"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:11:01 crc kubenswrapper[4763]: I1205 12:11:01.007855 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91ed7d1b-612b-46e3-b99f-cb66cfc9e003-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:01 crc kubenswrapper[4763]: I1205 12:11:01.049014 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 12:11:01 crc kubenswrapper[4763]: I1205 12:11:01.507616 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-758b9f6874-2lcsg" event={"ID":"91ed7d1b-612b-46e3-b99f-cb66cfc9e003","Type":"ContainerDied","Data":"bd2a91405b594db5d486f8d5427ee117b21192361eff7d5d0e1d0e232255dae8"} Dec 05 12:11:01 crc kubenswrapper[4763]: I1205 12:11:01.507992 4763 scope.go:117] "RemoveContainer" containerID="230cb231d868037183ea0f6909557c737de8a2c4c2c301798612f96ca1075a0e" Dec 05 12:11:01 crc kubenswrapper[4763]: I1205 12:11:01.507669 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-758b9f6874-2lcsg" Dec 05 12:11:01 crc kubenswrapper[4763]: I1205 12:11:01.544941 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-758b9f6874-2lcsg"] Dec 05 12:11:01 crc kubenswrapper[4763]: I1205 12:11:01.548893 4763 scope.go:117] "RemoveContainer" containerID="6e8f58885bc169f9772af5c4bc947eef4345bfb60959ae68f4f25616b6735a1d" Dec 05 12:11:01 crc kubenswrapper[4763]: I1205 12:11:01.554975 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-758b9f6874-2lcsg"] Dec 05 12:11:01 crc kubenswrapper[4763]: I1205 12:11:01.569314 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 12:11:01 crc kubenswrapper[4763]: I1205 12:11:01.797419 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91ed7d1b-612b-46e3-b99f-cb66cfc9e003" path="/var/lib/kubelet/pods/91ed7d1b-612b-46e3-b99f-cb66cfc9e003/volumes" Dec 05 12:11:01 crc kubenswrapper[4763]: I1205 12:11:01.798291 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e387c1cd-02b0-40cc-a958-2443971ae373" path="/var/lib/kubelet/pods/e387c1cd-02b0-40cc-a958-2443971ae373/volumes" Dec 05 12:11:02 crc kubenswrapper[4763]: I1205 12:11:02.535453 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fd14a5e3-8def-4bc7-b375-8ae87dd75838","Type":"ContainerStarted","Data":"5521a372ea62108ad3b9dd9ace41b33f77cc10eeb13199c829a0e84421544ec6"} Dec 05 12:11:02 crc kubenswrapper[4763]: I1205 12:11:02.536485 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fd14a5e3-8def-4bc7-b375-8ae87dd75838","Type":"ContainerStarted","Data":"74f7d922202ae6997b7ed9f9d80236e5d2154da79d2fdd67d09e0dbf95f1461e"} Dec 05 12:11:03 crc kubenswrapper[4763]: I1205 12:11:03.570880 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fd14a5e3-8def-4bc7-b375-8ae87dd75838","Type":"ContainerStarted","Data":"3b29af7dc063eea6a1fdd4e5d7c1568bf36b5b1e5f8f2fc8d4f6910de4dc16cd"} Dec 05 12:11:05 crc kubenswrapper[4763]: I1205 12:11:05.064966 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.064941961 podStartE2EDuration="5.064941961s" podCreationTimestamp="2025-12-05 12:11:00 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:11:03.592164928 +0000 UTC m=+1348.084879671" watchObservedRunningTime="2025-12-05 12:11:05.064941961 +0000 UTC m=+1349.557656684" Dec 05 12:11:05 crc kubenswrapper[4763]: I1205 12:11:05.068606 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5c4499b47f-s4mh4"] Dec 05 12:11:05 crc kubenswrapper[4763]: E1205 12:11:05.069313 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91ed7d1b-612b-46e3-b99f-cb66cfc9e003" containerName="barbican-api" Dec 05 12:11:05 crc kubenswrapper[4763]: I1205 12:11:05.069331 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="91ed7d1b-612b-46e3-b99f-cb66cfc9e003" containerName="barbican-api" Dec 05 12:11:05 crc kubenswrapper[4763]: E1205 12:11:05.069350 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91ed7d1b-612b-46e3-b99f-cb66cfc9e003" containerName="barbican-api-log" Dec 05 12:11:05 crc kubenswrapper[4763]: I1205 12:11:05.069356 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="91ed7d1b-612b-46e3-b99f-cb66cfc9e003" containerName="barbican-api-log" Dec 05 12:11:05 crc kubenswrapper[4763]: I1205 12:11:05.069551 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="91ed7d1b-612b-46e3-b99f-cb66cfc9e003" containerName="barbican-api" Dec 05 12:11:05 crc kubenswrapper[4763]: I1205 12:11:05.069569 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="91ed7d1b-612b-46e3-b99f-cb66cfc9e003" containerName="barbican-api-log" Dec 05 12:11:05 crc kubenswrapper[4763]: I1205 12:11:05.070560 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5c4499b47f-s4mh4" Dec 05 12:11:05 crc kubenswrapper[4763]: I1205 12:11:05.073147 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 05 12:11:05 crc kubenswrapper[4763]: I1205 12:11:05.073749 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 05 12:11:05 crc kubenswrapper[4763]: I1205 12:11:05.073856 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 05 12:11:05 crc kubenswrapper[4763]: I1205 12:11:05.080440 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5c4499b47f-s4mh4"] Dec 05 12:11:05 crc kubenswrapper[4763]: I1205 12:11:05.196143 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe2a82f8-601f-42ea-a495-4d1a03084267-internal-tls-certs\") pod \"swift-proxy-5c4499b47f-s4mh4\" (UID: \"fe2a82f8-601f-42ea-a495-4d1a03084267\") " pod="openstack/swift-proxy-5c4499b47f-s4mh4" Dec 05 12:11:05 crc kubenswrapper[4763]: I1205 12:11:05.196432 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dcxz\" (UniqueName: \"kubernetes.io/projected/fe2a82f8-601f-42ea-a495-4d1a03084267-kube-api-access-2dcxz\") pod \"swift-proxy-5c4499b47f-s4mh4\" (UID: \"fe2a82f8-601f-42ea-a495-4d1a03084267\") " pod="openstack/swift-proxy-5c4499b47f-s4mh4" Dec 05 12:11:05 crc kubenswrapper[4763]: I1205 12:11:05.196610 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fe2a82f8-601f-42ea-a495-4d1a03084267-public-tls-certs\") pod \"swift-proxy-5c4499b47f-s4mh4\" (UID: \"fe2a82f8-601f-42ea-a495-4d1a03084267\") " pod="openstack/swift-proxy-5c4499b47f-s4mh4" Dec 05 12:11:05 crc kubenswrapper[4763]: I1205 12:11:05.196672 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe2a82f8-601f-42ea-a495-4d1a03084267-log-httpd\") pod \"swift-proxy-5c4499b47f-s4mh4\" (UID: \"fe2a82f8-601f-42ea-a495-4d1a03084267\") " pod="openstack/swift-proxy-5c4499b47f-s4mh4" Dec 05 12:11:05 crc kubenswrapper[4763]: I1205 12:11:05.196738 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe2a82f8-601f-42ea-a495-4d1a03084267-combined-ca-bundle\") pod \"swift-proxy-5c4499b47f-s4mh4\" (UID: \"fe2a82f8-601f-42ea-a495-4d1a03084267\") " pod="openstack/swift-proxy-5c4499b47f-s4mh4" Dec 05 12:11:05 crc kubenswrapper[4763]: I1205 12:11:05.196916 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe2a82f8-601f-42ea-a495-4d1a03084267-run-httpd\") pod \"swift-proxy-5c4499b47f-s4mh4\" (UID: \"fe2a82f8-601f-42ea-a495-4d1a03084267\") " pod="openstack/swift-proxy-5c4499b47f-s4mh4" Dec 05 12:11:05 crc kubenswrapper[4763]: I1205 12:11:05.197048 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe2a82f8-601f-42ea-a495-4d1a03084267-config-data\") pod \"swift-proxy-5c4499b47f-s4mh4\" (UID: \"fe2a82f8-601f-42ea-a495-4d1a03084267\") " pod="openstack/swift-proxy-5c4499b47f-s4mh4" Dec 05 12:11:05 crc kubenswrapper[4763]: I1205 12:11:05.197148 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fe2a82f8-601f-42ea-a495-4d1a03084267-etc-swift\") pod \"swift-proxy-5c4499b47f-s4mh4\" (UID: \"fe2a82f8-601f-42ea-a495-4d1a03084267\") " pod="openstack/swift-proxy-5c4499b47f-s4mh4" Dec 05 12:11:05 crc kubenswrapper[4763]: I1205 12:11:05.299001 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe2a82f8-601f-42ea-a495-4d1a03084267-internal-tls-certs\") pod \"swift-proxy-5c4499b47f-s4mh4\" (UID: \"fe2a82f8-601f-42ea-a495-4d1a03084267\") " pod="openstack/swift-proxy-5c4499b47f-s4mh4" Dec 05 12:11:05 crc kubenswrapper[4763]: I1205 12:11:05.299075 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dcxz\" (UniqueName: \"kubernetes.io/projected/fe2a82f8-601f-42ea-a495-4d1a03084267-kube-api-access-2dcxz\") pod \"swift-proxy-5c4499b47f-s4mh4\" (UID: \"fe2a82f8-601f-42ea-a495-4d1a03084267\") " pod="openstack/swift-proxy-5c4499b47f-s4mh4" Dec 05 12:11:05 crc kubenswrapper[4763]: I1205 12:11:05.299143 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe2a82f8-601f-42ea-a495-4d1a03084267-public-tls-certs\") pod \"swift-proxy-5c4499b47f-s4mh4\" (UID: \"fe2a82f8-601f-42ea-a495-4d1a03084267\") " pod="openstack/swift-proxy-5c4499b47f-s4mh4" Dec 05 12:11:05 crc kubenswrapper[4763]: I1205 12:11:05.299163 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe2a82f8-601f-42ea-a495-4d1a03084267-log-httpd\") pod \"swift-proxy-5c4499b47f-s4mh4\" (UID: \"fe2a82f8-601f-42ea-a495-4d1a03084267\") " pod="openstack/swift-proxy-5c4499b47f-s4mh4" Dec 05 12:11:05 crc kubenswrapper[4763]: I1205 12:11:05.299183 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe2a82f8-601f-42ea-a495-4d1a03084267-combined-ca-bundle\") pod \"swift-proxy-5c4499b47f-s4mh4\" (UID: \"fe2a82f8-601f-42ea-a495-4d1a03084267\") " pod="openstack/swift-proxy-5c4499b47f-s4mh4" Dec 05 12:11:05 crc kubenswrapper[4763]: I1205 12:11:05.299213 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe2a82f8-601f-42ea-a495-4d1a03084267-run-httpd\") pod \"swift-proxy-5c4499b47f-s4mh4\" (UID: \"fe2a82f8-601f-42ea-a495-4d1a03084267\") " pod="openstack/swift-proxy-5c4499b47f-s4mh4" Dec 05 12:11:05 crc kubenswrapper[4763]: I1205 12:11:05.299241 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe2a82f8-601f-42ea-a495-4d1a03084267-config-data\") pod \"swift-proxy-5c4499b47f-s4mh4\" (UID: \"fe2a82f8-601f-42ea-a495-4d1a03084267\") " pod="openstack/swift-proxy-5c4499b47f-s4mh4" Dec 05 12:11:05 crc kubenswrapper[4763]: I1205 12:11:05.299270 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fe2a82f8-601f-42ea-a495-4d1a03084267-etc-swift\") pod \"swift-proxy-5c4499b47f-s4mh4\" (UID: \"fe2a82f8-601f-42ea-a495-4d1a03084267\") " pod="openstack/swift-proxy-5c4499b47f-s4mh4" Dec 05 12:11:05 crc kubenswrapper[4763]: I1205 12:11:05.300611 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe2a82f8-601f-42ea-a495-4d1a03084267-run-httpd\") pod \"swift-proxy-5c4499b47f-s4mh4\" (UID: \"fe2a82f8-601f-42ea-a495-4d1a03084267\") " pod="openstack/swift-proxy-5c4499b47f-s4mh4" Dec 05 12:11:05 crc kubenswrapper[4763]: I1205 12:11:05.301065 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe2a82f8-601f-42ea-a495-4d1a03084267-log-httpd\") pod \"swift-proxy-5c4499b47f-s4mh4\" (UID: \"fe2a82f8-601f-42ea-a495-4d1a03084267\") " pod="openstack/swift-proxy-5c4499b47f-s4mh4" Dec 05 12:11:05 crc kubenswrapper[4763]: I1205 12:11:05.308503 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe2a82f8-601f-42ea-a495-4d1a03084267-combined-ca-bundle\") pod \"swift-proxy-5c4499b47f-s4mh4\" (UID: \"fe2a82f8-601f-42ea-a495-4d1a03084267\") " pod="openstack/swift-proxy-5c4499b47f-s4mh4" Dec 05 12:11:05 crc kubenswrapper[4763]: I1205 12:11:05.312872 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe2a82f8-601f-42ea-a495-4d1a03084267-internal-tls-certs\") pod \"swift-proxy-5c4499b47f-s4mh4\" (UID: \"fe2a82f8-601f-42ea-a495-4d1a03084267\") " pod="openstack/swift-proxy-5c4499b47f-s4mh4" Dec 05 12:11:05 crc kubenswrapper[4763]: I1205 12:11:05.321239 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe2a82f8-601f-42ea-a495-4d1a03084267-public-tls-certs\") pod 
\"swift-proxy-5c4499b47f-s4mh4\" (UID: \"fe2a82f8-601f-42ea-a495-4d1a03084267\") " pod="openstack/swift-proxy-5c4499b47f-s4mh4" Dec 05 12:11:05 crc kubenswrapper[4763]: I1205 12:11:05.321414 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fe2a82f8-601f-42ea-a495-4d1a03084267-etc-swift\") pod \"swift-proxy-5c4499b47f-s4mh4\" (UID: \"fe2a82f8-601f-42ea-a495-4d1a03084267\") " pod="openstack/swift-proxy-5c4499b47f-s4mh4" Dec 05 12:11:05 crc kubenswrapper[4763]: I1205 12:11:05.321435 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe2a82f8-601f-42ea-a495-4d1a03084267-config-data\") pod \"swift-proxy-5c4499b47f-s4mh4\" (UID: \"fe2a82f8-601f-42ea-a495-4d1a03084267\") " pod="openstack/swift-proxy-5c4499b47f-s4mh4" Dec 05 12:11:05 crc kubenswrapper[4763]: I1205 12:11:05.322136 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dcxz\" (UniqueName: \"kubernetes.io/projected/fe2a82f8-601f-42ea-a495-4d1a03084267-kube-api-access-2dcxz\") pod \"swift-proxy-5c4499b47f-s4mh4\" (UID: \"fe2a82f8-601f-42ea-a495-4d1a03084267\") " pod="openstack/swift-proxy-5c4499b47f-s4mh4" Dec 05 12:11:05 crc kubenswrapper[4763]: I1205 12:11:05.323013 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 05 12:11:05 crc kubenswrapper[4763]: I1205 12:11:05.405435 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5c4499b47f-s4mh4" Dec 05 12:11:06 crc kubenswrapper[4763]: I1205 12:11:06.002340 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 12:11:06 crc kubenswrapper[4763]: I1205 12:11:06.002854 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="387bbd61-fa4c-49e7-b5b8-d7c0b56e8024" containerName="ceilometer-central-agent" containerID="cri-o://2a493d78b9a0e23dc0ad9607673518f834848814c3799e79e9615799e013155f" gracePeriod=30 Dec 05 12:11:06 crc kubenswrapper[4763]: I1205 12:11:06.003191 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="387bbd61-fa4c-49e7-b5b8-d7c0b56e8024" containerName="sg-core" containerID="cri-o://7a5aa79d6ff40c4c2bd4586bfcb50c18f7922c73abe58623b83e4277778fb853" gracePeriod=30 Dec 05 12:11:06 crc kubenswrapper[4763]: I1205 12:11:06.003229 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="387bbd61-fa4c-49e7-b5b8-d7c0b56e8024" containerName="ceilometer-notification-agent" containerID="cri-o://3544d9156ebffb971fcc4e7d83f5f8fa0df0a4c1c8f6f691ce005c667e3e2467" gracePeriod=30 Dec 05 12:11:06 crc kubenswrapper[4763]: I1205 12:11:06.003191 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="387bbd61-fa4c-49e7-b5b8-d7c0b56e8024" containerName="proxy-httpd" containerID="cri-o://d20e07f05eb3671b031be2015a996cff0a7d84bd4ade390b40f6663b9fae8637" gracePeriod=30 Dec 05 12:11:06 crc kubenswrapper[4763]: I1205 12:11:06.008945 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 05 12:11:06 crc kubenswrapper[4763]: I1205 12:11:06.049150 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 05 12:11:06 crc kubenswrapper[4763]: I1205 
12:11:06.199808 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 12:11:06 crc kubenswrapper[4763]: I1205 12:11:06.200091 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e680db34-6f7c-4e72-8015-368c51bb34b0" containerName="glance-log" containerID="cri-o://9ee7f0116aec2115bfd52c5b0bd68412ad360aa2366959820ade81f6c5c53e48" gracePeriod=30 Dec 05 12:11:06 crc kubenswrapper[4763]: I1205 12:11:06.200285 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e680db34-6f7c-4e72-8015-368c51bb34b0" containerName="glance-httpd" containerID="cri-o://13075e4233d39d8a765a3b36d2e3a81e1ec2314c209cc3d5ab6cff59152f353f" gracePeriod=30 Dec 05 12:11:06 crc kubenswrapper[4763]: I1205 12:11:06.620578 4763 generic.go:334] "Generic (PLEG): container finished" podID="e680db34-6f7c-4e72-8015-368c51bb34b0" containerID="9ee7f0116aec2115bfd52c5b0bd68412ad360aa2366959820ade81f6c5c53e48" exitCode=143 Dec 05 12:11:06 crc kubenswrapper[4763]: I1205 12:11:06.620631 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e680db34-6f7c-4e72-8015-368c51bb34b0","Type":"ContainerDied","Data":"9ee7f0116aec2115bfd52c5b0bd68412ad360aa2366959820ade81f6c5c53e48"} Dec 05 12:11:06 crc kubenswrapper[4763]: I1205 12:11:06.624988 4763 generic.go:334] "Generic (PLEG): container finished" podID="387bbd61-fa4c-49e7-b5b8-d7c0b56e8024" containerID="d20e07f05eb3671b031be2015a996cff0a7d84bd4ade390b40f6663b9fae8637" exitCode=0 Dec 05 12:11:06 crc kubenswrapper[4763]: I1205 12:11:06.625015 4763 generic.go:334] "Generic (PLEG): container finished" podID="387bbd61-fa4c-49e7-b5b8-d7c0b56e8024" containerID="7a5aa79d6ff40c4c2bd4586bfcb50c18f7922c73abe58623b83e4277778fb853" exitCode=2 Dec 05 12:11:06 crc kubenswrapper[4763]: I1205 12:11:06.625022 4763 generic.go:334] "Generic (PLEG): container finished" podID="387bbd61-fa4c-49e7-b5b8-d7c0b56e8024" containerID="2a493d78b9a0e23dc0ad9607673518f834848814c3799e79e9615799e013155f" exitCode=0 Dec 05 12:11:06 crc kubenswrapper[4763]: I1205 12:11:06.625040 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024","Type":"ContainerDied","Data":"d20e07f05eb3671b031be2015a996cff0a7d84bd4ade390b40f6663b9fae8637"} Dec 05 12:11:06 crc kubenswrapper[4763]: I1205 12:11:06.625064 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024","Type":"ContainerDied","Data":"7a5aa79d6ff40c4c2bd4586bfcb50c18f7922c73abe58623b83e4277778fb853"} Dec 05 12:11:06 crc kubenswrapper[4763]: I1205 12:11:06.625072 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024","Type":"ContainerDied","Data":"2a493d78b9a0e23dc0ad9607673518f834848814c3799e79e9615799e013155f"} Dec 05 12:11:07 crc kubenswrapper[4763]: I1205 12:11:07.544526 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 12:11:07 crc kubenswrapper[4763]: I1205 12:11:07.544803 4763 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 12:11:07 crc kubenswrapper[4763]: I1205 12:11:07.918095 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 12:11:07 crc kubenswrapper[4763]: I1205 12:11:07.918611 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="06f0e50f-35b7-441d-a630-0655b4c1cd00" containerName="glance-log" containerID="cri-o://fada1b6bd3de60187d25cd6ce64a503d42a5a289412a9ad93e63575c6aecf799" gracePeriod=30 Dec 05 12:11:07 crc kubenswrapper[4763]: I1205 12:11:07.918950 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="06f0e50f-35b7-441d-a630-0655b4c1cd00" containerName="glance-httpd" containerID="cri-o://7442313666db8517b1aed2091d3065563ee73742e77582ff15eae44d27a742bb" gracePeriod=30 Dec 05 12:11:08 crc kubenswrapper[4763]: I1205 12:11:08.649242 4763 generic.go:334] "Generic (PLEG): container finished" podID="06f0e50f-35b7-441d-a630-0655b4c1cd00" containerID="fada1b6bd3de60187d25cd6ce64a503d42a5a289412a9ad93e63575c6aecf799" exitCode=143 Dec 05 12:11:08 crc kubenswrapper[4763]: I1205 12:11:08.649283 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"06f0e50f-35b7-441d-a630-0655b4c1cd00","Type":"ContainerDied","Data":"fada1b6bd3de60187d25cd6ce64a503d42a5a289412a9ad93e63575c6aecf799"} Dec 05 12:11:09 crc kubenswrapper[4763]: I1205 12:11:09.662549 4763 generic.go:334] "Generic (PLEG): container finished" podID="e680db34-6f7c-4e72-8015-368c51bb34b0" containerID="13075e4233d39d8a765a3b36d2e3a81e1ec2314c209cc3d5ab6cff59152f353f" exitCode=0 Dec 05 12:11:09 crc kubenswrapper[4763]: I1205 12:11:09.662652 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e680db34-6f7c-4e72-8015-368c51bb34b0","Type":"ContainerDied","Data":"13075e4233d39d8a765a3b36d2e3a81e1ec2314c209cc3d5ab6cff59152f353f"} Dec 05 12:11:10 crc kubenswrapper[4763]: I1205 12:11:10.688469 4763 generic.go:334] "Generic (PLEG): container finished" podID="387bbd61-fa4c-49e7-b5b8-d7c0b56e8024" containerID="3544d9156ebffb971fcc4e7d83f5f8fa0df0a4c1c8f6f691ce005c667e3e2467" exitCode=0 Dec 05 12:11:10 crc kubenswrapper[4763]: I1205 12:11:10.688708 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024","Type":"ContainerDied","Data":"3544d9156ebffb971fcc4e7d83f5f8fa0df0a4c1c8f6f691ce005c667e3e2467"} Dec 05 12:11:10 crc kubenswrapper[4763]: I1205 12:11:10.802502 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-6qwfq"] Dec 05 12:11:10 crc kubenswrapper[4763]: I1205 12:11:10.805458 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-6qwfq" Dec 05 12:11:10 crc kubenswrapper[4763]: I1205 12:11:10.848817 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-6qwfq"] Dec 05 12:11:10 crc kubenswrapper[4763]: I1205 12:11:10.910084 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-j9pcn"] Dec 05 12:11:10 crc kubenswrapper[4763]: I1205 12:11:10.911165 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ba97fe5-94ae-43d2-b059-524ead71f164-operator-scripts\") pod \"nova-api-db-create-6qwfq\" (UID: \"0ba97fe5-94ae-43d2-b059-524ead71f164\") " pod="openstack/nova-api-db-create-6qwfq" Dec 05 12:11:10 crc kubenswrapper[4763]: I1205 12:11:10.911287 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldq2z\" (UniqueName: \"kubernetes.io/projected/0ba97fe5-94ae-43d2-b059-524ead71f164-kube-api-access-ldq2z\") pod \"nova-api-db-create-6qwfq\" (UID: \"0ba97fe5-94ae-43d2-b059-524ead71f164\") " pod="openstack/nova-api-db-create-6qwfq" Dec 05 12:11:10 crc kubenswrapper[4763]: I1205 12:11:10.918888 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-j9pcn" Dec 05 12:11:10 crc kubenswrapper[4763]: I1205 12:11:10.929622 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-3545-account-create-update-k6q46"] Dec 05 12:11:10 crc kubenswrapper[4763]: I1205 12:11:10.931125 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-3545-account-create-update-k6q46" Dec 05 12:11:10 crc kubenswrapper[4763]: I1205 12:11:10.935565 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 05 12:11:10 crc kubenswrapper[4763]: I1205 12:11:10.940122 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-j9pcn"] Dec 05 12:11:10 crc kubenswrapper[4763]: I1205 12:11:10.954338 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-3545-account-create-update-k6q46"] Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.013732 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aec4d7da-472f-449f-8ef2-0515e74f614a-operator-scripts\") pod \"nova-cell0-db-create-j9pcn\" (UID: \"aec4d7da-472f-449f-8ef2-0515e74f614a\") " pod="openstack/nova-cell0-db-create-j9pcn" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.013910 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldq2z\" (UniqueName: \"kubernetes.io/projected/0ba97fe5-94ae-43d2-b059-524ead71f164-kube-api-access-ldq2z\") pod \"nova-api-db-create-6qwfq\" (UID: \"0ba97fe5-94ae-43d2-b059-524ead71f164\") " pod="openstack/nova-api-db-create-6qwfq" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.013972 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab9820ac-9442-45a8-9407-d2abab068843-operator-scripts\") pod \"nova-api-3545-account-create-update-k6q46\" (UID: \"ab9820ac-9442-45a8-9407-d2abab068843\") " pod="openstack/nova-api-3545-account-create-update-k6q46" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 
12:11:11.014531 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccwrt\" (UniqueName: \"kubernetes.io/projected/aec4d7da-472f-449f-8ef2-0515e74f614a-kube-api-access-ccwrt\") pod \"nova-cell0-db-create-j9pcn\" (UID: \"aec4d7da-472f-449f-8ef2-0515e74f614a\") " pod="openstack/nova-cell0-db-create-j9pcn" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.014721 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ba97fe5-94ae-43d2-b059-524ead71f164-operator-scripts\") pod \"nova-api-db-create-6qwfq\" (UID: \"0ba97fe5-94ae-43d2-b059-524ead71f164\") " pod="openstack/nova-api-db-create-6qwfq" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.015736 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ba97fe5-94ae-43d2-b059-524ead71f164-operator-scripts\") pod \"nova-api-db-create-6qwfq\" (UID: \"0ba97fe5-94ae-43d2-b059-524ead71f164\") " pod="openstack/nova-api-db-create-6qwfq" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.015841 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmmqd\" (UniqueName: \"kubernetes.io/projected/ab9820ac-9442-45a8-9407-d2abab068843-kube-api-access-rmmqd\") pod \"nova-api-3545-account-create-update-k6q46\" (UID: \"ab9820ac-9442-45a8-9407-d2abab068843\") " pod="openstack/nova-api-3545-account-create-update-k6q46" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.040079 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldq2z\" (UniqueName: \"kubernetes.io/projected/0ba97fe5-94ae-43d2-b059-524ead71f164-kube-api-access-ldq2z\") pod \"nova-api-db-create-6qwfq\" (UID: \"0ba97fe5-94ae-43d2-b059-524ead71f164\") " pod="openstack/nova-api-db-create-6qwfq" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.122121 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmmqd\" (UniqueName: \"kubernetes.io/projected/ab9820ac-9442-45a8-9407-d2abab068843-kube-api-access-rmmqd\") pod \"nova-api-3545-account-create-update-k6q46\" (UID: \"ab9820ac-9442-45a8-9407-d2abab068843\") " pod="openstack/nova-api-3545-account-create-update-k6q46" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.122373 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aec4d7da-472f-449f-8ef2-0515e74f614a-operator-scripts\") pod \"nova-cell0-db-create-j9pcn\" (UID: \"aec4d7da-472f-449f-8ef2-0515e74f614a\") " pod="openstack/nova-cell0-db-create-j9pcn" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.122642 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab9820ac-9442-45a8-9407-d2abab068843-operator-scripts\") pod \"nova-api-3545-account-create-update-k6q46\" (UID: \"ab9820ac-9442-45a8-9407-d2abab068843\") " pod="openstack/nova-api-3545-account-create-update-k6q46" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.122950 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccwrt\" (UniqueName: \"kubernetes.io/projected/aec4d7da-472f-449f-8ef2-0515e74f614a-kube-api-access-ccwrt\") pod \"nova-cell0-db-create-j9pcn\" (UID: 
\"aec4d7da-472f-449f-8ef2-0515e74f614a\") " pod="openstack/nova-cell0-db-create-j9pcn" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.132075 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aec4d7da-472f-449f-8ef2-0515e74f614a-operator-scripts\") pod \"nova-cell0-db-create-j9pcn\" (UID: \"aec4d7da-472f-449f-8ef2-0515e74f614a\") " pod="openstack/nova-cell0-db-create-j9pcn" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.132982 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab9820ac-9442-45a8-9407-d2abab068843-operator-scripts\") pod \"nova-api-3545-account-create-update-k6q46\" (UID: \"ab9820ac-9442-45a8-9407-d2abab068843\") " pod="openstack/nova-api-3545-account-create-update-k6q46" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.139120 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-6qwfq" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.148470 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-8n8v7"] Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.158444 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccwrt\" (UniqueName: \"kubernetes.io/projected/aec4d7da-472f-449f-8ef2-0515e74f614a-kube-api-access-ccwrt\") pod \"nova-cell0-db-create-j9pcn\" (UID: \"aec4d7da-472f-449f-8ef2-0515e74f614a\") " pod="openstack/nova-cell0-db-create-j9pcn" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.159186 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-8n8v7" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.193303 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmmqd\" (UniqueName: \"kubernetes.io/projected/ab9820ac-9442-45a8-9407-d2abab068843-kube-api-access-rmmqd\") pod \"nova-api-3545-account-create-update-k6q46\" (UID: \"ab9820ac-9442-45a8-9407-d2abab068843\") " pod="openstack/nova-api-3545-account-create-update-k6q46" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.206352 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-55e0-account-create-update-kmpcw"] Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.207887 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-55e0-account-create-update-kmpcw" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.210184 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.228096 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-8n8v7"] Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.238987 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c8be5bc-66ad-46a2-867d-965dc226273a-operator-scripts\") pod \"nova-cell1-db-create-8n8v7\" (UID: \"5c8be5bc-66ad-46a2-867d-965dc226273a\") " pod="openstack/nova-cell1-db-create-8n8v7" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.239077 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46vdn\" (UniqueName: \"kubernetes.io/projected/5c8be5bc-66ad-46a2-867d-965dc226273a-kube-api-access-46vdn\") pod \"nova-cell1-db-create-8n8v7\" (UID: \"5c8be5bc-66ad-46a2-867d-965dc226273a\") " pod="openstack/nova-cell1-db-create-8n8v7" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.246631 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-55e0-account-create-update-kmpcw"] Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.264065 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-j9pcn" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.273876 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-3545-account-create-update-k6q46" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.341185 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e08d98c-50ec-4dd9-a454-5cd1c19c4067-operator-scripts\") pod \"nova-cell0-55e0-account-create-update-kmpcw\" (UID: \"2e08d98c-50ec-4dd9-a454-5cd1c19c4067\") " pod="openstack/nova-cell0-55e0-account-create-update-kmpcw" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.341324 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rclsn\" (UniqueName: \"kubernetes.io/projected/2e08d98c-50ec-4dd9-a454-5cd1c19c4067-kube-api-access-rclsn\") pod \"nova-cell0-55e0-account-create-update-kmpcw\" (UID: \"2e08d98c-50ec-4dd9-a454-5cd1c19c4067\") " pod="openstack/nova-cell0-55e0-account-create-update-kmpcw" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.341393 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c8be5bc-66ad-46a2-867d-965dc226273a-operator-scripts\") pod \"nova-cell1-db-create-8n8v7\" (UID: \"5c8be5bc-66ad-46a2-867d-965dc226273a\") " pod="openstack/nova-cell1-db-create-8n8v7" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.341450 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46vdn\" (UniqueName: \"kubernetes.io/projected/5c8be5bc-66ad-46a2-867d-965dc226273a-kube-api-access-46vdn\") pod \"nova-cell1-db-create-8n8v7\" (UID: \"5c8be5bc-66ad-46a2-867d-965dc226273a\") " pod="openstack/nova-cell1-db-create-8n8v7" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.342462 4763 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c8be5bc-66ad-46a2-867d-965dc226273a-operator-scripts\") pod \"nova-cell1-db-create-8n8v7\" (UID: \"5c8be5bc-66ad-46a2-867d-965dc226273a\") " pod="openstack/nova-cell1-db-create-8n8v7" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.342505 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-1309-account-create-update-55ht7"] Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.343840 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1309-account-create-update-55ht7" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.349719 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.363735 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-1309-account-create-update-55ht7"] Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.373364 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46vdn\" (UniqueName: \"kubernetes.io/projected/5c8be5bc-66ad-46a2-867d-965dc226273a-kube-api-access-46vdn\") pod \"nova-cell1-db-create-8n8v7\" (UID: \"5c8be5bc-66ad-46a2-867d-965dc226273a\") " pod="openstack/nova-cell1-db-create-8n8v7" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.443998 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl2js\" (UniqueName: \"kubernetes.io/projected/035ebd9d-2632-4e8c-9912-bf071d4a02e6-kube-api-access-tl2js\") pod \"nova-cell1-1309-account-create-update-55ht7\" (UID: \"035ebd9d-2632-4e8c-9912-bf071d4a02e6\") " pod="openstack/nova-cell1-1309-account-create-update-55ht7" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.444070 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/035ebd9d-2632-4e8c-9912-bf071d4a02e6-operator-scripts\") pod \"nova-cell1-1309-account-create-update-55ht7\" (UID: \"035ebd9d-2632-4e8c-9912-bf071d4a02e6\") " pod="openstack/nova-cell1-1309-account-create-update-55ht7" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.444108 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e08d98c-50ec-4dd9-a454-5cd1c19c4067-operator-scripts\") pod \"nova-cell0-55e0-account-create-update-kmpcw\" (UID: \"2e08d98c-50ec-4dd9-a454-5cd1c19c4067\") " pod="openstack/nova-cell0-55e0-account-create-update-kmpcw" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.444137 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rclsn\" (UniqueName: \"kubernetes.io/projected/2e08d98c-50ec-4dd9-a454-5cd1c19c4067-kube-api-access-rclsn\") pod \"nova-cell0-55e0-account-create-update-kmpcw\" (UID: \"2e08d98c-50ec-4dd9-a454-5cd1c19c4067\") " pod="openstack/nova-cell0-55e0-account-create-update-kmpcw" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.445076 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e08d98c-50ec-4dd9-a454-5cd1c19c4067-operator-scripts\") pod \"nova-cell0-55e0-account-create-update-kmpcw\" (UID: \"2e08d98c-50ec-4dd9-a454-5cd1c19c4067\") " 
pod="openstack/nova-cell0-55e0-account-create-update-kmpcw" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.465469 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rclsn\" (UniqueName: \"kubernetes.io/projected/2e08d98c-50ec-4dd9-a454-5cd1c19c4067-kube-api-access-rclsn\") pod \"nova-cell0-55e0-account-create-update-kmpcw\" (UID: \"2e08d98c-50ec-4dd9-a454-5cd1c19c4067\") " pod="openstack/nova-cell0-55e0-account-create-update-kmpcw" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.546156 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl2js\" (UniqueName: \"kubernetes.io/projected/035ebd9d-2632-4e8c-9912-bf071d4a02e6-kube-api-access-tl2js\") pod \"nova-cell1-1309-account-create-update-55ht7\" (UID: \"035ebd9d-2632-4e8c-9912-bf071d4a02e6\") " pod="openstack/nova-cell1-1309-account-create-update-55ht7" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.546224 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/035ebd9d-2632-4e8c-9912-bf071d4a02e6-operator-scripts\") pod \"nova-cell1-1309-account-create-update-55ht7\" (UID: \"035ebd9d-2632-4e8c-9912-bf071d4a02e6\") " pod="openstack/nova-cell1-1309-account-create-update-55ht7" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.547325 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/035ebd9d-2632-4e8c-9912-bf071d4a02e6-operator-scripts\") pod \"nova-cell1-1309-account-create-update-55ht7\" (UID: \"035ebd9d-2632-4e8c-9912-bf071d4a02e6\") " pod="openstack/nova-cell1-1309-account-create-update-55ht7" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.563597 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl2js\" (UniqueName: \"kubernetes.io/projected/035ebd9d-2632-4e8c-9912-bf071d4a02e6-kube-api-access-tl2js\") pod \"nova-cell1-1309-account-create-update-55ht7\" (UID: \"035ebd9d-2632-4e8c-9912-bf071d4a02e6\") " pod="openstack/nova-cell1-1309-account-create-update-55ht7" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.613809 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.620205 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-8n8v7" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.636410 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-55e0-account-create-update-kmpcw" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.676679 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-1309-account-create-update-55ht7" Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.712317 4763 generic.go:334] "Generic (PLEG): container finished" podID="06f0e50f-35b7-441d-a630-0655b4c1cd00" containerID="7442313666db8517b1aed2091d3065563ee73742e77582ff15eae44d27a742bb" exitCode=0 Dec 05 12:11:11 crc kubenswrapper[4763]: I1205 12:11:11.712396 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"06f0e50f-35b7-441d-a630-0655b4c1cd00","Type":"ContainerDied","Data":"7442313666db8517b1aed2091d3065563ee73742e77582ff15eae44d27a742bb"} Dec 05 12:11:12 crc kubenswrapper[4763]: I1205 12:11:12.727740 4763 generic.go:334] "Generic (PLEG): container finished" podID="4ba0cbf0-3e4e-4cb0-82b0-179d11937330" containerID="0f9fce0cd10290e14fb000bbba92b5ee8865c914d6ce43d7be912c918930bcb6" exitCode=137 Dec 05 12:11:12 crc kubenswrapper[4763]: I1205 12:11:12.727943 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7dc58bc884-khdbv" event={"ID":"4ba0cbf0-3e4e-4cb0-82b0-179d11937330","Type":"ContainerDied","Data":"0f9fce0cd10290e14fb000bbba92b5ee8865c914d6ce43d7be912c918930bcb6"} Dec 05 12:11:13 crc kubenswrapper[4763]: I1205 12:11:13.612673 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 12:11:13 crc kubenswrapper[4763]: I1205 12:11:13.703449 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-config-data\") pod \"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024\" (UID: \"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024\") " Dec 05 12:11:13 crc kubenswrapper[4763]: I1205 12:11:13.703517 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-run-httpd\") pod \"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024\" (UID: \"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024\") " Dec 05 12:11:13 crc kubenswrapper[4763]: I1205 12:11:13.703547 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-combined-ca-bundle\") pod \"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024\" (UID: \"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024\") " Dec 05 12:11:13 crc kubenswrapper[4763]: I1205 12:11:13.703610 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-sg-core-conf-yaml\") pod \"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024\" (UID: \"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024\") " Dec 05 12:11:13 crc kubenswrapper[4763]: I1205 12:11:13.703715 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-scripts\") pod \"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024\" (UID: \"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024\") " Dec 05 12:11:13 crc kubenswrapper[4763]: I1205 12:11:13.703747 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8j89x\" (UniqueName: \"kubernetes.io/projected/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-kube-api-access-8j89x\") pod \"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024\" (UID: \"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024\") " Dec 05 12:11:13 crc kubenswrapper[4763]: I1205 
12:11:13.703799 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-log-httpd\") pod \"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024\" (UID: \"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024\") " Dec 05 12:11:13 crc kubenswrapper[4763]: I1205 12:11:13.709096 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "387bbd61-fa4c-49e7-b5b8-d7c0b56e8024" (UID: "387bbd61-fa4c-49e7-b5b8-d7c0b56e8024"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:11:13 crc kubenswrapper[4763]: I1205 12:11:13.713752 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "387bbd61-fa4c-49e7-b5b8-d7c0b56e8024" (UID: "387bbd61-fa4c-49e7-b5b8-d7c0b56e8024"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:11:13 crc kubenswrapper[4763]: I1205 12:11:13.722337 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-kube-api-access-8j89x" (OuterVolumeSpecName: "kube-api-access-8j89x") pod "387bbd61-fa4c-49e7-b5b8-d7c0b56e8024" (UID: "387bbd61-fa4c-49e7-b5b8-d7c0b56e8024"). InnerVolumeSpecName "kube-api-access-8j89x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:11:13 crc kubenswrapper[4763]: I1205 12:11:13.727921 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-scripts" (OuterVolumeSpecName: "scripts") pod "387bbd61-fa4c-49e7-b5b8-d7c0b56e8024" (UID: "387bbd61-fa4c-49e7-b5b8-d7c0b56e8024"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:11:13 crc kubenswrapper[4763]: I1205 12:11:13.777566 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "387bbd61-fa4c-49e7-b5b8-d7c0b56e8024" (UID: "387bbd61-fa4c-49e7-b5b8-d7c0b56e8024"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:11:13 crc kubenswrapper[4763]: I1205 12:11:13.817023 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:13 crc kubenswrapper[4763]: I1205 12:11:13.817052 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8j89x\" (UniqueName: \"kubernetes.io/projected/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-kube-api-access-8j89x\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:13 crc kubenswrapper[4763]: I1205 12:11:13.817062 4763 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:13 crc kubenswrapper[4763]: I1205 12:11:13.817071 4763 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:13 crc kubenswrapper[4763]: I1205 12:11:13.817079 4763 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:13 crc kubenswrapper[4763]: I1205 12:11:13.831213 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"387bbd61-fa4c-49e7-b5b8-d7c0b56e8024","Type":"ContainerDied","Data":"8b60a8f3eef0de9d1e54359599955cbc899aa128ab2f5f5227fb383fe151f46f"} Dec 05 12:11:13 crc kubenswrapper[4763]: I1205 12:11:13.831272 4763 scope.go:117] "RemoveContainer" containerID="d20e07f05eb3671b031be2015a996cff0a7d84bd4ade390b40f6663b9fae8637" Dec 05 12:11:13 crc kubenswrapper[4763]: I1205 12:11:13.831452 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 12:11:13 crc kubenswrapper[4763]: I1205 12:11:13.873013 4763 scope.go:117] "RemoveContainer" containerID="7a5aa79d6ff40c4c2bd4586bfcb50c18f7922c73abe58623b83e4277778fb853" Dec 05 12:11:13 crc kubenswrapper[4763]: I1205 12:11:13.909266 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "387bbd61-fa4c-49e7-b5b8-d7c0b56e8024" (UID: "387bbd61-fa4c-49e7-b5b8-d7c0b56e8024"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:11:13 crc kubenswrapper[4763]: I1205 12:11:13.921254 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:13 crc kubenswrapper[4763]: I1205 12:11:13.933274 4763 scope.go:117] "RemoveContainer" containerID="3544d9156ebffb971fcc4e7d83f5f8fa0df0a4c1c8f6f691ce005c667e3e2467" Dec 05 12:11:13 crc kubenswrapper[4763]: I1205 12:11:13.954103 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-config-data" (OuterVolumeSpecName: "config-data") pod "387bbd61-fa4c-49e7-b5b8-d7c0b56e8024" (UID: "387bbd61-fa4c-49e7-b5b8-d7c0b56e8024"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:11:13 crc kubenswrapper[4763]: I1205 12:11:13.960669 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7dc58bc884-khdbv" Dec 05 12:11:13 crc kubenswrapper[4763]: I1205 12:11:13.987655 4763 scope.go:117] "RemoveContainer" containerID="2a493d78b9a0e23dc0ad9607673518f834848814c3799e79e9615799e013155f" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.022673 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-logs\") pod \"4ba0cbf0-3e4e-4cb0-82b0-179d11937330\" (UID: \"4ba0cbf0-3e4e-4cb0-82b0-179d11937330\") " Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.024986 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-horizon-secret-key\") pod \"4ba0cbf0-3e4e-4cb0-82b0-179d11937330\" (UID: \"4ba0cbf0-3e4e-4cb0-82b0-179d11937330\") " Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.024899 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-logs" (OuterVolumeSpecName: "logs") pod "4ba0cbf0-3e4e-4cb0-82b0-179d11937330" (UID: "4ba0cbf0-3e4e-4cb0-82b0-179d11937330"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.025292 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-config-data\") pod \"4ba0cbf0-3e4e-4cb0-82b0-179d11937330\" (UID: \"4ba0cbf0-3e4e-4cb0-82b0-179d11937330\") " Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.025444 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2z8l8\" (UniqueName: \"kubernetes.io/projected/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-kube-api-access-2z8l8\") pod \"4ba0cbf0-3e4e-4cb0-82b0-179d11937330\" (UID: \"4ba0cbf0-3e4e-4cb0-82b0-179d11937330\") " Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.025542 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-combined-ca-bundle\") pod \"4ba0cbf0-3e4e-4cb0-82b0-179d11937330\" (UID: \"4ba0cbf0-3e4e-4cb0-82b0-179d11937330\") " Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.025732 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-horizon-tls-certs\") pod \"4ba0cbf0-3e4e-4cb0-82b0-179d11937330\" (UID: \"4ba0cbf0-3e4e-4cb0-82b0-179d11937330\") " Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.025885 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-scripts\") pod \"4ba0cbf0-3e4e-4cb0-82b0-179d11937330\" (UID: \"4ba0cbf0-3e4e-4cb0-82b0-179d11937330\") " Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.028738 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024-config-data\") on node \"crc\" DevicePath \"\"" 
Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.028853 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-logs\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.034352 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "4ba0cbf0-3e4e-4cb0-82b0-179d11937330" (UID: "4ba0cbf0-3e4e-4cb0-82b0-179d11937330"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.053923 4763 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podc5f5e3fa-8226-4730-9a1a-899e0a2ab5e2"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podc5f5e3fa-8226-4730-9a1a-899e0a2ab5e2] : Timed out while waiting for systemd to remove kubepods-besteffort-podc5f5e3fa_8226_4730_9a1a_899e0a2ab5e2.slice" Dec 05 12:11:14 crc kubenswrapper[4763]: E1205 12:11:14.053991 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podc5f5e3fa-8226-4730-9a1a-899e0a2ab5e2] : unable to destroy cgroup paths for cgroup [kubepods besteffort podc5f5e3fa-8226-4730-9a1a-899e0a2ab5e2] : Timed out while waiting for systemd to remove kubepods-besteffort-podc5f5e3fa_8226_4730_9a1a_899e0a2ab5e2.slice" pod="openstack/dnsmasq-dns-b6c948c7-7tdkh" podUID="c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.055109 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.055718 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="6c917b4b-cb2c-4d9e-91b8-5fa2f1f61975" containerName="kube-state-metrics" containerID="cri-o://59020d82ff3f772c426279e952d15a49ac261b9a9407e2203d18e354d832e051" gracePeriod=30 Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.093846 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "4ba0cbf0-3e4e-4cb0-82b0-179d11937330" (UID: "4ba0cbf0-3e4e-4cb0-82b0-179d11937330"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.115338 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-kube-api-access-2z8l8" (OuterVolumeSpecName: "kube-api-access-2z8l8") pod "4ba0cbf0-3e4e-4cb0-82b0-179d11937330" (UID: "4ba0cbf0-3e4e-4cb0-82b0-179d11937330"). InnerVolumeSpecName "kube-api-access-2z8l8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.115472 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-scripts" (OuterVolumeSpecName: "scripts") pod "4ba0cbf0-3e4e-4cb0-82b0-179d11937330" (UID: "4ba0cbf0-3e4e-4cb0-82b0-179d11937330"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.116397 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-config-data" (OuterVolumeSpecName: "config-data") pod "4ba0cbf0-3e4e-4cb0-82b0-179d11937330" (UID: "4ba0cbf0-3e4e-4cb0-82b0-179d11937330"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.121120 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ba0cbf0-3e4e-4cb0-82b0-179d11937330" (UID: "4ba0cbf0-3e4e-4cb0-82b0-179d11937330"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.150290 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.150327 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2z8l8\" (UniqueName: \"kubernetes.io/projected/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-kube-api-access-2z8l8\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.150343 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.150355 4763 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.150369 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.150387 4763 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4ba0cbf0-3e4e-4cb0-82b0-179d11937330-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:14 crc kubenswrapper[4763]: W1205 12:11:14.308232 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e08d98c_50ec_4dd9_a454_5cd1c19c4067.slice/crio-225ecf73884fa9f44e66f77a0bc29c84342e242d5560acf13d1a803505ae36bc WatchSource:0}: Error finding container 225ecf73884fa9f44e66f77a0bc29c84342e242d5560acf13d1a803505ae36bc: Status 404 returned error can't find the container with id 225ecf73884fa9f44e66f77a0bc29c84342e242d5560acf13d1a803505ae36bc Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.313372 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-55e0-account-create-update-kmpcw"] Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.359743 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.390034 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.411135 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.428837 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 12:11:14 crc kubenswrapper[4763]: E1205 12:11:14.429333 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ba0cbf0-3e4e-4cb0-82b0-179d11937330" containerName="horizon" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.429357 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ba0cbf0-3e4e-4cb0-82b0-179d11937330" containerName="horizon" Dec 05 12:11:14 crc kubenswrapper[4763]: E1205 12:11:14.429386 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="387bbd61-fa4c-49e7-b5b8-d7c0b56e8024" containerName="proxy-httpd" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.429396 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="387bbd61-fa4c-49e7-b5b8-d7c0b56e8024" containerName="proxy-httpd" Dec 05 12:11:14 crc kubenswrapper[4763]: E1205 12:11:14.429418 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="387bbd61-fa4c-49e7-b5b8-d7c0b56e8024" containerName="ceilometer-central-agent" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.429426 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="387bbd61-fa4c-49e7-b5b8-d7c0b56e8024" containerName="ceilometer-central-agent" Dec 05 12:11:14 crc kubenswrapper[4763]: E1205 12:11:14.429443 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e680db34-6f7c-4e72-8015-368c51bb34b0" containerName="glance-httpd" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.429450 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e680db34-6f7c-4e72-8015-368c51bb34b0" containerName="glance-httpd" Dec 05 12:11:14 crc kubenswrapper[4763]: E1205 12:11:14.429460 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="387bbd61-fa4c-49e7-b5b8-d7c0b56e8024" containerName="sg-core" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.429467 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="387bbd61-fa4c-49e7-b5b8-d7c0b56e8024" containerName="sg-core" Dec 05 12:11:14 crc kubenswrapper[4763]: E1205 12:11:14.429483 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ba0cbf0-3e4e-4cb0-82b0-179d11937330" containerName="horizon-log" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.429491 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ba0cbf0-3e4e-4cb0-82b0-179d11937330" containerName="horizon-log" Dec 05 12:11:14 crc kubenswrapper[4763]: E1205 12:11:14.429512 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="387bbd61-fa4c-49e7-b5b8-d7c0b56e8024" containerName="ceilometer-notification-agent" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.429521 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="387bbd61-fa4c-49e7-b5b8-d7c0b56e8024" containerName="ceilometer-notification-agent" Dec 05 12:11:14 crc kubenswrapper[4763]: E1205 12:11:14.429544 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e680db34-6f7c-4e72-8015-368c51bb34b0" containerName="glance-log" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.429551 4763 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e680db34-6f7c-4e72-8015-368c51bb34b0" containerName="glance-log" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.429928 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e680db34-6f7c-4e72-8015-368c51bb34b0" containerName="glance-log" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.429947 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="387bbd61-fa4c-49e7-b5b8-d7c0b56e8024" containerName="proxy-httpd" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.430023 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ba0cbf0-3e4e-4cb0-82b0-179d11937330" containerName="horizon-log" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.430040 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="387bbd61-fa4c-49e7-b5b8-d7c0b56e8024" containerName="sg-core" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.430058 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ba0cbf0-3e4e-4cb0-82b0-179d11937330" containerName="horizon" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.430071 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e680db34-6f7c-4e72-8015-368c51bb34b0" containerName="glance-httpd" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.430088 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="387bbd61-fa4c-49e7-b5b8-d7c0b56e8024" containerName="ceilometer-notification-agent" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.430103 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="387bbd61-fa4c-49e7-b5b8-d7c0b56e8024" containerName="ceilometer-central-agent" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.434690 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.438597 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.438946 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.459047 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txmwf\" (UniqueName: \"kubernetes.io/projected/e680db34-6f7c-4e72-8015-368c51bb34b0-kube-api-access-txmwf\") pod \"e680db34-6f7c-4e72-8015-368c51bb34b0\" (UID: \"e680db34-6f7c-4e72-8015-368c51bb34b0\") " Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.459150 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e680db34-6f7c-4e72-8015-368c51bb34b0-combined-ca-bundle\") pod \"e680db34-6f7c-4e72-8015-368c51bb34b0\" (UID: \"e680db34-6f7c-4e72-8015-368c51bb34b0\") " Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.459213 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"e680db34-6f7c-4e72-8015-368c51bb34b0\" (UID: \"e680db34-6f7c-4e72-8015-368c51bb34b0\") " Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.459308 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e680db34-6f7c-4e72-8015-368c51bb34b0-config-data\") pod \"e680db34-6f7c-4e72-8015-368c51bb34b0\" (UID: \"e680db34-6f7c-4e72-8015-368c51bb34b0\") " Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.459345 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e680db34-6f7c-4e72-8015-368c51bb34b0-httpd-run\") pod \"e680db34-6f7c-4e72-8015-368c51bb34b0\" (UID: \"e680db34-6f7c-4e72-8015-368c51bb34b0\") " Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.459382 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e680db34-6f7c-4e72-8015-368c51bb34b0-logs\") pod \"e680db34-6f7c-4e72-8015-368c51bb34b0\" (UID: \"e680db34-6f7c-4e72-8015-368c51bb34b0\") " Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.459407 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e680db34-6f7c-4e72-8015-368c51bb34b0-scripts\") pod \"e680db34-6f7c-4e72-8015-368c51bb34b0\" (UID: \"e680db34-6f7c-4e72-8015-368c51bb34b0\") " Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.460090 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e680db34-6f7c-4e72-8015-368c51bb34b0-internal-tls-certs\") pod \"e680db34-6f7c-4e72-8015-368c51bb34b0\" (UID: \"e680db34-6f7c-4e72-8015-368c51bb34b0\") " Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.460513 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e680db34-6f7c-4e72-8015-368c51bb34b0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e680db34-6f7c-4e72-8015-368c51bb34b0" (UID: "e680db34-6f7c-4e72-8015-368c51bb34b0"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.462050 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e680db34-6f7c-4e72-8015-368c51bb34b0-logs" (OuterVolumeSpecName: "logs") pod "e680db34-6f7c-4e72-8015-368c51bb34b0" (UID: "e680db34-6f7c-4e72-8015-368c51bb34b0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.466652 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "e680db34-6f7c-4e72-8015-368c51bb34b0" (UID: "e680db34-6f7c-4e72-8015-368c51bb34b0"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.474212 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.493555 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e680db34-6f7c-4e72-8015-368c51bb34b0-scripts" (OuterVolumeSpecName: "scripts") pod "e680db34-6f7c-4e72-8015-368c51bb34b0" (UID: "e680db34-6f7c-4e72-8015-368c51bb34b0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.493601 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e680db34-6f7c-4e72-8015-368c51bb34b0-kube-api-access-txmwf" (OuterVolumeSpecName: "kube-api-access-txmwf") pod "e680db34-6f7c-4e72-8015-368c51bb34b0" (UID: "e680db34-6f7c-4e72-8015-368c51bb34b0"). InnerVolumeSpecName "kube-api-access-txmwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.531361 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e680db34-6f7c-4e72-8015-368c51bb34b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e680db34-6f7c-4e72-8015-368c51bb34b0" (UID: "e680db34-6f7c-4e72-8015-368c51bb34b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.554933 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e680db34-6f7c-4e72-8015-368c51bb34b0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e680db34-6f7c-4e72-8015-368c51bb34b0" (UID: "e680db34-6f7c-4e72-8015-368c51bb34b0"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.560995 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e680db34-6f7c-4e72-8015-368c51bb34b0-config-data" (OuterVolumeSpecName: "config-data") pod "e680db34-6f7c-4e72-8015-368c51bb34b0" (UID: "e680db34-6f7c-4e72-8015-368c51bb34b0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.563127 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4aecdd42-0f16-4f9a-91d4-3512b21b3763-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4aecdd42-0f16-4f9a-91d4-3512b21b3763\") " pod="openstack/ceilometer-0" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.563195 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4aecdd42-0f16-4f9a-91d4-3512b21b3763-scripts\") pod \"ceilometer-0\" (UID: \"4aecdd42-0f16-4f9a-91d4-3512b21b3763\") " pod="openstack/ceilometer-0" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.563233 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4aecdd42-0f16-4f9a-91d4-3512b21b3763-run-httpd\") pod \"ceilometer-0\" (UID: \"4aecdd42-0f16-4f9a-91d4-3512b21b3763\") " pod="openstack/ceilometer-0" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.563292 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4aecdd42-0f16-4f9a-91d4-3512b21b3763-log-httpd\") pod \"ceilometer-0\" (UID: \"4aecdd42-0f16-4f9a-91d4-3512b21b3763\") " pod="openstack/ceilometer-0" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.563358 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4aecdd42-0f16-4f9a-91d4-3512b21b3763-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4aecdd42-0f16-4f9a-91d4-3512b21b3763\") " pod="openstack/ceilometer-0" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.563439 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4aecdd42-0f16-4f9a-91d4-3512b21b3763-config-data\") pod \"ceilometer-0\" (UID: \"4aecdd42-0f16-4f9a-91d4-3512b21b3763\") " pod="openstack/ceilometer-0" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.563461 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbmhs\" (UniqueName: \"kubernetes.io/projected/4aecdd42-0f16-4f9a-91d4-3512b21b3763-kube-api-access-mbmhs\") pod \"ceilometer-0\" (UID: \"4aecdd42-0f16-4f9a-91d4-3512b21b3763\") " pod="openstack/ceilometer-0" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.563623 4763 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e680db34-6f7c-4e72-8015-368c51bb34b0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.563669 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txmwf\" (UniqueName: \"kubernetes.io/projected/e680db34-6f7c-4e72-8015-368c51bb34b0-kube-api-access-txmwf\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.563682 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e680db34-6f7c-4e72-8015-368c51bb34b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.563708 
4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.563721 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e680db34-6f7c-4e72-8015-368c51bb34b0-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.563733 4763 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e680db34-6f7c-4e72-8015-368c51bb34b0-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.563744 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e680db34-6f7c-4e72-8015-368c51bb34b0-logs\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.563759 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e680db34-6f7c-4e72-8015-368c51bb34b0-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.603009 4763 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.665267 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4aecdd42-0f16-4f9a-91d4-3512b21b3763-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4aecdd42-0f16-4f9a-91d4-3512b21b3763\") " pod="openstack/ceilometer-0" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.665342 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4aecdd42-0f16-4f9a-91d4-3512b21b3763-scripts\") pod \"ceilometer-0\" (UID: \"4aecdd42-0f16-4f9a-91d4-3512b21b3763\") " pod="openstack/ceilometer-0" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.665377 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4aecdd42-0f16-4f9a-91d4-3512b21b3763-run-httpd\") pod \"ceilometer-0\" (UID: \"4aecdd42-0f16-4f9a-91d4-3512b21b3763\") " pod="openstack/ceilometer-0" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.665423 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4aecdd42-0f16-4f9a-91d4-3512b21b3763-log-httpd\") pod \"ceilometer-0\" (UID: \"4aecdd42-0f16-4f9a-91d4-3512b21b3763\") " pod="openstack/ceilometer-0" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.665473 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4aecdd42-0f16-4f9a-91d4-3512b21b3763-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4aecdd42-0f16-4f9a-91d4-3512b21b3763\") " pod="openstack/ceilometer-0" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.665533 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4aecdd42-0f16-4f9a-91d4-3512b21b3763-config-data\") pod \"ceilometer-0\" (UID: \"4aecdd42-0f16-4f9a-91d4-3512b21b3763\") " pod="openstack/ceilometer-0" Dec 05 
12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.665554 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbmhs\" (UniqueName: \"kubernetes.io/projected/4aecdd42-0f16-4f9a-91d4-3512b21b3763-kube-api-access-mbmhs\") pod \"ceilometer-0\" (UID: \"4aecdd42-0f16-4f9a-91d4-3512b21b3763\") " pod="openstack/ceilometer-0" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.665654 4763 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.667061 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4aecdd42-0f16-4f9a-91d4-3512b21b3763-log-httpd\") pod \"ceilometer-0\" (UID: \"4aecdd42-0f16-4f9a-91d4-3512b21b3763\") " pod="openstack/ceilometer-0" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.667982 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4aecdd42-0f16-4f9a-91d4-3512b21b3763-run-httpd\") pod \"ceilometer-0\" (UID: \"4aecdd42-0f16-4f9a-91d4-3512b21b3763\") " pod="openstack/ceilometer-0" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.671941 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4aecdd42-0f16-4f9a-91d4-3512b21b3763-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4aecdd42-0f16-4f9a-91d4-3512b21b3763\") " pod="openstack/ceilometer-0" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.672706 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4aecdd42-0f16-4f9a-91d4-3512b21b3763-scripts\") pod \"ceilometer-0\" (UID: \"4aecdd42-0f16-4f9a-91d4-3512b21b3763\") " pod="openstack/ceilometer-0" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.675164 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4aecdd42-0f16-4f9a-91d4-3512b21b3763-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4aecdd42-0f16-4f9a-91d4-3512b21b3763\") " pod="openstack/ceilometer-0" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.678268 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4aecdd42-0f16-4f9a-91d4-3512b21b3763-config-data\") pod \"ceilometer-0\" (UID: \"4aecdd42-0f16-4f9a-91d4-3512b21b3763\") " pod="openstack/ceilometer-0" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.688013 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbmhs\" (UniqueName: \"kubernetes.io/projected/4aecdd42-0f16-4f9a-91d4-3512b21b3763-kube-api-access-mbmhs\") pod \"ceilometer-0\" (UID: \"4aecdd42-0f16-4f9a-91d4-3512b21b3763\") " pod="openstack/ceilometer-0" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.784986 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.796406 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.845911 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.868502 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06f0e50f-35b7-441d-a630-0655b4c1cd00-combined-ca-bundle\") pod \"06f0e50f-35b7-441d-a630-0655b4c1cd00\" (UID: \"06f0e50f-35b7-441d-a630-0655b4c1cd00\") " Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.868568 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkl6c\" (UniqueName: \"kubernetes.io/projected/6c917b4b-cb2c-4d9e-91b8-5fa2f1f61975-kube-api-access-zkl6c\") pod \"6c917b4b-cb2c-4d9e-91b8-5fa2f1f61975\" (UID: \"6c917b4b-cb2c-4d9e-91b8-5fa2f1f61975\") " Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.868610 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"06f0e50f-35b7-441d-a630-0655b4c1cd00\" (UID: \"06f0e50f-35b7-441d-a630-0655b4c1cd00\") " Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.868700 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06f0e50f-35b7-441d-a630-0655b4c1cd00-config-data\") pod \"06f0e50f-35b7-441d-a630-0655b4c1cd00\" (UID: \"06f0e50f-35b7-441d-a630-0655b4c1cd00\") " Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.868741 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/06f0e50f-35b7-441d-a630-0655b4c1cd00-httpd-run\") pod \"06f0e50f-35b7-441d-a630-0655b4c1cd00\" (UID: \"06f0e50f-35b7-441d-a630-0655b4c1cd00\") " Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.868820 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06f0e50f-35b7-441d-a630-0655b4c1cd00-scripts\") pod \"06f0e50f-35b7-441d-a630-0655b4c1cd00\" (UID: \"06f0e50f-35b7-441d-a630-0655b4c1cd00\") " Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.868864 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06f0e50f-35b7-441d-a630-0655b4c1cd00-logs\") pod \"06f0e50f-35b7-441d-a630-0655b4c1cd00\" (UID: \"06f0e50f-35b7-441d-a630-0655b4c1cd00\") " Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.868974 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/06f0e50f-35b7-441d-a630-0655b4c1cd00-public-tls-certs\") pod \"06f0e50f-35b7-441d-a630-0655b4c1cd00\" (UID: \"06f0e50f-35b7-441d-a630-0655b4c1cd00\") " Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.869082 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gq7zh\" (UniqueName: \"kubernetes.io/projected/06f0e50f-35b7-441d-a630-0655b4c1cd00-kube-api-access-gq7zh\") pod \"06f0e50f-35b7-441d-a630-0655b4c1cd00\" (UID: \"06f0e50f-35b7-441d-a630-0655b4c1cd00\") " Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.870389 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06f0e50f-35b7-441d-a630-0655b4c1cd00-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "06f0e50f-35b7-441d-a630-0655b4c1cd00" (UID: "06f0e50f-35b7-441d-a630-0655b4c1cd00"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.871587 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"9e894c53-51db-4ede-9730-b8c68ad6fc15","Type":"ContainerStarted","Data":"da52fd0bd598d7f99b61bf4ffd62b4b7b03466646b97a4f983082d17d1bd843e"} Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.874032 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06f0e50f-35b7-441d-a630-0655b4c1cd00-kube-api-access-gq7zh" (OuterVolumeSpecName: "kube-api-access-gq7zh") pod "06f0e50f-35b7-441d-a630-0655b4c1cd00" (UID: "06f0e50f-35b7-441d-a630-0655b4c1cd00"). InnerVolumeSpecName "kube-api-access-gq7zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.875990 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06f0e50f-35b7-441d-a630-0655b4c1cd00-logs" (OuterVolumeSpecName: "logs") pod "06f0e50f-35b7-441d-a630-0655b4c1cd00" (UID: "06f0e50f-35b7-441d-a630-0655b4c1cd00"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.886485 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "06f0e50f-35b7-441d-a630-0655b4c1cd00" (UID: "06f0e50f-35b7-441d-a630-0655b4c1cd00"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.886701 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06f0e50f-35b7-441d-a630-0655b4c1cd00-scripts" (OuterVolumeSpecName: "scripts") pod "06f0e50f-35b7-441d-a630-0655b4c1cd00" (UID: "06f0e50f-35b7-441d-a630-0655b4c1cd00"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.901890 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c917b4b-cb2c-4d9e-91b8-5fa2f1f61975-kube-api-access-zkl6c" (OuterVolumeSpecName: "kube-api-access-zkl6c") pod "6c917b4b-cb2c-4d9e-91b8-5fa2f1f61975" (UID: "6c917b4b-cb2c-4d9e-91b8-5fa2f1f61975"). InnerVolumeSpecName "kube-api-access-zkl6c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.907973 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-55e0-account-create-update-kmpcw" event={"ID":"2e08d98c-50ec-4dd9-a454-5cd1c19c4067","Type":"ContainerStarted","Data":"17c8faa85f0b36d302516e57a0aae40135ec8ca871dc1750896db9d4b6734031"} Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.908011 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-55e0-account-create-update-kmpcw" event={"ID":"2e08d98c-50ec-4dd9-a454-5cd1c19c4067","Type":"ContainerStarted","Data":"225ecf73884fa9f44e66f77a0bc29c84342e242d5560acf13d1a803505ae36bc"} Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.919240 4763 generic.go:334] "Generic (PLEG): container finished" podID="6c917b4b-cb2c-4d9e-91b8-5fa2f1f61975" containerID="59020d82ff3f772c426279e952d15a49ac261b9a9407e2203d18e354d832e051" exitCode=2 Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.919298 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6c917b4b-cb2c-4d9e-91b8-5fa2f1f61975","Type":"ContainerDied","Data":"59020d82ff3f772c426279e952d15a49ac261b9a9407e2203d18e354d832e051"} Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.919322 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6c917b4b-cb2c-4d9e-91b8-5fa2f1f61975","Type":"ContainerDied","Data":"255860ad9e0ffa789d62b7a278e0921ba0d69c1f9122a9d8a85477a4c013c1c9"} Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.919337 4763 scope.go:117] "RemoveContainer" containerID="59020d82ff3f772c426279e952d15a49ac261b9a9407e2203d18e354d832e051" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.919405 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.937433 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-1309-account-create-update-55ht7"] Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.946039 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.510159264 podStartE2EDuration="15.946013055s" podCreationTimestamp="2025-12-05 12:10:59 +0000 UTC" firstStartedPulling="2025-12-05 12:11:00.009978988 +0000 UTC m=+1344.502693711" lastFinishedPulling="2025-12-05 12:11:13.445832779 +0000 UTC m=+1357.938547502" observedRunningTime="2025-12-05 12:11:14.911510628 +0000 UTC m=+1359.404225351" watchObservedRunningTime="2025-12-05 12:11:14.946013055 +0000 UTC m=+1359.438727768" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.971175 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gq7zh\" (UniqueName: \"kubernetes.io/projected/06f0e50f-35b7-441d-a630-0655b4c1cd00-kube-api-access-gq7zh\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.971212 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkl6c\" (UniqueName: \"kubernetes.io/projected/6c917b4b-cb2c-4d9e-91b8-5fa2f1f61975-kube-api-access-zkl6c\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.971230 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.971240 4763 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/06f0e50f-35b7-441d-a630-0655b4c1cd00-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.971249 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06f0e50f-35b7-441d-a630-0655b4c1cd00-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.971258 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06f0e50f-35b7-441d-a630-0655b4c1cd00-logs\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.984543 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06f0e50f-35b7-441d-a630-0655b4c1cd00-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06f0e50f-35b7-441d-a630-0655b4c1cd00" (UID: "06f0e50f-35b7-441d-a630-0655b4c1cd00"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.994735 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7dc58bc884-khdbv" Dec 05 12:11:14 crc kubenswrapper[4763]: I1205 12:11:14.994733 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7dc58bc884-khdbv" event={"ID":"4ba0cbf0-3e4e-4cb0-82b0-179d11937330","Type":"ContainerDied","Data":"6d3054201aea8506c8e8e9a592fcffa7d9479ebf438cfafd9f83299a7f88a265"} Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.002549 4763 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.005041 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e680db34-6f7c-4e72-8015-368c51bb34b0","Type":"ContainerDied","Data":"35d483e9bf8d24b0e940c41602403f042b67ff7ba28ded520e48cc008f6e069c"} Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.005119 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.028638 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-55e0-account-create-update-kmpcw" podStartSLOduration=4.028614731 podStartE2EDuration="4.028614731s" podCreationTimestamp="2025-12-05 12:11:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:11:14.931386262 +0000 UTC m=+1359.424100985" watchObservedRunningTime="2025-12-05 12:11:15.028614731 +0000 UTC m=+1359.521329464" Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.051182 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06f0e50f-35b7-441d-a630-0655b4c1cd00-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "06f0e50f-35b7-441d-a630-0655b4c1cd00" (UID: "06f0e50f-35b7-441d-a630-0655b4c1cd00"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.061113 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b6c948c7-7tdkh" Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.062581 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.062806 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"06f0e50f-35b7-441d-a630-0655b4c1cd00","Type":"ContainerDied","Data":"d29a6f5cae9788281c3661b53406b36706c72b23ac49b60e6e384e1357c6d39b"} Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.077444 4763 scope.go:117] "RemoveContainer" containerID="59020d82ff3f772c426279e952d15a49ac261b9a9407e2203d18e354d832e051" Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.079183 4763 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/06f0e50f-35b7-441d-a630-0655b4c1cd00-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.079202 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06f0e50f-35b7-441d-a630-0655b4c1cd00-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.079212 4763 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:15 crc kubenswrapper[4763]: E1205 12:11:15.079289 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59020d82ff3f772c426279e952d15a49ac261b9a9407e2203d18e354d832e051\": container with ID starting with 59020d82ff3f772c426279e952d15a49ac261b9a9407e2203d18e354d832e051 not found: ID does not exist" containerID="59020d82ff3f772c426279e952d15a49ac261b9a9407e2203d18e354d832e051" Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.079311 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59020d82ff3f772c426279e952d15a49ac261b9a9407e2203d18e354d832e051"} err="failed to get container status \"59020d82ff3f772c426279e952d15a49ac261b9a9407e2203d18e354d832e051\": rpc error: code = NotFound desc = could not find container \"59020d82ff3f772c426279e952d15a49ac261b9a9407e2203d18e354d832e051\": container with ID starting with 59020d82ff3f772c426279e952d15a49ac261b9a9407e2203d18e354d832e051 not found: ID does not exist" Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.079331 4763 scope.go:117] "RemoveContainer" containerID="db7f9809d3f9be8f6f6d0b3eccb2f6c7b682e2f5b570dcc420bac86ce64e99f5" Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.137466 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-8n8v7"] Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.146336 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06f0e50f-35b7-441d-a630-0655b4c1cd00-config-data" (OuterVolumeSpecName: "config-data") pod "06f0e50f-35b7-441d-a630-0655b4c1cd00" (UID: "06f0e50f-35b7-441d-a630-0655b4c1cd00"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.193215 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-6qwfq"] Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.194604 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06f0e50f-35b7-441d-a630-0655b4c1cd00-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.220575 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.250155 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.260102 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 12:11:15 crc kubenswrapper[4763]: E1205 12:11:15.267708 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c917b4b-cb2c-4d9e-91b8-5fa2f1f61975" containerName="kube-state-metrics" Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.268068 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c917b4b-cb2c-4d9e-91b8-5fa2f1f61975" containerName="kube-state-metrics" Dec 05 12:11:15 crc kubenswrapper[4763]: E1205 12:11:15.268105 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06f0e50f-35b7-441d-a630-0655b4c1cd00" containerName="glance-log" Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.268212 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="06f0e50f-35b7-441d-a630-0655b4c1cd00" containerName="glance-log" Dec 05 12:11:15 crc kubenswrapper[4763]: E1205 12:11:15.268229 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06f0e50f-35b7-441d-a630-0655b4c1cd00" containerName="glance-httpd" Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.268239 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="06f0e50f-35b7-441d-a630-0655b4c1cd00" containerName="glance-httpd" Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.268962 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="06f0e50f-35b7-441d-a630-0655b4c1cd00" containerName="glance-log" Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.268985 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="06f0e50f-35b7-441d-a630-0655b4c1cd00" containerName="glance-httpd" Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.269103 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c917b4b-cb2c-4d9e-91b8-5fa2f1f61975" containerName="kube-state-metrics" Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.271351 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-j9pcn"] Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.271598 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.274791 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.275062 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.288697 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.306085 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-3545-account-create-update-k6q46"] Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.324826 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5c4499b47f-s4mh4"] Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.400379 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfps2\" (UniqueName: \"kubernetes.io/projected/4521fb51-39ad-4717-8239-8d2a759d4a30-kube-api-access-bfps2\") pod \"kube-state-metrics-0\" (UID: \"4521fb51-39ad-4717-8239-8d2a759d4a30\") " pod="openstack/kube-state-metrics-0" Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.400480 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4521fb51-39ad-4717-8239-8d2a759d4a30-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4521fb51-39ad-4717-8239-8d2a759d4a30\") " pod="openstack/kube-state-metrics-0" Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.400552 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4521fb51-39ad-4717-8239-8d2a759d4a30-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4521fb51-39ad-4717-8239-8d2a759d4a30\") " pod="openstack/kube-state-metrics-0" Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.400582 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4521fb51-39ad-4717-8239-8d2a759d4a30-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4521fb51-39ad-4717-8239-8d2a759d4a30\") " pod="openstack/kube-state-metrics-0" Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.510108 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfps2\" (UniqueName: \"kubernetes.io/projected/4521fb51-39ad-4717-8239-8d2a759d4a30-kube-api-access-bfps2\") pod \"kube-state-metrics-0\" (UID: \"4521fb51-39ad-4717-8239-8d2a759d4a30\") " pod="openstack/kube-state-metrics-0" Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.510543 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4521fb51-39ad-4717-8239-8d2a759d4a30-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4521fb51-39ad-4717-8239-8d2a759d4a30\") " pod="openstack/kube-state-metrics-0" Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.510629 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/4521fb51-39ad-4717-8239-8d2a759d4a30-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4521fb51-39ad-4717-8239-8d2a759d4a30\") " pod="openstack/kube-state-metrics-0" Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.510676 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4521fb51-39ad-4717-8239-8d2a759d4a30-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4521fb51-39ad-4717-8239-8d2a759d4a30\") " pod="openstack/kube-state-metrics-0" Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.558564 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4521fb51-39ad-4717-8239-8d2a759d4a30-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4521fb51-39ad-4717-8239-8d2a759d4a30\") " pod="openstack/kube-state-metrics-0" Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.558633 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4521fb51-39ad-4717-8239-8d2a759d4a30-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4521fb51-39ad-4717-8239-8d2a759d4a30\") " pod="openstack/kube-state-metrics-0" Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.559000 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4521fb51-39ad-4717-8239-8d2a759d4a30-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4521fb51-39ad-4717-8239-8d2a759d4a30\") " pod="openstack/kube-state-metrics-0" Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.572927 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfps2\" (UniqueName: \"kubernetes.io/projected/4521fb51-39ad-4717-8239-8d2a759d4a30-kube-api-access-bfps2\") pod \"kube-state-metrics-0\" (UID: \"4521fb51-39ad-4717-8239-8d2a759d4a30\") " pod="openstack/kube-state-metrics-0" Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.806556 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="387bbd61-fa4c-49e7-b5b8-d7c0b56e8024" path="/var/lib/kubelet/pods/387bbd61-fa4c-49e7-b5b8-d7c0b56e8024/volumes" Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.807474 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c917b4b-cb2c-4d9e-91b8-5fa2f1f61975" path="/var/lib/kubelet/pods/6c917b4b-cb2c-4d9e-91b8-5fa2f1f61975/volumes" Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.808281 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.935718 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.945074 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.968025 4763 scope.go:117] "RemoveContainer" containerID="0f9fce0cd10290e14fb000bbba92b5ee8865c914d6ce43d7be912c918930bcb6" Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.992245 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 12:11:15 crc kubenswrapper[4763]: I1205 12:11:15.995084 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.003232 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-5tbdp" Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.003406 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.003533 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.003677 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.003912 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.004628 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.011633 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b6c948c7-7tdkh"] Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.025488 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b6c948c7-7tdkh"] Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.041098 4763 scope.go:117] "RemoveContainer" containerID="13075e4233d39d8a765a3b36d2e3a81e1ec2314c209cc3d5ab6cff59152f353f" Dec 05 12:11:16 crc kubenswrapper[4763]: W1205 12:11:16.047492 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4aecdd42_0f16_4f9a_91d4_3512b21b3763.slice/crio-e576789f4e0172914bcbb54bc3a1017b4d88d579c2637c3126ecf165fe860801 WatchSource:0}: Error finding container e576789f4e0172914bcbb54bc3a1017b4d88d579c2637c3126ecf165fe860801: Status 404 returned error can't find the container with id e576789f4e0172914bcbb54bc3a1017b4d88d579c2637c3126ecf165fe860801 Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.085678 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1309-account-create-update-55ht7" event={"ID":"035ebd9d-2632-4e8c-9912-bf071d4a02e6","Type":"ContainerStarted","Data":"a43a37d69c76d61ef2632bfd2f63dede2f48de5c3b4eadb082785384db7a68c7"} Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.087754 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5c4499b47f-s4mh4" event={"ID":"fe2a82f8-601f-42ea-a495-4d1a03084267","Type":"ContainerStarted","Data":"b98275a6f1bb168a8dd309b77eb0ab3d2f29cf404151add11e0130098a2d58c8"} Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.089494 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6qwfq" event={"ID":"0ba97fe5-94ae-43d2-b059-524ead71f164","Type":"ContainerStarted","Data":"f042d336b5bb0fab62d705476f46b268001cf0f988c7c027563d606d59bdf03a"} Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.091012 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-8n8v7" event={"ID":"5c8be5bc-66ad-46a2-867d-965dc226273a","Type":"ContainerStarted","Data":"07d98724894f3b4a1917e72b11c3ff522b74af2240e8334c8934fa08f60feed9"} Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.092593 4763 generic.go:334] "Generic (PLEG): container finished" 
podID="2e08d98c-50ec-4dd9-a454-5cd1c19c4067" containerID="17c8faa85f0b36d302516e57a0aae40135ec8ca871dc1750896db9d4b6734031" exitCode=0 Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.092636 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-55e0-account-create-update-kmpcw" event={"ID":"2e08d98c-50ec-4dd9-a454-5cd1c19c4067","Type":"ContainerDied","Data":"17c8faa85f0b36d302516e57a0aae40135ec8ca871dc1750896db9d4b6734031"} Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.098322 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-j9pcn" event={"ID":"aec4d7da-472f-449f-8ef2-0515e74f614a","Type":"ContainerStarted","Data":"6b677fb7e0f42749ade36905d9e55c36f1624e9227bd1f66caa1e1817df51387"} Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.100001 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3545-account-create-update-k6q46" event={"ID":"ab9820ac-9442-45a8-9407-d2abab068843","Type":"ContainerStarted","Data":"3f2b5da7db5a78dcf7e5ce6c6d443edbe29acdb85114691757a042cdcfc405bf"} Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.103909 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4aecdd42-0f16-4f9a-91d4-3512b21b3763","Type":"ContainerStarted","Data":"e576789f4e0172914bcbb54bc3a1017b4d88d579c2637c3126ecf165fe860801"} Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.107135 4763 scope.go:117] "RemoveContainer" containerID="9ee7f0116aec2115bfd52c5b0bd68412ad360aa2366959820ade81f6c5c53e48" Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.126852 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cdedbb4d-c325-420f-946f-942359580cfe-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cdedbb4d-c325-420f-946f-942359580cfe\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.126898 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdedbb4d-c325-420f-946f-942359580cfe-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cdedbb4d-c325-420f-946f-942359580cfe\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.126940 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmcs2\" (UniqueName: \"kubernetes.io/projected/cdedbb4d-c325-420f-946f-942359580cfe-kube-api-access-zmcs2\") pod \"glance-default-internal-api-0\" (UID: \"cdedbb4d-c325-420f-946f-942359580cfe\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.127049 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdedbb4d-c325-420f-946f-942359580cfe-logs\") pod \"glance-default-internal-api-0\" (UID: \"cdedbb4d-c325-420f-946f-942359580cfe\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.128084 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdedbb4d-c325-420f-946f-942359580cfe-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"cdedbb4d-c325-420f-946f-942359580cfe\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.128467 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdedbb4d-c325-420f-946f-942359580cfe-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cdedbb4d-c325-420f-946f-942359580cfe\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.128510 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdedbb4d-c325-420f-946f-942359580cfe-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cdedbb4d-c325-420f-946f-942359580cfe\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.128692 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"cdedbb4d-c325-420f-946f-942359580cfe\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.138795 4763 scope.go:117] "RemoveContainer" containerID="7442313666db8517b1aed2091d3065563ee73742e77582ff15eae44d27a742bb" Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.185097 4763 scope.go:117] "RemoveContainer" containerID="fada1b6bd3de60187d25cd6ce64a503d42a5a289412a9ad93e63575c6aecf799" Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.232306 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"cdedbb4d-c325-420f-946f-942359580cfe\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.232770 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"cdedbb4d-c325-420f-946f-942359580cfe\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.249990 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cdedbb4d-c325-420f-946f-942359580cfe-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cdedbb4d-c325-420f-946f-942359580cfe\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.250028 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdedbb4d-c325-420f-946f-942359580cfe-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cdedbb4d-c325-420f-946f-942359580cfe\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.250089 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmcs2\" (UniqueName: \"kubernetes.io/projected/cdedbb4d-c325-420f-946f-942359580cfe-kube-api-access-zmcs2\") pod \"glance-default-internal-api-0\" (UID: \"cdedbb4d-c325-420f-946f-942359580cfe\") " 
pod="openstack/glance-default-internal-api-0" Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.250166 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdedbb4d-c325-420f-946f-942359580cfe-logs\") pod \"glance-default-internal-api-0\" (UID: \"cdedbb4d-c325-420f-946f-942359580cfe\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.250218 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdedbb4d-c325-420f-946f-942359580cfe-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cdedbb4d-c325-420f-946f-942359580cfe\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.250303 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdedbb4d-c325-420f-946f-942359580cfe-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cdedbb4d-c325-420f-946f-942359580cfe\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.250337 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdedbb4d-c325-420f-946f-942359580cfe-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cdedbb4d-c325-420f-946f-942359580cfe\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.254686 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cdedbb4d-c325-420f-946f-942359580cfe-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cdedbb4d-c325-420f-946f-942359580cfe\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.254985 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdedbb4d-c325-420f-946f-942359580cfe-logs\") pod \"glance-default-internal-api-0\" (UID: \"cdedbb4d-c325-420f-946f-942359580cfe\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.262054 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdedbb4d-c325-420f-946f-942359580cfe-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cdedbb4d-c325-420f-946f-942359580cfe\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.263138 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdedbb4d-c325-420f-946f-942359580cfe-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cdedbb4d-c325-420f-946f-942359580cfe\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.263578 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdedbb4d-c325-420f-946f-942359580cfe-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cdedbb4d-c325-420f-946f-942359580cfe\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.265744 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdedbb4d-c325-420f-946f-942359580cfe-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cdedbb4d-c325-420f-946f-942359580cfe\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.308728 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmcs2\" (UniqueName: \"kubernetes.io/projected/cdedbb4d-c325-420f-946f-942359580cfe-kube-api-access-zmcs2\") pod \"glance-default-internal-api-0\" (UID: \"cdedbb4d-c325-420f-946f-942359580cfe\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.331908 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"cdedbb4d-c325-420f-946f-942359580cfe\") " pod="openstack/glance-default-internal-api-0" Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.396342 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 12:11:16 crc kubenswrapper[4763]: I1205 12:11:16.690452 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 12:11:16 crc kubenswrapper[4763]: W1205 12:11:16.747465 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4521fb51_39ad_4717_8239_8d2a759d4a30.slice/crio-cdad850e3d36fe102984280be75d04a0eb9f8bdb9e0c373efec26fb63137fc09 WatchSource:0}: Error finding container cdad850e3d36fe102984280be75d04a0eb9f8bdb9e0c373efec26fb63137fc09: Status 404 returned error can't find the container with id cdad850e3d36fe102984280be75d04a0eb9f8bdb9e0c373efec26fb63137fc09 Dec 05 12:11:17 crc kubenswrapper[4763]: I1205 12:11:17.045940 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 12:11:17 crc kubenswrapper[4763]: I1205 12:11:17.126230 4763 generic.go:334] "Generic (PLEG): container finished" podID="aec4d7da-472f-449f-8ef2-0515e74f614a" containerID="cf1f7f2890fe946d91809df395e226ff714951c2e65eb1fb669a67743b62a238" exitCode=0 Dec 05 12:11:17 crc kubenswrapper[4763]: I1205 12:11:17.126293 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-j9pcn" event={"ID":"aec4d7da-472f-449f-8ef2-0515e74f614a","Type":"ContainerDied","Data":"cf1f7f2890fe946d91809df395e226ff714951c2e65eb1fb669a67743b62a238"} Dec 05 12:11:17 crc kubenswrapper[4763]: I1205 12:11:17.128797 4763 generic.go:334] "Generic (PLEG): container finished" podID="0ba97fe5-94ae-43d2-b059-524ead71f164" containerID="18f458293751772312c57f261e262add7679e4dc1d64c7bc01771dc598bcbdf5" exitCode=0 Dec 05 12:11:17 crc kubenswrapper[4763]: I1205 12:11:17.128875 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6qwfq" event={"ID":"0ba97fe5-94ae-43d2-b059-524ead71f164","Type":"ContainerDied","Data":"18f458293751772312c57f261e262add7679e4dc1d64c7bc01771dc598bcbdf5"} Dec 05 12:11:17 crc kubenswrapper[4763]: I1205 12:11:17.133720 4763 generic.go:334] "Generic (PLEG): container finished" podID="5c8be5bc-66ad-46a2-867d-965dc226273a" containerID="aa2d37d003ec398bf7442e39cc4b9c3ef6d25d10bce17106a8f4e87404bdfadd" exitCode=0 Dec 05 12:11:17 crc kubenswrapper[4763]: I1205 12:11:17.133829 4763 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell1-db-create-8n8v7" event={"ID":"5c8be5bc-66ad-46a2-867d-965dc226273a","Type":"ContainerDied","Data":"aa2d37d003ec398bf7442e39cc4b9c3ef6d25d10bce17106a8f4e87404bdfadd"} Dec 05 12:11:17 crc kubenswrapper[4763]: I1205 12:11:17.149477 4763 generic.go:334] "Generic (PLEG): container finished" podID="ab9820ac-9442-45a8-9407-d2abab068843" containerID="f4a411ab04d9ba77a8a5e5ea540d662384dde9c8fb0dacec6de6ff7955d78ced" exitCode=0 Dec 05 12:11:17 crc kubenswrapper[4763]: I1205 12:11:17.149687 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3545-account-create-update-k6q46" event={"ID":"ab9820ac-9442-45a8-9407-d2abab068843","Type":"ContainerDied","Data":"f4a411ab04d9ba77a8a5e5ea540d662384dde9c8fb0dacec6de6ff7955d78ced"} Dec 05 12:11:17 crc kubenswrapper[4763]: I1205 12:11:17.159876 4763 generic.go:334] "Generic (PLEG): container finished" podID="035ebd9d-2632-4e8c-9912-bf071d4a02e6" containerID="9262c554aa59ae3935c97d723ea55580817b2ae355f9017617a1c80b9275dffa" exitCode=0 Dec 05 12:11:17 crc kubenswrapper[4763]: I1205 12:11:17.160057 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1309-account-create-update-55ht7" event={"ID":"035ebd9d-2632-4e8c-9912-bf071d4a02e6","Type":"ContainerDied","Data":"9262c554aa59ae3935c97d723ea55580817b2ae355f9017617a1c80b9275dffa"} Dec 05 12:11:17 crc kubenswrapper[4763]: I1205 12:11:17.165746 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4aecdd42-0f16-4f9a-91d4-3512b21b3763","Type":"ContainerStarted","Data":"5f3f7453d0c45f14c208cbc9f6364423283f1e45c4de1319f57567d584ccc7fd"} Dec 05 12:11:17 crc kubenswrapper[4763]: I1205 12:11:17.169431 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5c4499b47f-s4mh4" event={"ID":"fe2a82f8-601f-42ea-a495-4d1a03084267","Type":"ContainerStarted","Data":"5a2bd0a4d3464ccdbe54b1daf0ed49971fcfd00adebf4a9e07fdd257a602a37b"} Dec 05 12:11:17 crc kubenswrapper[4763]: I1205 12:11:17.169475 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5c4499b47f-s4mh4" event={"ID":"fe2a82f8-601f-42ea-a495-4d1a03084267","Type":"ContainerStarted","Data":"801682dcb54b6d017cef0ba979099188aab37fc5c4027d5e0dbe64594190e5fb"} Dec 05 12:11:17 crc kubenswrapper[4763]: I1205 12:11:17.170273 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5c4499b47f-s4mh4" Dec 05 12:11:17 crc kubenswrapper[4763]: I1205 12:11:17.171152 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5c4499b47f-s4mh4" Dec 05 12:11:17 crc kubenswrapper[4763]: I1205 12:11:17.174910 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cdedbb4d-c325-420f-946f-942359580cfe","Type":"ContainerStarted","Data":"d32ea36d34eb9b5b5e3fc0e0d84a28faa2227ecf3535c7629c868d52c09c8a61"} Dec 05 12:11:17 crc kubenswrapper[4763]: I1205 12:11:17.200275 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4521fb51-39ad-4717-8239-8d2a759d4a30","Type":"ContainerStarted","Data":"cdad850e3d36fe102984280be75d04a0eb9f8bdb9e0c373efec26fb63137fc09"} Dec 05 12:11:17 crc kubenswrapper[4763]: I1205 12:11:17.218736 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5c4499b47f-s4mh4" podStartSLOduration=12.21871483 podStartE2EDuration="12.21871483s" 
podCreationTimestamp="2025-12-05 12:11:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:11:17.191918109 +0000 UTC m=+1361.684632862" watchObservedRunningTime="2025-12-05 12:11:17.21871483 +0000 UTC m=+1361.711429553" Dec 05 12:11:17 crc kubenswrapper[4763]: I1205 12:11:17.341131 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 12:11:17 crc kubenswrapper[4763]: I1205 12:11:17.814977 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2" path="/var/lib/kubelet/pods/c5f5e3fa-8226-4730-9a1a-899e0a2ab5e2/volumes" Dec 05 12:11:17 crc kubenswrapper[4763]: I1205 12:11:17.815892 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e680db34-6f7c-4e72-8015-368c51bb34b0" path="/var/lib/kubelet/pods/e680db34-6f7c-4e72-8015-368c51bb34b0/volumes" Dec 05 12:11:17 crc kubenswrapper[4763]: I1205 12:11:17.873415 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-55e0-account-create-update-kmpcw" Dec 05 12:11:18 crc kubenswrapper[4763]: I1205 12:11:18.011399 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e08d98c-50ec-4dd9-a454-5cd1c19c4067-operator-scripts\") pod \"2e08d98c-50ec-4dd9-a454-5cd1c19c4067\" (UID: \"2e08d98c-50ec-4dd9-a454-5cd1c19c4067\") " Dec 05 12:11:18 crc kubenswrapper[4763]: I1205 12:11:18.011534 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rclsn\" (UniqueName: \"kubernetes.io/projected/2e08d98c-50ec-4dd9-a454-5cd1c19c4067-kube-api-access-rclsn\") pod \"2e08d98c-50ec-4dd9-a454-5cd1c19c4067\" (UID: \"2e08d98c-50ec-4dd9-a454-5cd1c19c4067\") " Dec 05 12:11:18 crc kubenswrapper[4763]: I1205 12:11:18.013024 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e08d98c-50ec-4dd9-a454-5cd1c19c4067-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2e08d98c-50ec-4dd9-a454-5cd1c19c4067" (UID: "2e08d98c-50ec-4dd9-a454-5cd1c19c4067"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:11:18 crc kubenswrapper[4763]: I1205 12:11:18.113963 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e08d98c-50ec-4dd9-a454-5cd1c19c4067-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:18 crc kubenswrapper[4763]: I1205 12:11:18.182016 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e08d98c-50ec-4dd9-a454-5cd1c19c4067-kube-api-access-rclsn" (OuterVolumeSpecName: "kube-api-access-rclsn") pod "2e08d98c-50ec-4dd9-a454-5cd1c19c4067" (UID: "2e08d98c-50ec-4dd9-a454-5cd1c19c4067"). InnerVolumeSpecName "kube-api-access-rclsn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:11:18 crc kubenswrapper[4763]: I1205 12:11:18.217692 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rclsn\" (UniqueName: \"kubernetes.io/projected/2e08d98c-50ec-4dd9-a454-5cd1c19c4067-kube-api-access-rclsn\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:18 crc kubenswrapper[4763]: I1205 12:11:18.219425 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4521fb51-39ad-4717-8239-8d2a759d4a30","Type":"ContainerStarted","Data":"3b2bc7ae74df3121763c84585412157a5730e0741ba892886b651f53ef81d11e"} Dec 05 12:11:18 crc kubenswrapper[4763]: I1205 12:11:18.219489 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 05 12:11:18 crc kubenswrapper[4763]: I1205 12:11:18.229084 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-55e0-account-create-update-kmpcw" event={"ID":"2e08d98c-50ec-4dd9-a454-5cd1c19c4067","Type":"ContainerDied","Data":"225ecf73884fa9f44e66f77a0bc29c84342e242d5560acf13d1a803505ae36bc"} Dec 05 12:11:18 crc kubenswrapper[4763]: I1205 12:11:18.229115 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="225ecf73884fa9f44e66f77a0bc29c84342e242d5560acf13d1a803505ae36bc" Dec 05 12:11:18 crc kubenswrapper[4763]: I1205 12:11:18.229210 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-55e0-account-create-update-kmpcw" Dec 05 12:11:18 crc kubenswrapper[4763]: I1205 12:11:18.261380 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.86469369 podStartE2EDuration="4.261357742s" podCreationTimestamp="2025-12-05 12:11:14 +0000 UTC" firstStartedPulling="2025-12-05 12:11:16.752328028 +0000 UTC m=+1361.245042751" lastFinishedPulling="2025-12-05 12:11:17.14899208 +0000 UTC m=+1361.641706803" observedRunningTime="2025-12-05 12:11:18.241488394 +0000 UTC m=+1362.734203137" watchObservedRunningTime="2025-12-05 12:11:18.261357742 +0000 UTC m=+1362.754072495" Dec 05 12:11:18 crc kubenswrapper[4763]: I1205 12:11:18.270161 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cdedbb4d-c325-420f-946f-942359580cfe","Type":"ContainerStarted","Data":"508488918f4096aa27c345acecd526fa7001017df13ce085bbfa3bb1bdd1bbbd"} Dec 05 12:11:18 crc kubenswrapper[4763]: I1205 12:11:18.755096 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-j9pcn" Dec 05 12:11:18 crc kubenswrapper[4763]: I1205 12:11:18.844542 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccwrt\" (UniqueName: \"kubernetes.io/projected/aec4d7da-472f-449f-8ef2-0515e74f614a-kube-api-access-ccwrt\") pod \"aec4d7da-472f-449f-8ef2-0515e74f614a\" (UID: \"aec4d7da-472f-449f-8ef2-0515e74f614a\") " Dec 05 12:11:18 crc kubenswrapper[4763]: I1205 12:11:18.844988 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aec4d7da-472f-449f-8ef2-0515e74f614a-operator-scripts\") pod \"aec4d7da-472f-449f-8ef2-0515e74f614a\" (UID: \"aec4d7da-472f-449f-8ef2-0515e74f614a\") " Dec 05 12:11:18 crc kubenswrapper[4763]: I1205 12:11:18.845990 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aec4d7da-472f-449f-8ef2-0515e74f614a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aec4d7da-472f-449f-8ef2-0515e74f614a" (UID: "aec4d7da-472f-449f-8ef2-0515e74f614a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:11:18 crc kubenswrapper[4763]: I1205 12:11:18.861567 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aec4d7da-472f-449f-8ef2-0515e74f614a-kube-api-access-ccwrt" (OuterVolumeSpecName: "kube-api-access-ccwrt") pod "aec4d7da-472f-449f-8ef2-0515e74f614a" (UID: "aec4d7da-472f-449f-8ef2-0515e74f614a"). InnerVolumeSpecName "kube-api-access-ccwrt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:11:18 crc kubenswrapper[4763]: I1205 12:11:18.949593 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aec4d7da-472f-449f-8ef2-0515e74f614a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:18 crc kubenswrapper[4763]: I1205 12:11:18.949621 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccwrt\" (UniqueName: \"kubernetes.io/projected/aec4d7da-472f-449f-8ef2-0515e74f614a-kube-api-access-ccwrt\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.010460 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-3545-account-create-update-k6q46" Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.024894 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1309-account-create-update-55ht7" Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.059305 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-6qwfq" Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.070823 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-8n8v7" Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.160655 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46vdn\" (UniqueName: \"kubernetes.io/projected/5c8be5bc-66ad-46a2-867d-965dc226273a-kube-api-access-46vdn\") pod \"5c8be5bc-66ad-46a2-867d-965dc226273a\" (UID: \"5c8be5bc-66ad-46a2-867d-965dc226273a\") " Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.162048 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tl2js\" (UniqueName: \"kubernetes.io/projected/035ebd9d-2632-4e8c-9912-bf071d4a02e6-kube-api-access-tl2js\") pod \"035ebd9d-2632-4e8c-9912-bf071d4a02e6\" (UID: \"035ebd9d-2632-4e8c-9912-bf071d4a02e6\") " Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.162264 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmmqd\" (UniqueName: \"kubernetes.io/projected/ab9820ac-9442-45a8-9407-d2abab068843-kube-api-access-rmmqd\") pod \"ab9820ac-9442-45a8-9407-d2abab068843\" (UID: \"ab9820ac-9442-45a8-9407-d2abab068843\") " Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.162530 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ba97fe5-94ae-43d2-b059-524ead71f164-operator-scripts\") pod \"0ba97fe5-94ae-43d2-b059-524ead71f164\" (UID: \"0ba97fe5-94ae-43d2-b059-524ead71f164\") " Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.162715 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab9820ac-9442-45a8-9407-d2abab068843-operator-scripts\") pod \"ab9820ac-9442-45a8-9407-d2abab068843\" (UID: \"ab9820ac-9442-45a8-9407-d2abab068843\") " Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.162968 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c8be5bc-66ad-46a2-867d-965dc226273a-operator-scripts\") pod \"5c8be5bc-66ad-46a2-867d-965dc226273a\" (UID: \"5c8be5bc-66ad-46a2-867d-965dc226273a\") " Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.163095 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/035ebd9d-2632-4e8c-9912-bf071d4a02e6-operator-scripts\") pod \"035ebd9d-2632-4e8c-9912-bf071d4a02e6\" (UID: \"035ebd9d-2632-4e8c-9912-bf071d4a02e6\") " Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.163288 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldq2z\" (UniqueName: \"kubernetes.io/projected/0ba97fe5-94ae-43d2-b059-524ead71f164-kube-api-access-ldq2z\") pod \"0ba97fe5-94ae-43d2-b059-524ead71f164\" (UID: \"0ba97fe5-94ae-43d2-b059-524ead71f164\") " Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.164258 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c8be5bc-66ad-46a2-867d-965dc226273a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5c8be5bc-66ad-46a2-867d-965dc226273a" (UID: "5c8be5bc-66ad-46a2-867d-965dc226273a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.164585 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ba97fe5-94ae-43d2-b059-524ead71f164-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0ba97fe5-94ae-43d2-b059-524ead71f164" (UID: "0ba97fe5-94ae-43d2-b059-524ead71f164"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.165109 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/035ebd9d-2632-4e8c-9912-bf071d4a02e6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "035ebd9d-2632-4e8c-9912-bf071d4a02e6" (UID: "035ebd9d-2632-4e8c-9912-bf071d4a02e6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.165804 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab9820ac-9442-45a8-9407-d2abab068843-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ab9820ac-9442-45a8-9407-d2abab068843" (UID: "ab9820ac-9442-45a8-9407-d2abab068843"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.172977 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c8be5bc-66ad-46a2-867d-965dc226273a-kube-api-access-46vdn" (OuterVolumeSpecName: "kube-api-access-46vdn") pod "5c8be5bc-66ad-46a2-867d-965dc226273a" (UID: "5c8be5bc-66ad-46a2-867d-965dc226273a"). InnerVolumeSpecName "kube-api-access-46vdn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.173385 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ba97fe5-94ae-43d2-b059-524ead71f164-kube-api-access-ldq2z" (OuterVolumeSpecName: "kube-api-access-ldq2z") pod "0ba97fe5-94ae-43d2-b059-524ead71f164" (UID: "0ba97fe5-94ae-43d2-b059-524ead71f164"). InnerVolumeSpecName "kube-api-access-ldq2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.178745 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/035ebd9d-2632-4e8c-9912-bf071d4a02e6-kube-api-access-tl2js" (OuterVolumeSpecName: "kube-api-access-tl2js") pod "035ebd9d-2632-4e8c-9912-bf071d4a02e6" (UID: "035ebd9d-2632-4e8c-9912-bf071d4a02e6"). InnerVolumeSpecName "kube-api-access-tl2js". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.195885 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab9820ac-9442-45a8-9407-d2abab068843-kube-api-access-rmmqd" (OuterVolumeSpecName: "kube-api-access-rmmqd") pod "ab9820ac-9442-45a8-9407-d2abab068843" (UID: "ab9820ac-9442-45a8-9407-d2abab068843"). InnerVolumeSpecName "kube-api-access-rmmqd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.267344 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmmqd\" (UniqueName: \"kubernetes.io/projected/ab9820ac-9442-45a8-9407-d2abab068843-kube-api-access-rmmqd\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.267377 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ba97fe5-94ae-43d2-b059-524ead71f164-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.267423 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab9820ac-9442-45a8-9407-d2abab068843-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.267433 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c8be5bc-66ad-46a2-867d-965dc226273a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.267444 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/035ebd9d-2632-4e8c-9912-bf071d4a02e6-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.267453 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldq2z\" (UniqueName: \"kubernetes.io/projected/0ba97fe5-94ae-43d2-b059-524ead71f164-kube-api-access-ldq2z\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.267462 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46vdn\" (UniqueName: \"kubernetes.io/projected/5c8be5bc-66ad-46a2-867d-965dc226273a-kube-api-access-46vdn\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.267484 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tl2js\" (UniqueName: \"kubernetes.io/projected/035ebd9d-2632-4e8c-9912-bf071d4a02e6-kube-api-access-tl2js\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.279044 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-8n8v7" event={"ID":"5c8be5bc-66ad-46a2-867d-965dc226273a","Type":"ContainerDied","Data":"07d98724894f3b4a1917e72b11c3ff522b74af2240e8334c8934fa08f60feed9"} Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.279291 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07d98724894f3b4a1917e72b11c3ff522b74af2240e8334c8934fa08f60feed9" Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.279410 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-8n8v7" Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.280909 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-j9pcn" event={"ID":"aec4d7da-472f-449f-8ef2-0515e74f614a","Type":"ContainerDied","Data":"6b677fb7e0f42749ade36905d9e55c36f1624e9227bd1f66caa1e1817df51387"} Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.280957 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b677fb7e0f42749ade36905d9e55c36f1624e9227bd1f66caa1e1817df51387" Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.280978 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-j9pcn" Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.286090 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3545-account-create-update-k6q46" event={"ID":"ab9820ac-9442-45a8-9407-d2abab068843","Type":"ContainerDied","Data":"3f2b5da7db5a78dcf7e5ce6c6d443edbe29acdb85114691757a042cdcfc405bf"} Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.286123 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-3545-account-create-update-k6q46" Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.286147 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f2b5da7db5a78dcf7e5ce6c6d443edbe29acdb85114691757a042cdcfc405bf" Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.288749 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1309-account-create-update-55ht7" event={"ID":"035ebd9d-2632-4e8c-9912-bf071d4a02e6","Type":"ContainerDied","Data":"a43a37d69c76d61ef2632bfd2f63dede2f48de5c3b4eadb082785384db7a68c7"} Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.288801 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a43a37d69c76d61ef2632bfd2f63dede2f48de5c3b4eadb082785384db7a68c7" Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.288923 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1309-account-create-update-55ht7" Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.296034 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4aecdd42-0f16-4f9a-91d4-3512b21b3763","Type":"ContainerStarted","Data":"bcdf7f43b9c4f2c24523d14abb30c7e2d9b225fd8a89565bfb100aae400014c8"} Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.300731 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-6qwfq" Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.300789 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6qwfq" event={"ID":"0ba97fe5-94ae-43d2-b059-524ead71f164","Type":"ContainerDied","Data":"f042d336b5bb0fab62d705476f46b268001cf0f988c7c027563d606d59bdf03a"} Dec 05 12:11:19 crc kubenswrapper[4763]: I1205 12:11:19.300815 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f042d336b5bb0fab62d705476f46b268001cf0f988c7c027563d606d59bdf03a" Dec 05 12:11:20 crc kubenswrapper[4763]: I1205 12:11:20.311016 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cdedbb4d-c325-420f-946f-942359580cfe","Type":"ContainerStarted","Data":"da086e84323af35d1d1f399fe1850b298d35eb998f2effa2622b2cc337c3d9f1"} Dec 05 12:11:21 crc kubenswrapper[4763]: I1205 12:11:21.718316 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.7182986190000005 podStartE2EDuration="6.718298619s" podCreationTimestamp="2025-12-05 12:11:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:11:20.3333185 +0000 UTC m=+1364.826033243" watchObservedRunningTime="2025-12-05 12:11:21.718298619 +0000 UTC m=+1366.211013342" Dec 05 12:11:21 crc kubenswrapper[4763]: I1205 12:11:21.726919 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kf95k"] Dec 05 12:11:21 crc kubenswrapper[4763]: E1205 12:11:21.727394 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="035ebd9d-2632-4e8c-9912-bf071d4a02e6" containerName="mariadb-account-create-update" Dec 05 12:11:21 crc kubenswrapper[4763]: I1205 12:11:21.727415 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="035ebd9d-2632-4e8c-9912-bf071d4a02e6" containerName="mariadb-account-create-update" Dec 05 12:11:21 crc kubenswrapper[4763]: E1205 12:11:21.727432 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e08d98c-50ec-4dd9-a454-5cd1c19c4067" containerName="mariadb-account-create-update" Dec 05 12:11:21 crc kubenswrapper[4763]: I1205 12:11:21.727440 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e08d98c-50ec-4dd9-a454-5cd1c19c4067" containerName="mariadb-account-create-update" Dec 05 12:11:21 crc kubenswrapper[4763]: E1205 12:11:21.727464 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c8be5bc-66ad-46a2-867d-965dc226273a" containerName="mariadb-database-create" Dec 05 12:11:21 crc kubenswrapper[4763]: I1205 12:11:21.727473 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c8be5bc-66ad-46a2-867d-965dc226273a" containerName="mariadb-database-create" Dec 05 12:11:21 crc kubenswrapper[4763]: E1205 12:11:21.727492 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aec4d7da-472f-449f-8ef2-0515e74f614a" containerName="mariadb-database-create" Dec 05 12:11:21 crc kubenswrapper[4763]: I1205 12:11:21.727502 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="aec4d7da-472f-449f-8ef2-0515e74f614a" containerName="mariadb-database-create" Dec 05 12:11:21 crc kubenswrapper[4763]: E1205 12:11:21.727527 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab9820ac-9442-45a8-9407-d2abab068843" containerName="mariadb-account-create-update" Dec 05 
Dec 05 12:11:21 crc kubenswrapper[4763]: I1205 12:11:21.727536 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab9820ac-9442-45a8-9407-d2abab068843" containerName="mariadb-account-create-update"
Dec 05 12:11:21 crc kubenswrapper[4763]: E1205 12:11:21.727550 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ba97fe5-94ae-43d2-b059-524ead71f164" containerName="mariadb-database-create"
Dec 05 12:11:21 crc kubenswrapper[4763]: I1205 12:11:21.727558 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ba97fe5-94ae-43d2-b059-524ead71f164" containerName="mariadb-database-create"
Dec 05 12:11:21 crc kubenswrapper[4763]: I1205 12:11:21.727801 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c8be5bc-66ad-46a2-867d-965dc226273a" containerName="mariadb-database-create"
Dec 05 12:11:21 crc kubenswrapper[4763]: I1205 12:11:21.727819 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab9820ac-9442-45a8-9407-d2abab068843" containerName="mariadb-account-create-update"
Dec 05 12:11:21 crc kubenswrapper[4763]: I1205 12:11:21.727828 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="035ebd9d-2632-4e8c-9912-bf071d4a02e6" containerName="mariadb-account-create-update"
Dec 05 12:11:21 crc kubenswrapper[4763]: I1205 12:11:21.727846 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e08d98c-50ec-4dd9-a454-5cd1c19c4067" containerName="mariadb-account-create-update"
Dec 05 12:11:21 crc kubenswrapper[4763]: I1205 12:11:21.727858 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="aec4d7da-472f-449f-8ef2-0515e74f614a" containerName="mariadb-database-create"
Dec 05 12:11:21 crc kubenswrapper[4763]: I1205 12:11:21.727871 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ba97fe5-94ae-43d2-b059-524ead71f164" containerName="mariadb-database-create"
Dec 05 12:11:21 crc kubenswrapper[4763]: I1205 12:11:21.728683 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-kf95k"
Dec 05 12:11:21 crc kubenswrapper[4763]: I1205 12:11:21.731394 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Dec 05 12:11:21 crc kubenswrapper[4763]: I1205 12:11:21.731666 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-6prfx"
Dec 05 12:11:21 crc kubenswrapper[4763]: I1205 12:11:21.731862 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Dec 05 12:11:21 crc kubenswrapper[4763]: I1205 12:11:21.741349 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kf95k"]
Dec 05 12:11:21 crc kubenswrapper[4763]: I1205 12:11:21.815988 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6e35966-e928-4029-a553-d2624cbf0fd1-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-kf95k\" (UID: \"a6e35966-e928-4029-a553-d2624cbf0fd1\") " pod="openstack/nova-cell0-conductor-db-sync-kf95k"
Dec 05 12:11:21 crc kubenswrapper[4763]: I1205 12:11:21.816088 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6e35966-e928-4029-a553-d2624cbf0fd1-config-data\") pod \"nova-cell0-conductor-db-sync-kf95k\" (UID: \"a6e35966-e928-4029-a553-d2624cbf0fd1\") " pod="openstack/nova-cell0-conductor-db-sync-kf95k"
Dec 05 12:11:21 crc kubenswrapper[4763]: I1205 12:11:21.816236 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8cw4\" (UniqueName: \"kubernetes.io/projected/a6e35966-e928-4029-a553-d2624cbf0fd1-kube-api-access-h8cw4\") pod \"nova-cell0-conductor-db-sync-kf95k\" (UID: \"a6e35966-e928-4029-a553-d2624cbf0fd1\") " pod="openstack/nova-cell0-conductor-db-sync-kf95k"
Dec 05 12:11:21 crc kubenswrapper[4763]: I1205 12:11:21.816287 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6e35966-e928-4029-a553-d2624cbf0fd1-scripts\") pod \"nova-cell0-conductor-db-sync-kf95k\" (UID: \"a6e35966-e928-4029-a553-d2624cbf0fd1\") " pod="openstack/nova-cell0-conductor-db-sync-kf95k"
Dec 05 12:11:21 crc kubenswrapper[4763]: I1205 12:11:21.918438 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6e35966-e928-4029-a553-d2624cbf0fd1-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-kf95k\" (UID: \"a6e35966-e928-4029-a553-d2624cbf0fd1\") " pod="openstack/nova-cell0-conductor-db-sync-kf95k"
Dec 05 12:11:21 crc kubenswrapper[4763]: I1205 12:11:21.918501 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6e35966-e928-4029-a553-d2624cbf0fd1-config-data\") pod \"nova-cell0-conductor-db-sync-kf95k\" (UID: \"a6e35966-e928-4029-a553-d2624cbf0fd1\") " pod="openstack/nova-cell0-conductor-db-sync-kf95k"
Dec 05 12:11:21 crc kubenswrapper[4763]: I1205 12:11:21.918632 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8cw4\" (UniqueName: \"kubernetes.io/projected/a6e35966-e928-4029-a553-d2624cbf0fd1-kube-api-access-h8cw4\") pod \"nova-cell0-conductor-db-sync-kf95k\" (UID: \"a6e35966-e928-4029-a553-d2624cbf0fd1\") " pod="openstack/nova-cell0-conductor-db-sync-kf95k"
Dec 05 12:11:21 crc kubenswrapper[4763]: I1205 12:11:21.920017 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6e35966-e928-4029-a553-d2624cbf0fd1-scripts\") pod \"nova-cell0-conductor-db-sync-kf95k\" (UID: \"a6e35966-e928-4029-a553-d2624cbf0fd1\") " pod="openstack/nova-cell0-conductor-db-sync-kf95k"
Dec 05 12:11:21 crc kubenswrapper[4763]: I1205 12:11:21.932404 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6e35966-e928-4029-a553-d2624cbf0fd1-scripts\") pod \"nova-cell0-conductor-db-sync-kf95k\" (UID: \"a6e35966-e928-4029-a553-d2624cbf0fd1\") " pod="openstack/nova-cell0-conductor-db-sync-kf95k"
Dec 05 12:11:21 crc kubenswrapper[4763]: I1205 12:11:21.939033 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6e35966-e928-4029-a553-d2624cbf0fd1-config-data\") pod \"nova-cell0-conductor-db-sync-kf95k\" (UID: \"a6e35966-e928-4029-a553-d2624cbf0fd1\") " pod="openstack/nova-cell0-conductor-db-sync-kf95k"
Dec 05 12:11:21 crc kubenswrapper[4763]: I1205 12:11:21.944085 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8cw4\" (UniqueName: \"kubernetes.io/projected/a6e35966-e928-4029-a553-d2624cbf0fd1-kube-api-access-h8cw4\") pod \"nova-cell0-conductor-db-sync-kf95k\" (UID: \"a6e35966-e928-4029-a553-d2624cbf0fd1\") " pod="openstack/nova-cell0-conductor-db-sync-kf95k"
Dec 05 12:11:21 crc kubenswrapper[4763]: I1205 12:11:21.947322 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6e35966-e928-4029-a553-d2624cbf0fd1-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-kf95k\" (UID: \"a6e35966-e928-4029-a553-d2624cbf0fd1\") " pod="openstack/nova-cell0-conductor-db-sync-kf95k"
Dec 05 12:11:22 crc kubenswrapper[4763]: I1205 12:11:22.055628 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-kf95k"
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-kf95k" Dec 05 12:11:22 crc kubenswrapper[4763]: I1205 12:11:22.344310 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4aecdd42-0f16-4f9a-91d4-3512b21b3763","Type":"ContainerStarted","Data":"b5727ff14552797bde5c249f4c3a7241dcb5a4c59606a0859c55ae6d71c2727f"} Dec 05 12:11:22 crc kubenswrapper[4763]: I1205 12:11:22.614975 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kf95k"] Dec 05 12:11:23 crc kubenswrapper[4763]: I1205 12:11:23.359071 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4aecdd42-0f16-4f9a-91d4-3512b21b3763","Type":"ContainerStarted","Data":"c0c1cacff3942a673b3875097d5fda9fe60e1caafa3bf58392cc2c402541f016"} Dec 05 12:11:23 crc kubenswrapper[4763]: I1205 12:11:23.359527 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 12:11:23 crc kubenswrapper[4763]: I1205 12:11:23.359155 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4aecdd42-0f16-4f9a-91d4-3512b21b3763" containerName="ceilometer-central-agent" containerID="cri-o://5f3f7453d0c45f14c208cbc9f6364423283f1e45c4de1319f57567d584ccc7fd" gracePeriod=30 Dec 05 12:11:23 crc kubenswrapper[4763]: I1205 12:11:23.359616 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4aecdd42-0f16-4f9a-91d4-3512b21b3763" containerName="proxy-httpd" containerID="cri-o://c0c1cacff3942a673b3875097d5fda9fe60e1caafa3bf58392cc2c402541f016" gracePeriod=30 Dec 05 12:11:23 crc kubenswrapper[4763]: I1205 12:11:23.359665 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4aecdd42-0f16-4f9a-91d4-3512b21b3763" containerName="ceilometer-notification-agent" containerID="cri-o://bcdf7f43b9c4f2c24523d14abb30c7e2d9b225fd8a89565bfb100aae400014c8" gracePeriod=30 Dec 05 12:11:23 crc kubenswrapper[4763]: I1205 12:11:23.359715 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4aecdd42-0f16-4f9a-91d4-3512b21b3763" containerName="sg-core" containerID="cri-o://b5727ff14552797bde5c249f4c3a7241dcb5a4c59606a0859c55ae6d71c2727f" gracePeriod=30 Dec 05 12:11:23 crc kubenswrapper[4763]: I1205 12:11:23.370501 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-kf95k" event={"ID":"a6e35966-e928-4029-a553-d2624cbf0fd1","Type":"ContainerStarted","Data":"cb3a5184fb691278156fd366996ab5666c7d52fc9c869c6d1223809c571f4735"} Dec 05 12:11:23 crc kubenswrapper[4763]: I1205 12:11:23.388621 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.440237243 podStartE2EDuration="9.388595695s" podCreationTimestamp="2025-12-05 12:11:14 +0000 UTC" firstStartedPulling="2025-12-05 12:11:16.072154379 +0000 UTC m=+1360.564869102" lastFinishedPulling="2025-12-05 12:11:23.020512831 +0000 UTC m=+1367.513227554" observedRunningTime="2025-12-05 12:11:23.377833809 +0000 UTC m=+1367.870548542" watchObservedRunningTime="2025-12-05 12:11:23.388595695 +0000 UTC m=+1367.881310438" Dec 05 12:11:24 crc kubenswrapper[4763]: I1205 12:11:24.425203 4763 generic.go:334] "Generic (PLEG): container finished" podID="4aecdd42-0f16-4f9a-91d4-3512b21b3763" 
containerID="b5727ff14552797bde5c249f4c3a7241dcb5a4c59606a0859c55ae6d71c2727f" exitCode=2 Dec 05 12:11:24 crc kubenswrapper[4763]: I1205 12:11:24.425508 4763 generic.go:334] "Generic (PLEG): container finished" podID="4aecdd42-0f16-4f9a-91d4-3512b21b3763" containerID="bcdf7f43b9c4f2c24523d14abb30c7e2d9b225fd8a89565bfb100aae400014c8" exitCode=0 Dec 05 12:11:24 crc kubenswrapper[4763]: I1205 12:11:24.425403 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4aecdd42-0f16-4f9a-91d4-3512b21b3763","Type":"ContainerDied","Data":"b5727ff14552797bde5c249f4c3a7241dcb5a4c59606a0859c55ae6d71c2727f"} Dec 05 12:11:24 crc kubenswrapper[4763]: I1205 12:11:24.425547 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4aecdd42-0f16-4f9a-91d4-3512b21b3763","Type":"ContainerDied","Data":"bcdf7f43b9c4f2c24523d14abb30c7e2d9b225fd8a89565bfb100aae400014c8"} Dec 05 12:11:25 crc kubenswrapper[4763]: I1205 12:11:25.420894 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5c4499b47f-s4mh4" Dec 05 12:11:25 crc kubenswrapper[4763]: I1205 12:11:25.434395 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5c4499b47f-s4mh4" Dec 05 12:11:25 crc kubenswrapper[4763]: I1205 12:11:25.439332 4763 generic.go:334] "Generic (PLEG): container finished" podID="4aecdd42-0f16-4f9a-91d4-3512b21b3763" containerID="5f3f7453d0c45f14c208cbc9f6364423283f1e45c4de1319f57567d584ccc7fd" exitCode=0 Dec 05 12:11:25 crc kubenswrapper[4763]: I1205 12:11:25.440329 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4aecdd42-0f16-4f9a-91d4-3512b21b3763","Type":"ContainerDied","Data":"5f3f7453d0c45f14c208cbc9f6364423283f1e45c4de1319f57567d584ccc7fd"} Dec 05 12:11:26 crc kubenswrapper[4763]: I1205 12:11:26.020952 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 05 12:11:26 crc kubenswrapper[4763]: I1205 12:11:26.397322 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 12:11:26 crc kubenswrapper[4763]: I1205 12:11:26.397381 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 12:11:26 crc kubenswrapper[4763]: I1205 12:11:26.434139 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 12:11:26 crc kubenswrapper[4763]: I1205 12:11:26.444339 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 12:11:26 crc kubenswrapper[4763]: I1205 12:11:26.449028 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 12:11:26 crc kubenswrapper[4763]: I1205 12:11:26.449070 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 12:11:28 crc kubenswrapper[4763]: I1205 12:11:28.829530 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 05 12:11:28 crc kubenswrapper[4763]: I1205 12:11:28.830566 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 12:11:28 crc kubenswrapper[4763]: I1205 12:11:28.988537 4763 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 05 12:11:35 crc kubenswrapper[4763]: I1205 12:11:35.538637 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-kf95k" event={"ID":"a6e35966-e928-4029-a553-d2624cbf0fd1","Type":"ContainerStarted","Data":"cf929ad9c01927e617f8bb295262d53a07116145ca27fc231a08472471745adb"} Dec 05 12:11:35 crc kubenswrapper[4763]: I1205 12:11:35.566708 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-kf95k" podStartSLOduration=2.367659211 podStartE2EDuration="14.56666355s" podCreationTimestamp="2025-12-05 12:11:21 +0000 UTC" firstStartedPulling="2025-12-05 12:11:22.627115453 +0000 UTC m=+1367.119830176" lastFinishedPulling="2025-12-05 12:11:34.826119792 +0000 UTC m=+1379.318834515" observedRunningTime="2025-12-05 12:11:35.55578689 +0000 UTC m=+1380.048501623" watchObservedRunningTime="2025-12-05 12:11:35.56666355 +0000 UTC m=+1380.059378293" Dec 05 12:11:37 crc kubenswrapper[4763]: I1205 12:11:37.543721 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 12:11:37 crc kubenswrapper[4763]: I1205 12:11:37.544106 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 12:11:38 crc kubenswrapper[4763]: I1205 12:11:38.421401 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 05 12:11:38 crc kubenswrapper[4763]: I1205 12:11:38.421942 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c" containerName="watcher-decision-engine" containerID="cri-o://e8aa557f91a9fbdce97324156f041bf924b254be96744ed33a919ee6fa4b70b1" gracePeriod=30 Dec 05 12:11:43 crc kubenswrapper[4763]: I1205 12:11:43.635934 4763 generic.go:334] "Generic (PLEG): container finished" podID="423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c" containerID="e8aa557f91a9fbdce97324156f041bf924b254be96744ed33a919ee6fa4b70b1" exitCode=0 Dec 05 12:11:43 crc kubenswrapper[4763]: I1205 12:11:43.636025 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c","Type":"ContainerDied","Data":"e8aa557f91a9fbdce97324156f041bf924b254be96744ed33a919ee6fa4b70b1"} Dec 05 12:11:43 crc kubenswrapper[4763]: I1205 12:11:43.831285 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 05 12:11:43 crc kubenswrapper[4763]: I1205 12:11:43.976741 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rj7mh\" (UniqueName: \"kubernetes.io/projected/423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c-kube-api-access-rj7mh\") pod \"423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c\" (UID: \"423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c\") " Dec 05 12:11:43 crc kubenswrapper[4763]: I1205 12:11:43.976867 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c-custom-prometheus-ca\") pod \"423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c\" (UID: \"423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c\") " Dec 05 12:11:43 crc kubenswrapper[4763]: I1205 12:11:43.977021 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c-logs\") pod \"423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c\" (UID: \"423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c\") " Dec 05 12:11:43 crc kubenswrapper[4763]: I1205 12:11:43.977095 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c-config-data\") pod \"423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c\" (UID: \"423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c\") " Dec 05 12:11:43 crc kubenswrapper[4763]: I1205 12:11:43.977229 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c-combined-ca-bundle\") pod \"423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c\" (UID: \"423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c\") " Dec 05 12:11:43 crc kubenswrapper[4763]: I1205 12:11:43.977373 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c-logs" (OuterVolumeSpecName: "logs") pod "423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c" (UID: "423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:11:43 crc kubenswrapper[4763]: I1205 12:11:43.978592 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c-logs\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:43 crc kubenswrapper[4763]: I1205 12:11:43.984803 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c-kube-api-access-rj7mh" (OuterVolumeSpecName: "kube-api-access-rj7mh") pod "423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c" (UID: "423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c"). InnerVolumeSpecName "kube-api-access-rj7mh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:11:44 crc kubenswrapper[4763]: I1205 12:11:44.023029 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c" (UID: "423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:11:44 crc kubenswrapper[4763]: I1205 12:11:44.031145 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c" (UID: "423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:11:44 crc kubenswrapper[4763]: I1205 12:11:44.056280 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c-config-data" (OuterVolumeSpecName: "config-data") pod "423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c" (UID: "423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:11:44 crc kubenswrapper[4763]: I1205 12:11:44.080643 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rj7mh\" (UniqueName: \"kubernetes.io/projected/423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c-kube-api-access-rj7mh\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:44 crc kubenswrapper[4763]: I1205 12:11:44.080683 4763 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:44 crc kubenswrapper[4763]: I1205 12:11:44.080694 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:44 crc kubenswrapper[4763]: I1205 12:11:44.080706 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:44 crc kubenswrapper[4763]: I1205 12:11:44.647404 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c","Type":"ContainerDied","Data":"3b889b2787dcd87e2379b3e6711ad8f2860214ae4fd36e92f121a31a6402a3b9"} Dec 05 12:11:44 crc kubenswrapper[4763]: I1205 12:11:44.647460 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 05 12:11:44 crc kubenswrapper[4763]: I1205 12:11:44.647751 4763 scope.go:117] "RemoveContainer" containerID="e8aa557f91a9fbdce97324156f041bf924b254be96744ed33a919ee6fa4b70b1" Dec 05 12:11:44 crc kubenswrapper[4763]: I1205 12:11:44.681291 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 05 12:11:44 crc kubenswrapper[4763]: I1205 12:11:44.698897 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 05 12:11:44 crc kubenswrapper[4763]: I1205 12:11:44.707694 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 05 12:11:44 crc kubenswrapper[4763]: E1205 12:11:44.708162 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c" containerName="watcher-decision-engine" Dec 05 12:11:44 crc kubenswrapper[4763]: I1205 12:11:44.708179 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c" containerName="watcher-decision-engine" Dec 05 12:11:44 crc kubenswrapper[4763]: I1205 12:11:44.708352 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c" containerName="watcher-decision-engine" Dec 05 12:11:44 crc kubenswrapper[4763]: I1205 12:11:44.708998 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 05 12:11:44 crc kubenswrapper[4763]: I1205 12:11:44.712304 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Dec 05 12:11:44 crc kubenswrapper[4763]: I1205 12:11:44.727962 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 05 12:11:44 crc kubenswrapper[4763]: I1205 12:11:44.795809 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/819c6a72-3e2a-4445-8abf-1a10f8eaab9b-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"819c6a72-3e2a-4445-8abf-1a10f8eaab9b\") " pod="openstack/watcher-decision-engine-0" Dec 05 12:11:44 crc kubenswrapper[4763]: I1205 12:11:44.795851 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/819c6a72-3e2a-4445-8abf-1a10f8eaab9b-config-data\") pod \"watcher-decision-engine-0\" (UID: \"819c6a72-3e2a-4445-8abf-1a10f8eaab9b\") " pod="openstack/watcher-decision-engine-0" Dec 05 12:11:44 crc kubenswrapper[4763]: I1205 12:11:44.796047 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/819c6a72-3e2a-4445-8abf-1a10f8eaab9b-logs\") pod \"watcher-decision-engine-0\" (UID: \"819c6a72-3e2a-4445-8abf-1a10f8eaab9b\") " pod="openstack/watcher-decision-engine-0" Dec 05 12:11:44 crc kubenswrapper[4763]: I1205 12:11:44.796135 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/819c6a72-3e2a-4445-8abf-1a10f8eaab9b-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"819c6a72-3e2a-4445-8abf-1a10f8eaab9b\") " pod="openstack/watcher-decision-engine-0" Dec 05 12:11:44 crc kubenswrapper[4763]: I1205 12:11:44.796310 4763 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dshpv\" (UniqueName: \"kubernetes.io/projected/819c6a72-3e2a-4445-8abf-1a10f8eaab9b-kube-api-access-dshpv\") pod \"watcher-decision-engine-0\" (UID: \"819c6a72-3e2a-4445-8abf-1a10f8eaab9b\") " pod="openstack/watcher-decision-engine-0" Dec 05 12:11:44 crc kubenswrapper[4763]: I1205 12:11:44.855112 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="4aecdd42-0f16-4f9a-91d4-3512b21b3763" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 05 12:11:44 crc kubenswrapper[4763]: I1205 12:11:44.898019 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/819c6a72-3e2a-4445-8abf-1a10f8eaab9b-logs\") pod \"watcher-decision-engine-0\" (UID: \"819c6a72-3e2a-4445-8abf-1a10f8eaab9b\") " pod="openstack/watcher-decision-engine-0" Dec 05 12:11:44 crc kubenswrapper[4763]: I1205 12:11:44.898096 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/819c6a72-3e2a-4445-8abf-1a10f8eaab9b-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"819c6a72-3e2a-4445-8abf-1a10f8eaab9b\") " pod="openstack/watcher-decision-engine-0" Dec 05 12:11:44 crc kubenswrapper[4763]: I1205 12:11:44.898173 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dshpv\" (UniqueName: \"kubernetes.io/projected/819c6a72-3e2a-4445-8abf-1a10f8eaab9b-kube-api-access-dshpv\") pod \"watcher-decision-engine-0\" (UID: \"819c6a72-3e2a-4445-8abf-1a10f8eaab9b\") " pod="openstack/watcher-decision-engine-0" Dec 05 12:11:44 crc kubenswrapper[4763]: I1205 12:11:44.898230 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/819c6a72-3e2a-4445-8abf-1a10f8eaab9b-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"819c6a72-3e2a-4445-8abf-1a10f8eaab9b\") " pod="openstack/watcher-decision-engine-0" Dec 05 12:11:44 crc kubenswrapper[4763]: I1205 12:11:44.898247 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/819c6a72-3e2a-4445-8abf-1a10f8eaab9b-config-data\") pod \"watcher-decision-engine-0\" (UID: \"819c6a72-3e2a-4445-8abf-1a10f8eaab9b\") " pod="openstack/watcher-decision-engine-0" Dec 05 12:11:44 crc kubenswrapper[4763]: I1205 12:11:44.899130 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/819c6a72-3e2a-4445-8abf-1a10f8eaab9b-logs\") pod \"watcher-decision-engine-0\" (UID: \"819c6a72-3e2a-4445-8abf-1a10f8eaab9b\") " pod="openstack/watcher-decision-engine-0" Dec 05 12:11:44 crc kubenswrapper[4763]: I1205 12:11:44.903099 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/819c6a72-3e2a-4445-8abf-1a10f8eaab9b-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"819c6a72-3e2a-4445-8abf-1a10f8eaab9b\") " pod="openstack/watcher-decision-engine-0" Dec 05 12:11:44 crc kubenswrapper[4763]: I1205 12:11:44.909408 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/819c6a72-3e2a-4445-8abf-1a10f8eaab9b-combined-ca-bundle\") pod 
\"watcher-decision-engine-0\" (UID: \"819c6a72-3e2a-4445-8abf-1a10f8eaab9b\") " pod="openstack/watcher-decision-engine-0" Dec 05 12:11:44 crc kubenswrapper[4763]: I1205 12:11:44.909465 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/819c6a72-3e2a-4445-8abf-1a10f8eaab9b-config-data\") pod \"watcher-decision-engine-0\" (UID: \"819c6a72-3e2a-4445-8abf-1a10f8eaab9b\") " pod="openstack/watcher-decision-engine-0" Dec 05 12:11:44 crc kubenswrapper[4763]: I1205 12:11:44.930037 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dshpv\" (UniqueName: \"kubernetes.io/projected/819c6a72-3e2a-4445-8abf-1a10f8eaab9b-kube-api-access-dshpv\") pod \"watcher-decision-engine-0\" (UID: \"819c6a72-3e2a-4445-8abf-1a10f8eaab9b\") " pod="openstack/watcher-decision-engine-0" Dec 05 12:11:45 crc kubenswrapper[4763]: I1205 12:11:45.030323 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 05 12:11:45 crc kubenswrapper[4763]: I1205 12:11:45.529938 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 05 12:11:45 crc kubenswrapper[4763]: I1205 12:11:45.659183 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"819c6a72-3e2a-4445-8abf-1a10f8eaab9b","Type":"ContainerStarted","Data":"5e89eb8cb8fd3c2ee4d8726a7c3b8e91af2514495308f840422429c7563a8234"} Dec 05 12:11:45 crc kubenswrapper[4763]: I1205 12:11:45.757610 4763 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod4ba0cbf0-3e4e-4cb0-82b0-179d11937330"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod4ba0cbf0-3e4e-4cb0-82b0-179d11937330] : Timed out while waiting for systemd to remove kubepods-besteffort-pod4ba0cbf0_3e4e_4cb0_82b0_179d11937330.slice" Dec 05 12:11:45 crc kubenswrapper[4763]: E1205 12:11:45.757688 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod4ba0cbf0-3e4e-4cb0-82b0-179d11937330] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod4ba0cbf0-3e4e-4cb0-82b0-179d11937330] : Timed out while waiting for systemd to remove kubepods-besteffort-pod4ba0cbf0_3e4e_4cb0_82b0_179d11937330.slice" pod="openstack/horizon-7dc58bc884-khdbv" podUID="4ba0cbf0-3e4e-4cb0-82b0-179d11937330" Dec 05 12:11:45 crc kubenswrapper[4763]: I1205 12:11:45.815036 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c" path="/var/lib/kubelet/pods/423e9a9a-b5eb-40f7-9d66-996d9ca4fd1c/volumes" Dec 05 12:11:45 crc kubenswrapper[4763]: I1205 12:11:45.917860 4763 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod06f0e50f-35b7-441d-a630-0655b4c1cd00"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod06f0e50f-35b7-441d-a630-0655b4c1cd00] : Timed out while waiting for systemd to remove kubepods-besteffort-pod06f0e50f_35b7_441d_a630_0655b4c1cd00.slice" Dec 05 12:11:45 crc kubenswrapper[4763]: E1205 12:11:45.917925 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod06f0e50f-35b7-441d-a630-0655b4c1cd00] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod06f0e50f-35b7-441d-a630-0655b4c1cd00] : Timed out while waiting for systemd to 
remove kubepods-besteffort-pod06f0e50f_35b7_441d_a630_0655b4c1cd00.slice" pod="openstack/glance-default-external-api-0" podUID="06f0e50f-35b7-441d-a630-0655b4c1cd00" Dec 05 12:11:46 crc kubenswrapper[4763]: I1205 12:11:46.668451 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"819c6a72-3e2a-4445-8abf-1a10f8eaab9b","Type":"ContainerStarted","Data":"da2cf7f696eaec62314bcb00f9e4d77ca943c09e0dbe0091a544a9046cb4ff22"} Dec 05 12:11:46 crc kubenswrapper[4763]: I1205 12:11:46.672405 4763 generic.go:334] "Generic (PLEG): container finished" podID="a6e35966-e928-4029-a553-d2624cbf0fd1" containerID="cf929ad9c01927e617f8bb295262d53a07116145ca27fc231a08472471745adb" exitCode=0 Dec 05 12:11:46 crc kubenswrapper[4763]: I1205 12:11:46.672483 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 12:11:46 crc kubenswrapper[4763]: I1205 12:11:46.672473 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-kf95k" event={"ID":"a6e35966-e928-4029-a553-d2624cbf0fd1","Type":"ContainerDied","Data":"cf929ad9c01927e617f8bb295262d53a07116145ca27fc231a08472471745adb"} Dec 05 12:11:46 crc kubenswrapper[4763]: I1205 12:11:46.672602 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7dc58bc884-khdbv" Dec 05 12:11:46 crc kubenswrapper[4763]: I1205 12:11:46.686468 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.686448641 podStartE2EDuration="2.686448641s" podCreationTimestamp="2025-12-05 12:11:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:11:46.685333015 +0000 UTC m=+1391.178047738" watchObservedRunningTime="2025-12-05 12:11:46.686448641 +0000 UTC m=+1391.179163364" Dec 05 12:11:46 crc kubenswrapper[4763]: I1205 12:11:46.722842 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7dc58bc884-khdbv"] Dec 05 12:11:46 crc kubenswrapper[4763]: I1205 12:11:46.755200 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7dc58bc884-khdbv"] Dec 05 12:11:46 crc kubenswrapper[4763]: I1205 12:11:46.816610 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 12:11:46 crc kubenswrapper[4763]: I1205 12:11:46.841227 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 12:11:46 crc kubenswrapper[4763]: I1205 12:11:46.854797 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 12:11:46 crc kubenswrapper[4763]: I1205 12:11:46.856564 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 12:11:46 crc kubenswrapper[4763]: I1205 12:11:46.864223 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 05 12:11:46 crc kubenswrapper[4763]: I1205 12:11:46.864525 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 05 12:11:46 crc kubenswrapper[4763]: I1205 12:11:46.868459 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 12:11:47 crc kubenswrapper[4763]: I1205 12:11:47.043508 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shcj8\" (UniqueName: \"kubernetes.io/projected/29573883-0e6d-40b3-9a6f-39308d6db246-kube-api-access-shcj8\") pod \"glance-default-external-api-0\" (UID: \"29573883-0e6d-40b3-9a6f-39308d6db246\") " pod="openstack/glance-default-external-api-0" Dec 05 12:11:47 crc kubenswrapper[4763]: I1205 12:11:47.043861 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29573883-0e6d-40b3-9a6f-39308d6db246-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"29573883-0e6d-40b3-9a6f-39308d6db246\") " pod="openstack/glance-default-external-api-0" Dec 05 12:11:47 crc kubenswrapper[4763]: I1205 12:11:47.043907 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29573883-0e6d-40b3-9a6f-39308d6db246-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"29573883-0e6d-40b3-9a6f-39308d6db246\") " pod="openstack/glance-default-external-api-0" Dec 05 12:11:47 crc kubenswrapper[4763]: I1205 12:11:47.044066 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/29573883-0e6d-40b3-9a6f-39308d6db246-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"29573883-0e6d-40b3-9a6f-39308d6db246\") " pod="openstack/glance-default-external-api-0" Dec 05 12:11:47 crc kubenswrapper[4763]: I1205 12:11:47.044152 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29573883-0e6d-40b3-9a6f-39308d6db246-config-data\") pod \"glance-default-external-api-0\" (UID: \"29573883-0e6d-40b3-9a6f-39308d6db246\") " pod="openstack/glance-default-external-api-0" Dec 05 12:11:47 crc kubenswrapper[4763]: I1205 12:11:47.044237 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29573883-0e6d-40b3-9a6f-39308d6db246-scripts\") pod \"glance-default-external-api-0\" (UID: \"29573883-0e6d-40b3-9a6f-39308d6db246\") " pod="openstack/glance-default-external-api-0" Dec 05 12:11:47 crc kubenswrapper[4763]: I1205 12:11:47.044365 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29573883-0e6d-40b3-9a6f-39308d6db246-logs\") pod \"glance-default-external-api-0\" (UID: \"29573883-0e6d-40b3-9a6f-39308d6db246\") " pod="openstack/glance-default-external-api-0" Dec 05 12:11:47 crc kubenswrapper[4763]: I1205 12:11:47.044500 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"29573883-0e6d-40b3-9a6f-39308d6db246\") " pod="openstack/glance-default-external-api-0" Dec 05 12:11:47 crc kubenswrapper[4763]: I1205 12:11:47.146502 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shcj8\" (UniqueName: \"kubernetes.io/projected/29573883-0e6d-40b3-9a6f-39308d6db246-kube-api-access-shcj8\") pod \"glance-default-external-api-0\" (UID: \"29573883-0e6d-40b3-9a6f-39308d6db246\") " pod="openstack/glance-default-external-api-0" Dec 05 12:11:47 crc kubenswrapper[4763]: I1205 12:11:47.146573 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29573883-0e6d-40b3-9a6f-39308d6db246-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"29573883-0e6d-40b3-9a6f-39308d6db246\") " pod="openstack/glance-default-external-api-0" Dec 05 12:11:47 crc kubenswrapper[4763]: I1205 12:11:47.146618 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29573883-0e6d-40b3-9a6f-39308d6db246-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"29573883-0e6d-40b3-9a6f-39308d6db246\") " pod="openstack/glance-default-external-api-0" Dec 05 12:11:47 crc kubenswrapper[4763]: I1205 12:11:47.146650 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/29573883-0e6d-40b3-9a6f-39308d6db246-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"29573883-0e6d-40b3-9a6f-39308d6db246\") " pod="openstack/glance-default-external-api-0" Dec 05 12:11:47 crc kubenswrapper[4763]: I1205 12:11:47.146702 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29573883-0e6d-40b3-9a6f-39308d6db246-config-data\") pod \"glance-default-external-api-0\" (UID: \"29573883-0e6d-40b3-9a6f-39308d6db246\") " pod="openstack/glance-default-external-api-0" Dec 05 12:11:47 crc kubenswrapper[4763]: I1205 12:11:47.146739 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29573883-0e6d-40b3-9a6f-39308d6db246-scripts\") pod \"glance-default-external-api-0\" (UID: \"29573883-0e6d-40b3-9a6f-39308d6db246\") " pod="openstack/glance-default-external-api-0" Dec 05 12:11:47 crc kubenswrapper[4763]: I1205 12:11:47.146801 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29573883-0e6d-40b3-9a6f-39308d6db246-logs\") pod \"glance-default-external-api-0\" (UID: \"29573883-0e6d-40b3-9a6f-39308d6db246\") " pod="openstack/glance-default-external-api-0" Dec 05 12:11:47 crc kubenswrapper[4763]: I1205 12:11:47.146856 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"29573883-0e6d-40b3-9a6f-39308d6db246\") " pod="openstack/glance-default-external-api-0" Dec 05 12:11:47 crc kubenswrapper[4763]: I1205 12:11:47.147343 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"29573883-0e6d-40b3-9a6f-39308d6db246\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Dec 05 12:11:47 crc kubenswrapper[4763]: I1205 12:11:47.147386 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29573883-0e6d-40b3-9a6f-39308d6db246-logs\") pod \"glance-default-external-api-0\" (UID: \"29573883-0e6d-40b3-9a6f-39308d6db246\") " pod="openstack/glance-default-external-api-0" Dec 05 12:11:47 crc kubenswrapper[4763]: I1205 12:11:47.147350 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/29573883-0e6d-40b3-9a6f-39308d6db246-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"29573883-0e6d-40b3-9a6f-39308d6db246\") " pod="openstack/glance-default-external-api-0" Dec 05 12:11:47 crc kubenswrapper[4763]: I1205 12:11:47.153490 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29573883-0e6d-40b3-9a6f-39308d6db246-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"29573883-0e6d-40b3-9a6f-39308d6db246\") " pod="openstack/glance-default-external-api-0" Dec 05 12:11:47 crc kubenswrapper[4763]: I1205 12:11:47.154400 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29573883-0e6d-40b3-9a6f-39308d6db246-config-data\") pod \"glance-default-external-api-0\" (UID: \"29573883-0e6d-40b3-9a6f-39308d6db246\") " pod="openstack/glance-default-external-api-0" Dec 05 12:11:47 crc kubenswrapper[4763]: I1205 12:11:47.154438 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29573883-0e6d-40b3-9a6f-39308d6db246-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"29573883-0e6d-40b3-9a6f-39308d6db246\") " pod="openstack/glance-default-external-api-0" Dec 05 12:11:47 crc kubenswrapper[4763]: I1205 12:11:47.161545 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29573883-0e6d-40b3-9a6f-39308d6db246-scripts\") pod \"glance-default-external-api-0\" (UID: \"29573883-0e6d-40b3-9a6f-39308d6db246\") " pod="openstack/glance-default-external-api-0" Dec 05 12:11:47 crc kubenswrapper[4763]: I1205 12:11:47.167822 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shcj8\" (UniqueName: \"kubernetes.io/projected/29573883-0e6d-40b3-9a6f-39308d6db246-kube-api-access-shcj8\") pod \"glance-default-external-api-0\" (UID: \"29573883-0e6d-40b3-9a6f-39308d6db246\") " pod="openstack/glance-default-external-api-0" Dec 05 12:11:47 crc kubenswrapper[4763]: I1205 12:11:47.233968 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"29573883-0e6d-40b3-9a6f-39308d6db246\") " pod="openstack/glance-default-external-api-0" Dec 05 12:11:47 crc kubenswrapper[4763]: I1205 12:11:47.489933 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 12:11:47 crc kubenswrapper[4763]: I1205 12:11:47.797077 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06f0e50f-35b7-441d-a630-0655b4c1cd00" path="/var/lib/kubelet/pods/06f0e50f-35b7-441d-a630-0655b4c1cd00/volumes" Dec 05 12:11:47 crc kubenswrapper[4763]: I1205 12:11:47.798112 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ba0cbf0-3e4e-4cb0-82b0-179d11937330" path="/var/lib/kubelet/pods/4ba0cbf0-3e4e-4cb0-82b0-179d11937330/volumes" Dec 05 12:11:47 crc kubenswrapper[4763]: I1205 12:11:47.971794 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-kf95k" Dec 05 12:11:48 crc kubenswrapper[4763]: I1205 12:11:48.063674 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6e35966-e928-4029-a553-d2624cbf0fd1-config-data\") pod \"a6e35966-e928-4029-a553-d2624cbf0fd1\" (UID: \"a6e35966-e928-4029-a553-d2624cbf0fd1\") " Dec 05 12:11:48 crc kubenswrapper[4763]: I1205 12:11:48.063838 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6e35966-e928-4029-a553-d2624cbf0fd1-combined-ca-bundle\") pod \"a6e35966-e928-4029-a553-d2624cbf0fd1\" (UID: \"a6e35966-e928-4029-a553-d2624cbf0fd1\") " Dec 05 12:11:48 crc kubenswrapper[4763]: I1205 12:11:48.063950 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8cw4\" (UniqueName: \"kubernetes.io/projected/a6e35966-e928-4029-a553-d2624cbf0fd1-kube-api-access-h8cw4\") pod \"a6e35966-e928-4029-a553-d2624cbf0fd1\" (UID: \"a6e35966-e928-4029-a553-d2624cbf0fd1\") " Dec 05 12:11:48 crc kubenswrapper[4763]: I1205 12:11:48.064036 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6e35966-e928-4029-a553-d2624cbf0fd1-scripts\") pod \"a6e35966-e928-4029-a553-d2624cbf0fd1\" (UID: \"a6e35966-e928-4029-a553-d2624cbf0fd1\") " Dec 05 12:11:48 crc kubenswrapper[4763]: I1205 12:11:48.070214 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6e35966-e928-4029-a553-d2624cbf0fd1-kube-api-access-h8cw4" (OuterVolumeSpecName: "kube-api-access-h8cw4") pod "a6e35966-e928-4029-a553-d2624cbf0fd1" (UID: "a6e35966-e928-4029-a553-d2624cbf0fd1"). InnerVolumeSpecName "kube-api-access-h8cw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:11:48 crc kubenswrapper[4763]: I1205 12:11:48.071951 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6e35966-e928-4029-a553-d2624cbf0fd1-scripts" (OuterVolumeSpecName: "scripts") pod "a6e35966-e928-4029-a553-d2624cbf0fd1" (UID: "a6e35966-e928-4029-a553-d2624cbf0fd1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:11:48 crc kubenswrapper[4763]: I1205 12:11:48.085553 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 12:11:48 crc kubenswrapper[4763]: I1205 12:11:48.104788 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6e35966-e928-4029-a553-d2624cbf0fd1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a6e35966-e928-4029-a553-d2624cbf0fd1" (UID: "a6e35966-e928-4029-a553-d2624cbf0fd1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:11:48 crc kubenswrapper[4763]: I1205 12:11:48.116737 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6e35966-e928-4029-a553-d2624cbf0fd1-config-data" (OuterVolumeSpecName: "config-data") pod "a6e35966-e928-4029-a553-d2624cbf0fd1" (UID: "a6e35966-e928-4029-a553-d2624cbf0fd1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:11:48 crc kubenswrapper[4763]: I1205 12:11:48.166357 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6e35966-e928-4029-a553-d2624cbf0fd1-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:48 crc kubenswrapper[4763]: I1205 12:11:48.166396 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6e35966-e928-4029-a553-d2624cbf0fd1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:48 crc kubenswrapper[4763]: I1205 12:11:48.166410 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8cw4\" (UniqueName: \"kubernetes.io/projected/a6e35966-e928-4029-a553-d2624cbf0fd1-kube-api-access-h8cw4\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:48 crc kubenswrapper[4763]: I1205 12:11:48.166423 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6e35966-e928-4029-a553-d2624cbf0fd1-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:48 crc kubenswrapper[4763]: I1205 12:11:48.714447 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"29573883-0e6d-40b3-9a6f-39308d6db246","Type":"ContainerStarted","Data":"2647b58beb0ac306e46b1c6e665597b846a903bb293e1754fda00ffc72ee5ec2"} Dec 05 12:11:48 crc kubenswrapper[4763]: I1205 12:11:48.714818 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"29573883-0e6d-40b3-9a6f-39308d6db246","Type":"ContainerStarted","Data":"9f17b2a528b36a3c20357c66fbd4bded644d00cfa77fd46a6df441b97f10ce6a"} Dec 05 12:11:48 crc kubenswrapper[4763]: I1205 12:11:48.717402 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-kf95k" event={"ID":"a6e35966-e928-4029-a553-d2624cbf0fd1","Type":"ContainerDied","Data":"cb3a5184fb691278156fd366996ab5666c7d52fc9c869c6d1223809c571f4735"} Dec 05 12:11:48 crc kubenswrapper[4763]: I1205 12:11:48.717449 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb3a5184fb691278156fd366996ab5666c7d52fc9c869c6d1223809c571f4735" Dec 05 12:11:48 crc kubenswrapper[4763]: I1205 12:11:48.717558 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-kf95k" Dec 05 12:11:48 crc kubenswrapper[4763]: I1205 12:11:48.849655 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 12:11:48 crc kubenswrapper[4763]: E1205 12:11:48.850401 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6e35966-e928-4029-a553-d2624cbf0fd1" containerName="nova-cell0-conductor-db-sync" Dec 05 12:11:48 crc kubenswrapper[4763]: I1205 12:11:48.850420 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6e35966-e928-4029-a553-d2624cbf0fd1" containerName="nova-cell0-conductor-db-sync" Dec 05 12:11:48 crc kubenswrapper[4763]: I1205 12:11:48.852317 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6e35966-e928-4029-a553-d2624cbf0fd1" containerName="nova-cell0-conductor-db-sync" Dec 05 12:11:48 crc kubenswrapper[4763]: I1205 12:11:48.853154 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 05 12:11:48 crc kubenswrapper[4763]: I1205 12:11:48.856252 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-6prfx" Dec 05 12:11:48 crc kubenswrapper[4763]: I1205 12:11:48.856505 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 05 12:11:48 crc kubenswrapper[4763]: I1205 12:11:48.861519 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 12:11:48 crc kubenswrapper[4763]: I1205 12:11:48.989257 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb41c12-dade-4239-b67b-742df1922c22-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"cfb41c12-dade-4239-b67b-742df1922c22\") " pod="openstack/nova-cell0-conductor-0" Dec 05 12:11:48 crc kubenswrapper[4763]: I1205 12:11:48.989502 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb41c12-dade-4239-b67b-742df1922c22-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"cfb41c12-dade-4239-b67b-742df1922c22\") " pod="openstack/nova-cell0-conductor-0" Dec 05 12:11:48 crc kubenswrapper[4763]: I1205 12:11:48.989616 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjll4\" (UniqueName: \"kubernetes.io/projected/cfb41c12-dade-4239-b67b-742df1922c22-kube-api-access-xjll4\") pod \"nova-cell0-conductor-0\" (UID: \"cfb41c12-dade-4239-b67b-742df1922c22\") " pod="openstack/nova-cell0-conductor-0" Dec 05 12:11:49 crc kubenswrapper[4763]: I1205 12:11:49.091535 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb41c12-dade-4239-b67b-742df1922c22-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"cfb41c12-dade-4239-b67b-742df1922c22\") " pod="openstack/nova-cell0-conductor-0" Dec 05 12:11:49 crc kubenswrapper[4763]: I1205 12:11:49.091595 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb41c12-dade-4239-b67b-742df1922c22-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"cfb41c12-dade-4239-b67b-742df1922c22\") " pod="openstack/nova-cell0-conductor-0" Dec 05 12:11:49 crc kubenswrapper[4763]: I1205 
12:11:49.091622 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjll4\" (UniqueName: \"kubernetes.io/projected/cfb41c12-dade-4239-b67b-742df1922c22-kube-api-access-xjll4\") pod \"nova-cell0-conductor-0\" (UID: \"cfb41c12-dade-4239-b67b-742df1922c22\") " pod="openstack/nova-cell0-conductor-0" Dec 05 12:11:49 crc kubenswrapper[4763]: I1205 12:11:49.100166 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb41c12-dade-4239-b67b-742df1922c22-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"cfb41c12-dade-4239-b67b-742df1922c22\") " pod="openstack/nova-cell0-conductor-0" Dec 05 12:11:49 crc kubenswrapper[4763]: I1205 12:11:49.100397 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb41c12-dade-4239-b67b-742df1922c22-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"cfb41c12-dade-4239-b67b-742df1922c22\") " pod="openstack/nova-cell0-conductor-0" Dec 05 12:11:49 crc kubenswrapper[4763]: I1205 12:11:49.110305 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjll4\" (UniqueName: \"kubernetes.io/projected/cfb41c12-dade-4239-b67b-742df1922c22-kube-api-access-xjll4\") pod \"nova-cell0-conductor-0\" (UID: \"cfb41c12-dade-4239-b67b-742df1922c22\") " pod="openstack/nova-cell0-conductor-0" Dec 05 12:11:49 crc kubenswrapper[4763]: I1205 12:11:49.195321 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 05 12:11:49 crc kubenswrapper[4763]: I1205 12:11:49.730316 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"29573883-0e6d-40b3-9a6f-39308d6db246","Type":"ContainerStarted","Data":"55b130395368616ad3314826e2487f5a353c2e20bcb19bbe4285189b6a1f655b"} Dec 05 12:11:49 crc kubenswrapper[4763]: I1205 12:11:49.747934 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 12:11:49 crc kubenswrapper[4763]: I1205 12:11:49.767284 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.767264227 podStartE2EDuration="3.767264227s" podCreationTimestamp="2025-12-05 12:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:11:49.762738052 +0000 UTC m=+1394.255452805" watchObservedRunningTime="2025-12-05 12:11:49.767264227 +0000 UTC m=+1394.259978940" Dec 05 12:11:51 crc kubenswrapper[4763]: I1205 12:11:51.566005 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"cfb41c12-dade-4239-b67b-742df1922c22","Type":"ContainerStarted","Data":"7741c6f62d667ab5fe3d8ba5897f2b55af40da80a5676c60121c34d0a2d911bf"} Dec 05 12:11:51 crc kubenswrapper[4763]: I1205 12:11:51.566877 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 05 12:11:51 crc kubenswrapper[4763]: I1205 12:11:51.566914 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"cfb41c12-dade-4239-b67b-742df1922c22","Type":"ContainerStarted","Data":"5f2d1cc02251f6c073ea3a85e88de9e769b4462572283892ec149e09e78273f8"} Dec 05 12:11:51 crc kubenswrapper[4763]: I1205 12:11:51.591828 4763 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=3.591812576 podStartE2EDuration="3.591812576s" podCreationTimestamp="2025-12-05 12:11:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:11:51.587519799 +0000 UTC m=+1396.080234532" watchObservedRunningTime="2025-12-05 12:11:51.591812576 +0000 UTC m=+1396.084527299" Dec 05 12:11:53 crc kubenswrapper[4763]: I1205 12:11:53.588559 4763 generic.go:334] "Generic (PLEG): container finished" podID="4aecdd42-0f16-4f9a-91d4-3512b21b3763" containerID="c0c1cacff3942a673b3875097d5fda9fe60e1caafa3bf58392cc2c402541f016" exitCode=137 Dec 05 12:11:53 crc kubenswrapper[4763]: I1205 12:11:53.588637 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4aecdd42-0f16-4f9a-91d4-3512b21b3763","Type":"ContainerDied","Data":"c0c1cacff3942a673b3875097d5fda9fe60e1caafa3bf58392cc2c402541f016"} Dec 05 12:11:53 crc kubenswrapper[4763]: I1205 12:11:53.811097 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 12:11:53 crc kubenswrapper[4763]: I1205 12:11:53.963476 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4aecdd42-0f16-4f9a-91d4-3512b21b3763-combined-ca-bundle\") pod \"4aecdd42-0f16-4f9a-91d4-3512b21b3763\" (UID: \"4aecdd42-0f16-4f9a-91d4-3512b21b3763\") " Dec 05 12:11:53 crc kubenswrapper[4763]: I1205 12:11:53.963560 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4aecdd42-0f16-4f9a-91d4-3512b21b3763-log-httpd\") pod \"4aecdd42-0f16-4f9a-91d4-3512b21b3763\" (UID: \"4aecdd42-0f16-4f9a-91d4-3512b21b3763\") " Dec 05 12:11:53 crc kubenswrapper[4763]: I1205 12:11:53.963585 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4aecdd42-0f16-4f9a-91d4-3512b21b3763-run-httpd\") pod \"4aecdd42-0f16-4f9a-91d4-3512b21b3763\" (UID: \"4aecdd42-0f16-4f9a-91d4-3512b21b3763\") " Dec 05 12:11:53 crc kubenswrapper[4763]: I1205 12:11:53.963653 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4aecdd42-0f16-4f9a-91d4-3512b21b3763-scripts\") pod \"4aecdd42-0f16-4f9a-91d4-3512b21b3763\" (UID: \"4aecdd42-0f16-4f9a-91d4-3512b21b3763\") " Dec 05 12:11:53 crc kubenswrapper[4763]: I1205 12:11:53.963676 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbmhs\" (UniqueName: \"kubernetes.io/projected/4aecdd42-0f16-4f9a-91d4-3512b21b3763-kube-api-access-mbmhs\") pod \"4aecdd42-0f16-4f9a-91d4-3512b21b3763\" (UID: \"4aecdd42-0f16-4f9a-91d4-3512b21b3763\") " Dec 05 12:11:53 crc kubenswrapper[4763]: I1205 12:11:53.963800 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4aecdd42-0f16-4f9a-91d4-3512b21b3763-sg-core-conf-yaml\") pod \"4aecdd42-0f16-4f9a-91d4-3512b21b3763\" (UID: \"4aecdd42-0f16-4f9a-91d4-3512b21b3763\") " Dec 05 12:11:53 crc kubenswrapper[4763]: I1205 12:11:53.963822 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4aecdd42-0f16-4f9a-91d4-3512b21b3763-config-data\") pod \"4aecdd42-0f16-4f9a-91d4-3512b21b3763\" (UID: \"4aecdd42-0f16-4f9a-91d4-3512b21b3763\") " Dec 05 12:11:53 crc kubenswrapper[4763]: I1205 12:11:53.965130 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4aecdd42-0f16-4f9a-91d4-3512b21b3763-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4aecdd42-0f16-4f9a-91d4-3512b21b3763" (UID: "4aecdd42-0f16-4f9a-91d4-3512b21b3763"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:11:53 crc kubenswrapper[4763]: I1205 12:11:53.965507 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4aecdd42-0f16-4f9a-91d4-3512b21b3763-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4aecdd42-0f16-4f9a-91d4-3512b21b3763" (UID: "4aecdd42-0f16-4f9a-91d4-3512b21b3763"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:11:53 crc kubenswrapper[4763]: I1205 12:11:53.971229 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4aecdd42-0f16-4f9a-91d4-3512b21b3763-kube-api-access-mbmhs" (OuterVolumeSpecName: "kube-api-access-mbmhs") pod "4aecdd42-0f16-4f9a-91d4-3512b21b3763" (UID: "4aecdd42-0f16-4f9a-91d4-3512b21b3763"). InnerVolumeSpecName "kube-api-access-mbmhs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:11:53 crc kubenswrapper[4763]: I1205 12:11:53.972185 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4aecdd42-0f16-4f9a-91d4-3512b21b3763-scripts" (OuterVolumeSpecName: "scripts") pod "4aecdd42-0f16-4f9a-91d4-3512b21b3763" (UID: "4aecdd42-0f16-4f9a-91d4-3512b21b3763"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:11:53 crc kubenswrapper[4763]: I1205 12:11:53.994501 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4aecdd42-0f16-4f9a-91d4-3512b21b3763-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4aecdd42-0f16-4f9a-91d4-3512b21b3763" (UID: "4aecdd42-0f16-4f9a-91d4-3512b21b3763"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.044113 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4aecdd42-0f16-4f9a-91d4-3512b21b3763-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4aecdd42-0f16-4f9a-91d4-3512b21b3763" (UID: "4aecdd42-0f16-4f9a-91d4-3512b21b3763"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.067652 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4aecdd42-0f16-4f9a-91d4-3512b21b3763-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.067704 4763 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4aecdd42-0f16-4f9a-91d4-3512b21b3763-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.067717 4763 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4aecdd42-0f16-4f9a-91d4-3512b21b3763-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.067737 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4aecdd42-0f16-4f9a-91d4-3512b21b3763-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.067749 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbmhs\" (UniqueName: \"kubernetes.io/projected/4aecdd42-0f16-4f9a-91d4-3512b21b3763-kube-api-access-mbmhs\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.067780 4763 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4aecdd42-0f16-4f9a-91d4-3512b21b3763-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.075275 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4aecdd42-0f16-4f9a-91d4-3512b21b3763-config-data" (OuterVolumeSpecName: "config-data") pod "4aecdd42-0f16-4f9a-91d4-3512b21b3763" (UID: "4aecdd42-0f16-4f9a-91d4-3512b21b3763"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.169313 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4aecdd42-0f16-4f9a-91d4-3512b21b3763-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.232901 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.609570 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4aecdd42-0f16-4f9a-91d4-3512b21b3763","Type":"ContainerDied","Data":"e576789f4e0172914bcbb54bc3a1017b4d88d579c2637c3126ecf165fe860801"} Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.610812 4763 scope.go:117] "RemoveContainer" containerID="c0c1cacff3942a673b3875097d5fda9fe60e1caafa3bf58392cc2c402541f016" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.609659 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.630869 4763 scope.go:117] "RemoveContainer" containerID="b5727ff14552797bde5c249f4c3a7241dcb5a4c59606a0859c55ae6d71c2727f" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.651218 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.658221 4763 scope.go:117] "RemoveContainer" containerID="bcdf7f43b9c4f2c24523d14abb30c7e2d9b225fd8a89565bfb100aae400014c8" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.674831 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.690218 4763 scope.go:117] "RemoveContainer" containerID="5f3f7453d0c45f14c208cbc9f6364423283f1e45c4de1319f57567d584ccc7fd" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.691610 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 12:11:54 crc kubenswrapper[4763]: E1205 12:11:54.692440 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aecdd42-0f16-4f9a-91d4-3512b21b3763" containerName="sg-core" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.692508 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aecdd42-0f16-4f9a-91d4-3512b21b3763" containerName="sg-core" Dec 05 12:11:54 crc kubenswrapper[4763]: E1205 12:11:54.692548 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aecdd42-0f16-4f9a-91d4-3512b21b3763" containerName="proxy-httpd" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.692557 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aecdd42-0f16-4f9a-91d4-3512b21b3763" containerName="proxy-httpd" Dec 05 12:11:54 crc kubenswrapper[4763]: E1205 12:11:54.692685 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aecdd42-0f16-4f9a-91d4-3512b21b3763" containerName="ceilometer-notification-agent" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.692701 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aecdd42-0f16-4f9a-91d4-3512b21b3763" containerName="ceilometer-notification-agent" Dec 05 12:11:54 crc kubenswrapper[4763]: E1205 12:11:54.692723 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aecdd42-0f16-4f9a-91d4-3512b21b3763" containerName="ceilometer-central-agent" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.692733 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aecdd42-0f16-4f9a-91d4-3512b21b3763" containerName="ceilometer-central-agent" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.693522 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="4aecdd42-0f16-4f9a-91d4-3512b21b3763" containerName="proxy-httpd" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.693550 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="4aecdd42-0f16-4f9a-91d4-3512b21b3763" containerName="ceilometer-notification-agent" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.693595 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="4aecdd42-0f16-4f9a-91d4-3512b21b3763" containerName="sg-core" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.693609 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="4aecdd42-0f16-4f9a-91d4-3512b21b3763" containerName="ceilometer-central-agent" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.696393 4763 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.700415 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.700746 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.701691 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.706136 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.782076 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d798ac3d-6708-49fb-8ba6-b635ee5b769e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d798ac3d-6708-49fb-8ba6-b635ee5b769e\") " pod="openstack/ceilometer-0" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.782162 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d798ac3d-6708-49fb-8ba6-b635ee5b769e-run-httpd\") pod \"ceilometer-0\" (UID: \"d798ac3d-6708-49fb-8ba6-b635ee5b769e\") " pod="openstack/ceilometer-0" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.782197 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d798ac3d-6708-49fb-8ba6-b635ee5b769e-scripts\") pod \"ceilometer-0\" (UID: \"d798ac3d-6708-49fb-8ba6-b635ee5b769e\") " pod="openstack/ceilometer-0" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.782238 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d798ac3d-6708-49fb-8ba6-b635ee5b769e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d798ac3d-6708-49fb-8ba6-b635ee5b769e\") " pod="openstack/ceilometer-0" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.782280 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d798ac3d-6708-49fb-8ba6-b635ee5b769e-config-data\") pod \"ceilometer-0\" (UID: \"d798ac3d-6708-49fb-8ba6-b635ee5b769e\") " pod="openstack/ceilometer-0" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.782742 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npjjp\" (UniqueName: \"kubernetes.io/projected/d798ac3d-6708-49fb-8ba6-b635ee5b769e-kube-api-access-npjjp\") pod \"ceilometer-0\" (UID: \"d798ac3d-6708-49fb-8ba6-b635ee5b769e\") " pod="openstack/ceilometer-0" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.782916 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d798ac3d-6708-49fb-8ba6-b635ee5b769e-log-httpd\") pod \"ceilometer-0\" (UID: \"d798ac3d-6708-49fb-8ba6-b635ee5b769e\") " pod="openstack/ceilometer-0" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.782974 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d798ac3d-6708-49fb-8ba6-b635ee5b769e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d798ac3d-6708-49fb-8ba6-b635ee5b769e\") " pod="openstack/ceilometer-0" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.884646 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d798ac3d-6708-49fb-8ba6-b635ee5b769e-run-httpd\") pod \"ceilometer-0\" (UID: \"d798ac3d-6708-49fb-8ba6-b635ee5b769e\") " pod="openstack/ceilometer-0" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.885285 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d798ac3d-6708-49fb-8ba6-b635ee5b769e-scripts\") pod \"ceilometer-0\" (UID: \"d798ac3d-6708-49fb-8ba6-b635ee5b769e\") " pod="openstack/ceilometer-0" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.885414 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d798ac3d-6708-49fb-8ba6-b635ee5b769e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d798ac3d-6708-49fb-8ba6-b635ee5b769e\") " pod="openstack/ceilometer-0" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.885507 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d798ac3d-6708-49fb-8ba6-b635ee5b769e-config-data\") pod \"ceilometer-0\" (UID: \"d798ac3d-6708-49fb-8ba6-b635ee5b769e\") " pod="openstack/ceilometer-0" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.885575 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npjjp\" (UniqueName: \"kubernetes.io/projected/d798ac3d-6708-49fb-8ba6-b635ee5b769e-kube-api-access-npjjp\") pod \"ceilometer-0\" (UID: \"d798ac3d-6708-49fb-8ba6-b635ee5b769e\") " pod="openstack/ceilometer-0" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.885672 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d798ac3d-6708-49fb-8ba6-b635ee5b769e-log-httpd\") pod \"ceilometer-0\" (UID: \"d798ac3d-6708-49fb-8ba6-b635ee5b769e\") " pod="openstack/ceilometer-0" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.885785 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d798ac3d-6708-49fb-8ba6-b635ee5b769e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d798ac3d-6708-49fb-8ba6-b635ee5b769e\") " pod="openstack/ceilometer-0" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.885920 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d798ac3d-6708-49fb-8ba6-b635ee5b769e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d798ac3d-6708-49fb-8ba6-b635ee5b769e\") " pod="openstack/ceilometer-0" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.884823 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-h96f9"] Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.889036 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d798ac3d-6708-49fb-8ba6-b635ee5b769e-log-httpd\") pod \"ceilometer-0\" (UID: \"d798ac3d-6708-49fb-8ba6-b635ee5b769e\") " pod="openstack/ceilometer-0" Dec 05 12:11:54 crc 
kubenswrapper[4763]: I1205 12:11:54.885608 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d798ac3d-6708-49fb-8ba6-b635ee5b769e-run-httpd\") pod \"ceilometer-0\" (UID: \"d798ac3d-6708-49fb-8ba6-b635ee5b769e\") " pod="openstack/ceilometer-0" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.889249 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-h96f9" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.891946 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d798ac3d-6708-49fb-8ba6-b635ee5b769e-scripts\") pod \"ceilometer-0\" (UID: \"d798ac3d-6708-49fb-8ba6-b635ee5b769e\") " pod="openstack/ceilometer-0" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.892091 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d798ac3d-6708-49fb-8ba6-b635ee5b769e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d798ac3d-6708-49fb-8ba6-b635ee5b769e\") " pod="openstack/ceilometer-0" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.893385 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d798ac3d-6708-49fb-8ba6-b635ee5b769e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d798ac3d-6708-49fb-8ba6-b635ee5b769e\") " pod="openstack/ceilometer-0" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.895433 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.897701 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.898794 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d798ac3d-6708-49fb-8ba6-b635ee5b769e-config-data\") pod \"ceilometer-0\" (UID: \"d798ac3d-6708-49fb-8ba6-b635ee5b769e\") " pod="openstack/ceilometer-0" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.900175 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d798ac3d-6708-49fb-8ba6-b635ee5b769e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d798ac3d-6708-49fb-8ba6-b635ee5b769e\") " pod="openstack/ceilometer-0" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.913041 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-h96f9"] Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.920576 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npjjp\" (UniqueName: \"kubernetes.io/projected/d798ac3d-6708-49fb-8ba6-b635ee5b769e-kube-api-access-npjjp\") pod \"ceilometer-0\" (UID: \"d798ac3d-6708-49fb-8ba6-b635ee5b769e\") " pod="openstack/ceilometer-0" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.987362 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57820ba4-dfb8-40a5-be44-26fc6fe01967-config-data\") pod \"nova-cell0-cell-mapping-h96f9\" (UID: \"57820ba4-dfb8-40a5-be44-26fc6fe01967\") " pod="openstack/nova-cell0-cell-mapping-h96f9" Dec 05 12:11:54 crc kubenswrapper[4763]: 
I1205 12:11:54.987439 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tssrd\" (UniqueName: \"kubernetes.io/projected/57820ba4-dfb8-40a5-be44-26fc6fe01967-kube-api-access-tssrd\") pod \"nova-cell0-cell-mapping-h96f9\" (UID: \"57820ba4-dfb8-40a5-be44-26fc6fe01967\") " pod="openstack/nova-cell0-cell-mapping-h96f9" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.987605 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57820ba4-dfb8-40a5-be44-26fc6fe01967-scripts\") pod \"nova-cell0-cell-mapping-h96f9\" (UID: \"57820ba4-dfb8-40a5-be44-26fc6fe01967\") " pod="openstack/nova-cell0-cell-mapping-h96f9" Dec 05 12:11:54 crc kubenswrapper[4763]: I1205 12:11:54.987736 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57820ba4-dfb8-40a5-be44-26fc6fe01967-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-h96f9\" (UID: \"57820ba4-dfb8-40a5-be44-26fc6fe01967\") " pod="openstack/nova-cell0-cell-mapping-h96f9" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.020504 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.031527 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.090428 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.090578 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57820ba4-dfb8-40a5-be44-26fc6fe01967-scripts\") pod \"nova-cell0-cell-mapping-h96f9\" (UID: \"57820ba4-dfb8-40a5-be44-26fc6fe01967\") " pod="openstack/nova-cell0-cell-mapping-h96f9" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.090634 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57820ba4-dfb8-40a5-be44-26fc6fe01967-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-h96f9\" (UID: \"57820ba4-dfb8-40a5-be44-26fc6fe01967\") " pod="openstack/nova-cell0-cell-mapping-h96f9" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.090728 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57820ba4-dfb8-40a5-be44-26fc6fe01967-config-data\") pod \"nova-cell0-cell-mapping-h96f9\" (UID: \"57820ba4-dfb8-40a5-be44-26fc6fe01967\") " pod="openstack/nova-cell0-cell-mapping-h96f9" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.090794 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tssrd\" (UniqueName: \"kubernetes.io/projected/57820ba4-dfb8-40a5-be44-26fc6fe01967-kube-api-access-tssrd\") pod \"nova-cell0-cell-mapping-h96f9\" (UID: \"57820ba4-dfb8-40a5-be44-26fc6fe01967\") " pod="openstack/nova-cell0-cell-mapping-h96f9" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.098552 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57820ba4-dfb8-40a5-be44-26fc6fe01967-config-data\") pod \"nova-cell0-cell-mapping-h96f9\" 
(UID: \"57820ba4-dfb8-40a5-be44-26fc6fe01967\") " pod="openstack/nova-cell0-cell-mapping-h96f9" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.099566 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57820ba4-dfb8-40a5-be44-26fc6fe01967-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-h96f9\" (UID: \"57820ba4-dfb8-40a5-be44-26fc6fe01967\") " pod="openstack/nova-cell0-cell-mapping-h96f9" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.106887 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cqhd5"] Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.131545 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57820ba4-dfb8-40a5-be44-26fc6fe01967-scripts\") pod \"nova-cell0-cell-mapping-h96f9\" (UID: \"57820ba4-dfb8-40a5-be44-26fc6fe01967\") " pod="openstack/nova-cell0-cell-mapping-h96f9" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.197012 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cqhd5" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.215213 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tssrd\" (UniqueName: \"kubernetes.io/projected/57820ba4-dfb8-40a5-be44-26fc6fe01967-kube-api-access-tssrd\") pod \"nova-cell0-cell-mapping-h96f9\" (UID: \"57820ba4-dfb8-40a5-be44-26fc6fe01967\") " pod="openstack/nova-cell0-cell-mapping-h96f9" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.233699 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cqhd5"] Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.283905 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.285991 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.287620 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.296486 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.301046 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-h96f9" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.317893 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.320195 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.322175 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91392e07-ce32-4e73-a80f-17e4749ab9da-catalog-content\") pod \"redhat-operators-cqhd5\" (UID: \"91392e07-ce32-4e73-a80f-17e4749ab9da\") " pod="openshift-marketplace/redhat-operators-cqhd5" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.322298 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9g6q\" (UniqueName: \"kubernetes.io/projected/91392e07-ce32-4e73-a80f-17e4749ab9da-kube-api-access-c9g6q\") pod \"redhat-operators-cqhd5\" (UID: \"91392e07-ce32-4e73-a80f-17e4749ab9da\") " pod="openshift-marketplace/redhat-operators-cqhd5" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.322494 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91392e07-ce32-4e73-a80f-17e4749ab9da-utilities\") pod \"redhat-operators-cqhd5\" (UID: \"91392e07-ce32-4e73-a80f-17e4749ab9da\") " pod="openshift-marketplace/redhat-operators-cqhd5" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.324807 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.342073 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.439708 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9kch\" (UniqueName: \"kubernetes.io/projected/ac9665c6-165e-463d-a82c-219519585c09-kube-api-access-j9kch\") pod \"nova-api-0\" (UID: \"ac9665c6-165e-463d-a82c-219519585c09\") " pod="openstack/nova-api-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.439772 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6f44a99-a146-4222-9f16-ee6903e84462-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d6f44a99-a146-4222-9f16-ee6903e84462\") " pod="openstack/nova-metadata-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.439879 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac9665c6-165e-463d-a82c-219519585c09-config-data\") pod \"nova-api-0\" (UID: \"ac9665c6-165e-463d-a82c-219519585c09\") " pod="openstack/nova-api-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.439918 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91392e07-ce32-4e73-a80f-17e4749ab9da-utilities\") pod \"redhat-operators-cqhd5\" (UID: \"91392e07-ce32-4e73-a80f-17e4749ab9da\") " pod="openshift-marketplace/redhat-operators-cqhd5" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.439938 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f9cw\" (UniqueName: \"kubernetes.io/projected/d6f44a99-a146-4222-9f16-ee6903e84462-kube-api-access-7f9cw\") pod \"nova-metadata-0\" (UID: \"d6f44a99-a146-4222-9f16-ee6903e84462\") " pod="openstack/nova-metadata-0" Dec 05 12:11:55 crc 
kubenswrapper[4763]: I1205 12:11:55.440012 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac9665c6-165e-463d-a82c-219519585c09-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ac9665c6-165e-463d-a82c-219519585c09\") " pod="openstack/nova-api-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.440029 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6f44a99-a146-4222-9f16-ee6903e84462-logs\") pod \"nova-metadata-0\" (UID: \"d6f44a99-a146-4222-9f16-ee6903e84462\") " pod="openstack/nova-metadata-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.440060 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6f44a99-a146-4222-9f16-ee6903e84462-config-data\") pod \"nova-metadata-0\" (UID: \"d6f44a99-a146-4222-9f16-ee6903e84462\") " pod="openstack/nova-metadata-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.440103 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac9665c6-165e-463d-a82c-219519585c09-logs\") pod \"nova-api-0\" (UID: \"ac9665c6-165e-463d-a82c-219519585c09\") " pod="openstack/nova-api-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.440145 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91392e07-ce32-4e73-a80f-17e4749ab9da-catalog-content\") pod \"redhat-operators-cqhd5\" (UID: \"91392e07-ce32-4e73-a80f-17e4749ab9da\") " pod="openshift-marketplace/redhat-operators-cqhd5" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.440263 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9g6q\" (UniqueName: \"kubernetes.io/projected/91392e07-ce32-4e73-a80f-17e4749ab9da-kube-api-access-c9g6q\") pod \"redhat-operators-cqhd5\" (UID: \"91392e07-ce32-4e73-a80f-17e4749ab9da\") " pod="openshift-marketplace/redhat-operators-cqhd5" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.441120 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91392e07-ce32-4e73-a80f-17e4749ab9da-utilities\") pod \"redhat-operators-cqhd5\" (UID: \"91392e07-ce32-4e73-a80f-17e4749ab9da\") " pod="openshift-marketplace/redhat-operators-cqhd5" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.441394 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91392e07-ce32-4e73-a80f-17e4749ab9da-catalog-content\") pod \"redhat-operators-cqhd5\" (UID: \"91392e07-ce32-4e73-a80f-17e4749ab9da\") " pod="openshift-marketplace/redhat-operators-cqhd5" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.472718 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-xwlmv"] Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.479520 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-xwlmv" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.510590 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9g6q\" (UniqueName: \"kubernetes.io/projected/91392e07-ce32-4e73-a80f-17e4749ab9da-kube-api-access-c9g6q\") pod \"redhat-operators-cqhd5\" (UID: \"91392e07-ce32-4e73-a80f-17e4749ab9da\") " pod="openshift-marketplace/redhat-operators-cqhd5" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.517176 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.520110 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.535347 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.538308 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.543900 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce666e93-3a67-456a-ad40-fd30c7ed0f7f-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-xwlmv\" (UID: \"ce666e93-3a67-456a-ad40-fd30c7ed0f7f\") " pod="openstack/dnsmasq-dns-bccf8f775-xwlmv" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.543954 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce666e93-3a67-456a-ad40-fd30c7ed0f7f-config\") pod \"dnsmasq-dns-bccf8f775-xwlmv\" (UID: \"ce666e93-3a67-456a-ad40-fd30c7ed0f7f\") " pod="openstack/dnsmasq-dns-bccf8f775-xwlmv" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.543992 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9kch\" (UniqueName: \"kubernetes.io/projected/ac9665c6-165e-463d-a82c-219519585c09-kube-api-access-j9kch\") pod \"nova-api-0\" (UID: \"ac9665c6-165e-463d-a82c-219519585c09\") " pod="openstack/nova-api-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.544009 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6f44a99-a146-4222-9f16-ee6903e84462-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d6f44a99-a146-4222-9f16-ee6903e84462\") " pod="openstack/nova-metadata-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.544080 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac9665c6-165e-463d-a82c-219519585c09-config-data\") pod \"nova-api-0\" (UID: \"ac9665c6-165e-463d-a82c-219519585c09\") " pod="openstack/nova-api-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.544110 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f9cw\" (UniqueName: \"kubernetes.io/projected/d6f44a99-a146-4222-9f16-ee6903e84462-kube-api-access-7f9cw\") pod \"nova-metadata-0\" (UID: \"d6f44a99-a146-4222-9f16-ee6903e84462\") " pod="openstack/nova-metadata-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.544166 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ac9665c6-165e-463d-a82c-219519585c09-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ac9665c6-165e-463d-a82c-219519585c09\") " pod="openstack/nova-api-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.544189 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6f44a99-a146-4222-9f16-ee6903e84462-logs\") pod \"nova-metadata-0\" (UID: \"d6f44a99-a146-4222-9f16-ee6903e84462\") " pod="openstack/nova-metadata-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.544225 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6f44a99-a146-4222-9f16-ee6903e84462-config-data\") pod \"nova-metadata-0\" (UID: \"d6f44a99-a146-4222-9f16-ee6903e84462\") " pod="openstack/nova-metadata-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.544260 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce666e93-3a67-456a-ad40-fd30c7ed0f7f-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-xwlmv\" (UID: \"ce666e93-3a67-456a-ad40-fd30c7ed0f7f\") " pod="openstack/dnsmasq-dns-bccf8f775-xwlmv" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.544282 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac9665c6-165e-463d-a82c-219519585c09-logs\") pod \"nova-api-0\" (UID: \"ac9665c6-165e-463d-a82c-219519585c09\") " pod="openstack/nova-api-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.544311 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2vwj\" (UniqueName: \"kubernetes.io/projected/ce666e93-3a67-456a-ad40-fd30c7ed0f7f-kube-api-access-k2vwj\") pod \"dnsmasq-dns-bccf8f775-xwlmv\" (UID: \"ce666e93-3a67-456a-ad40-fd30c7ed0f7f\") " pod="openstack/dnsmasq-dns-bccf8f775-xwlmv" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.553208 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac9665c6-165e-463d-a82c-219519585c09-logs\") pod \"nova-api-0\" (UID: \"ac9665c6-165e-463d-a82c-219519585c09\") " pod="openstack/nova-api-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.553297 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce666e93-3a67-456a-ad40-fd30c7ed0f7f-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-xwlmv\" (UID: \"ce666e93-3a67-456a-ad40-fd30c7ed0f7f\") " pod="openstack/dnsmasq-dns-bccf8f775-xwlmv" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.553348 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce666e93-3a67-456a-ad40-fd30c7ed0f7f-dns-svc\") pod \"dnsmasq-dns-bccf8f775-xwlmv\" (UID: \"ce666e93-3a67-456a-ad40-fd30c7ed0f7f\") " pod="openstack/dnsmasq-dns-bccf8f775-xwlmv" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.554809 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6f44a99-a146-4222-9f16-ee6903e84462-logs\") pod \"nova-metadata-0\" (UID: \"d6f44a99-a146-4222-9f16-ee6903e84462\") " pod="openstack/nova-metadata-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.559058 
4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac9665c6-165e-463d-a82c-219519585c09-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ac9665c6-165e-463d-a82c-219519585c09\") " pod="openstack/nova-api-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.560968 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6f44a99-a146-4222-9f16-ee6903e84462-config-data\") pod \"nova-metadata-0\" (UID: \"d6f44a99-a146-4222-9f16-ee6903e84462\") " pod="openstack/nova-metadata-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.561854 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6f44a99-a146-4222-9f16-ee6903e84462-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d6f44a99-a146-4222-9f16-ee6903e84462\") " pod="openstack/nova-metadata-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.594705 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac9665c6-165e-463d-a82c-219519585c09-config-data\") pod \"nova-api-0\" (UID: \"ac9665c6-165e-463d-a82c-219519585c09\") " pod="openstack/nova-api-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.594822 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cqhd5" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.605472 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9kch\" (UniqueName: \"kubernetes.io/projected/ac9665c6-165e-463d-a82c-219519585c09-kube-api-access-j9kch\") pod \"nova-api-0\" (UID: \"ac9665c6-165e-463d-a82c-219519585c09\") " pod="openstack/nova-api-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.612264 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.616514 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f9cw\" (UniqueName: \"kubernetes.io/projected/d6f44a99-a146-4222-9f16-ee6903e84462-kube-api-access-7f9cw\") pod \"nova-metadata-0\" (UID: \"d6f44a99-a146-4222-9f16-ee6903e84462\") " pod="openstack/nova-metadata-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.622062 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-xwlmv"] Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.657852 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.658721 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.659100 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2vwj\" (UniqueName: \"kubernetes.io/projected/ce666e93-3a67-456a-ad40-fd30c7ed0f7f-kube-api-access-k2vwj\") pod \"dnsmasq-dns-bccf8f775-xwlmv\" (UID: \"ce666e93-3a67-456a-ad40-fd30c7ed0f7f\") " pod="openstack/dnsmasq-dns-bccf8f775-xwlmv" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.659132 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/885001ef-6b9f-4d63-a690-fb3c78b7e037-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"885001ef-6b9f-4d63-a690-fb3c78b7e037\") " pod="openstack/nova-scheduler-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.659157 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce666e93-3a67-456a-ad40-fd30c7ed0f7f-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-xwlmv\" (UID: \"ce666e93-3a67-456a-ad40-fd30c7ed0f7f\") " pod="openstack/dnsmasq-dns-bccf8f775-xwlmv" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.659180 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce666e93-3a67-456a-ad40-fd30c7ed0f7f-dns-svc\") pod \"dnsmasq-dns-bccf8f775-xwlmv\" (UID: \"ce666e93-3a67-456a-ad40-fd30c7ed0f7f\") " pod="openstack/dnsmasq-dns-bccf8f775-xwlmv" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.659214 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l96m\" (UniqueName: \"kubernetes.io/projected/885001ef-6b9f-4d63-a690-fb3c78b7e037-kube-api-access-4l96m\") pod \"nova-scheduler-0\" (UID: \"885001ef-6b9f-4d63-a690-fb3c78b7e037\") " pod="openstack/nova-scheduler-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.659236 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce666e93-3a67-456a-ad40-fd30c7ed0f7f-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-xwlmv\" (UID: \"ce666e93-3a67-456a-ad40-fd30c7ed0f7f\") " pod="openstack/dnsmasq-dns-bccf8f775-xwlmv" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.659256 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce666e93-3a67-456a-ad40-fd30c7ed0f7f-config\") pod \"dnsmasq-dns-bccf8f775-xwlmv\" (UID: \"ce666e93-3a67-456a-ad40-fd30c7ed0f7f\") " pod="openstack/dnsmasq-dns-bccf8f775-xwlmv" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.659318 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/885001ef-6b9f-4d63-a690-fb3c78b7e037-config-data\") pod \"nova-scheduler-0\" (UID: \"885001ef-6b9f-4d63-a690-fb3c78b7e037\") " pod="openstack/nova-scheduler-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.659364 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/ce666e93-3a67-456a-ad40-fd30c7ed0f7f-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-xwlmv\" (UID: \"ce666e93-3a67-456a-ad40-fd30c7ed0f7f\") " pod="openstack/dnsmasq-dns-bccf8f775-xwlmv" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.660241 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce666e93-3a67-456a-ad40-fd30c7ed0f7f-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-xwlmv\" (UID: \"ce666e93-3a67-456a-ad40-fd30c7ed0f7f\") " pod="openstack/dnsmasq-dns-bccf8f775-xwlmv" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.660818 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce666e93-3a67-456a-ad40-fd30c7ed0f7f-dns-svc\") pod \"dnsmasq-dns-bccf8f775-xwlmv\" (UID: \"ce666e93-3a67-456a-ad40-fd30c7ed0f7f\") " pod="openstack/dnsmasq-dns-bccf8f775-xwlmv" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.663250 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce666e93-3a67-456a-ad40-fd30c7ed0f7f-config\") pod \"dnsmasq-dns-bccf8f775-xwlmv\" (UID: \"ce666e93-3a67-456a-ad40-fd30c7ed0f7f\") " pod="openstack/dnsmasq-dns-bccf8f775-xwlmv" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.668359 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce666e93-3a67-456a-ad40-fd30c7ed0f7f-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-xwlmv\" (UID: \"ce666e93-3a67-456a-ad40-fd30c7ed0f7f\") " pod="openstack/dnsmasq-dns-bccf8f775-xwlmv" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.668561 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce666e93-3a67-456a-ad40-fd30c7ed0f7f-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-xwlmv\" (UID: \"ce666e93-3a67-456a-ad40-fd30c7ed0f7f\") " pod="openstack/dnsmasq-dns-bccf8f775-xwlmv" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.695818 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.700780 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.701895 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2vwj\" (UniqueName: \"kubernetes.io/projected/ce666e93-3a67-456a-ad40-fd30c7ed0f7f-kube-api-access-k2vwj\") pod \"dnsmasq-dns-bccf8f775-xwlmv\" (UID: \"ce666e93-3a67-456a-ad40-fd30c7ed0f7f\") " pod="openstack/dnsmasq-dns-bccf8f775-xwlmv" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.705251 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.712395 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.731777 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.735355 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-xwlmv" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.761021 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/885001ef-6b9f-4d63-a690-fb3c78b7e037-config-data\") pod \"nova-scheduler-0\" (UID: \"885001ef-6b9f-4d63-a690-fb3c78b7e037\") " pod="openstack/nova-scheduler-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.761344 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.761491 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j89n7\" (UniqueName: \"kubernetes.io/projected/1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0-kube-api-access-j89n7\") pod \"nova-cell1-novncproxy-0\" (UID: \"1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.761590 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.761689 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/885001ef-6b9f-4d63-a690-fb3c78b7e037-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"885001ef-6b9f-4d63-a690-fb3c78b7e037\") " pod="openstack/nova-scheduler-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.761834 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l96m\" (UniqueName: \"kubernetes.io/projected/885001ef-6b9f-4d63-a690-fb3c78b7e037-kube-api-access-4l96m\") pod \"nova-scheduler-0\" (UID: \"885001ef-6b9f-4d63-a690-fb3c78b7e037\") " pod="openstack/nova-scheduler-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.772194 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/885001ef-6b9f-4d63-a690-fb3c78b7e037-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"885001ef-6b9f-4d63-a690-fb3c78b7e037\") " pod="openstack/nova-scheduler-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.776061 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/885001ef-6b9f-4d63-a690-fb3c78b7e037-config-data\") pod \"nova-scheduler-0\" (UID: \"885001ef-6b9f-4d63-a690-fb3c78b7e037\") " pod="openstack/nova-scheduler-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.803338 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l96m\" (UniqueName: \"kubernetes.io/projected/885001ef-6b9f-4d63-a690-fb3c78b7e037-kube-api-access-4l96m\") pod \"nova-scheduler-0\" (UID: \"885001ef-6b9f-4d63-a690-fb3c78b7e037\") " pod="openstack/nova-scheduler-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.826266 4763 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="4aecdd42-0f16-4f9a-91d4-3512b21b3763" path="/var/lib/kubelet/pods/4aecdd42-0f16-4f9a-91d4-3512b21b3763/volumes" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.866087 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.866188 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j89n7\" (UniqueName: \"kubernetes.io/projected/1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0-kube-api-access-j89n7\") pod \"nova-cell1-novncproxy-0\" (UID: \"1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.866258 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.876587 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.877777 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 12:11:55 crc kubenswrapper[4763]: I1205 12:11:55.904302 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j89n7\" (UniqueName: \"kubernetes.io/projected/1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0-kube-api-access-j89n7\") pod \"nova-cell1-novncproxy-0\" (UID: \"1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 12:11:56 crc kubenswrapper[4763]: I1205 12:11:56.063524 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 12:11:56 crc kubenswrapper[4763]: I1205 12:11:56.078221 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 12:11:56 crc kubenswrapper[4763]: I1205 12:11:56.128705 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 12:11:56 crc kubenswrapper[4763]: I1205 12:11:56.316438 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-h96f9"] Dec 05 12:11:56 crc kubenswrapper[4763]: I1205 12:11:56.402564 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wxg7t"] Dec 05 12:11:56 crc kubenswrapper[4763]: I1205 12:11:56.404000 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wxg7t" Dec 05 12:11:56 crc kubenswrapper[4763]: I1205 12:11:56.408200 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 05 12:11:56 crc kubenswrapper[4763]: I1205 12:11:56.408401 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 05 12:11:56 crc kubenswrapper[4763]: I1205 12:11:56.436303 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wxg7t"] Dec 05 12:11:56 crc kubenswrapper[4763]: I1205 12:11:56.492108 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f5732ce-fb71-4847-a977-763074d671f6-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wxg7t\" (UID: \"4f5732ce-fb71-4847-a977-763074d671f6\") " pod="openstack/nova-cell1-conductor-db-sync-wxg7t" Dec 05 12:11:56 crc kubenswrapper[4763]: I1205 12:11:56.492401 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5nhh\" (UniqueName: \"kubernetes.io/projected/4f5732ce-fb71-4847-a977-763074d671f6-kube-api-access-g5nhh\") pod \"nova-cell1-conductor-db-sync-wxg7t\" (UID: \"4f5732ce-fb71-4847-a977-763074d671f6\") " pod="openstack/nova-cell1-conductor-db-sync-wxg7t" Dec 05 12:11:56 crc kubenswrapper[4763]: I1205 12:11:56.492453 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f5732ce-fb71-4847-a977-763074d671f6-scripts\") pod \"nova-cell1-conductor-db-sync-wxg7t\" (UID: \"4f5732ce-fb71-4847-a977-763074d671f6\") " pod="openstack/nova-cell1-conductor-db-sync-wxg7t" Dec 05 12:11:56 crc kubenswrapper[4763]: I1205 12:11:56.492485 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f5732ce-fb71-4847-a977-763074d671f6-config-data\") pod \"nova-cell1-conductor-db-sync-wxg7t\" (UID: \"4f5732ce-fb71-4847-a977-763074d671f6\") " pod="openstack/nova-cell1-conductor-db-sync-wxg7t" Dec 05 12:11:56 crc kubenswrapper[4763]: I1205 12:11:56.594785 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 12:11:56 crc kubenswrapper[4763]: I1205 12:11:56.596420 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f5732ce-fb71-4847-a977-763074d671f6-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wxg7t\" (UID: \"4f5732ce-fb71-4847-a977-763074d671f6\") " pod="openstack/nova-cell1-conductor-db-sync-wxg7t" Dec 05 12:11:56 crc kubenswrapper[4763]: I1205 12:11:56.596483 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5nhh\" (UniqueName: \"kubernetes.io/projected/4f5732ce-fb71-4847-a977-763074d671f6-kube-api-access-g5nhh\") pod \"nova-cell1-conductor-db-sync-wxg7t\" (UID: \"4f5732ce-fb71-4847-a977-763074d671f6\") " pod="openstack/nova-cell1-conductor-db-sync-wxg7t" Dec 05 12:11:56 crc kubenswrapper[4763]: I1205 12:11:56.596544 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f5732ce-fb71-4847-a977-763074d671f6-scripts\") pod \"nova-cell1-conductor-db-sync-wxg7t\" (UID: 
\"4f5732ce-fb71-4847-a977-763074d671f6\") " pod="openstack/nova-cell1-conductor-db-sync-wxg7t" Dec 05 12:11:56 crc kubenswrapper[4763]: I1205 12:11:56.596580 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f5732ce-fb71-4847-a977-763074d671f6-config-data\") pod \"nova-cell1-conductor-db-sync-wxg7t\" (UID: \"4f5732ce-fb71-4847-a977-763074d671f6\") " pod="openstack/nova-cell1-conductor-db-sync-wxg7t" Dec 05 12:11:56 crc kubenswrapper[4763]: I1205 12:11:56.605826 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f5732ce-fb71-4847-a977-763074d671f6-config-data\") pod \"nova-cell1-conductor-db-sync-wxg7t\" (UID: \"4f5732ce-fb71-4847-a977-763074d671f6\") " pod="openstack/nova-cell1-conductor-db-sync-wxg7t" Dec 05 12:11:56 crc kubenswrapper[4763]: I1205 12:11:56.613390 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f5732ce-fb71-4847-a977-763074d671f6-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wxg7t\" (UID: \"4f5732ce-fb71-4847-a977-763074d671f6\") " pod="openstack/nova-cell1-conductor-db-sync-wxg7t" Dec 05 12:11:56 crc kubenswrapper[4763]: I1205 12:11:56.672542 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5nhh\" (UniqueName: \"kubernetes.io/projected/4f5732ce-fb71-4847-a977-763074d671f6-kube-api-access-g5nhh\") pod \"nova-cell1-conductor-db-sync-wxg7t\" (UID: \"4f5732ce-fb71-4847-a977-763074d671f6\") " pod="openstack/nova-cell1-conductor-db-sync-wxg7t" Dec 05 12:11:56 crc kubenswrapper[4763]: I1205 12:11:56.694047 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f5732ce-fb71-4847-a977-763074d671f6-scripts\") pod \"nova-cell1-conductor-db-sync-wxg7t\" (UID: \"4f5732ce-fb71-4847-a977-763074d671f6\") " pod="openstack/nova-cell1-conductor-db-sync-wxg7t" Dec 05 12:11:56 crc kubenswrapper[4763]: I1205 12:11:56.763280 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-h96f9" event={"ID":"57820ba4-dfb8-40a5-be44-26fc6fe01967","Type":"ContainerStarted","Data":"638d7fd69b08ed888fc03240e0d7e6e18e109da92c8fa6411dd97305f6d77c5d"} Dec 05 12:11:56 crc kubenswrapper[4763]: I1205 12:11:56.793682 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ac9665c6-165e-463d-a82c-219519585c09","Type":"ContainerStarted","Data":"a8bd5f339eb36a32903aa4adf76f4f4688eff68967f9293a4af7f5e37cb0300e"} Dec 05 12:11:56 crc kubenswrapper[4763]: I1205 12:11:56.805375 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wxg7t" Dec 05 12:11:56 crc kubenswrapper[4763]: I1205 12:11:56.820530 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d798ac3d-6708-49fb-8ba6-b635ee5b769e","Type":"ContainerStarted","Data":"60b692fdf536497150e24a537d970d95f6bdae69857aca490c4d63d3addf8631"} Dec 05 12:11:56 crc kubenswrapper[4763]: I1205 12:11:56.841122 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 12:11:56 crc kubenswrapper[4763]: I1205 12:11:56.881802 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cqhd5"] Dec 05 12:11:57 crc kubenswrapper[4763]: I1205 12:11:57.103231 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-xwlmv"] Dec 05 12:11:57 crc kubenswrapper[4763]: I1205 12:11:57.495083 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 12:11:57 crc kubenswrapper[4763]: I1205 12:11:57.495374 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 12:11:57 crc kubenswrapper[4763]: I1205 12:11:57.570109 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 12:11:57 crc kubenswrapper[4763]: I1205 12:11:57.617968 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 12:11:57 crc kubenswrapper[4763]: I1205 12:11:57.663244 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 12:11:57 crc kubenswrapper[4763]: I1205 12:11:57.664567 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 12:11:57 crc kubenswrapper[4763]: I1205 12:11:57.837854 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wxg7t"] Dec 05 12:11:57 crc kubenswrapper[4763]: I1205 12:11:57.881598 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d6f44a99-a146-4222-9f16-ee6903e84462","Type":"ContainerStarted","Data":"365e0c3c83c18fb225b79a967874009f46bf1c6a51cce3f434412a5649294b84"} Dec 05 12:11:57 crc kubenswrapper[4763]: I1205 12:11:57.883170 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-xwlmv" event={"ID":"ce666e93-3a67-456a-ad40-fd30c7ed0f7f","Type":"ContainerStarted","Data":"5806ee48915675b20562c8f2ff880944acfb3f636608a253dcddb140083ef134"} Dec 05 12:11:57 crc kubenswrapper[4763]: I1205 12:11:57.894343 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d798ac3d-6708-49fb-8ba6-b635ee5b769e","Type":"ContainerStarted","Data":"a15d864cf9d136d492cde4713454e84c0aa675f56903093bfb03fb34d39d1fae"} Dec 05 12:11:57 crc kubenswrapper[4763]: I1205 12:11:57.899866 4763 generic.go:334] "Generic (PLEG): container finished" podID="91392e07-ce32-4e73-a80f-17e4749ab9da" containerID="119cc38239dcc038b759369b0b938bcb54d77517d6c5a97615646cb92c146a26" exitCode=0 Dec 05 12:11:57 crc kubenswrapper[4763]: I1205 12:11:57.899951 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cqhd5" 
event={"ID":"91392e07-ce32-4e73-a80f-17e4749ab9da","Type":"ContainerDied","Data":"119cc38239dcc038b759369b0b938bcb54d77517d6c5a97615646cb92c146a26"} Dec 05 12:11:57 crc kubenswrapper[4763]: I1205 12:11:57.899992 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cqhd5" event={"ID":"91392e07-ce32-4e73-a80f-17e4749ab9da","Type":"ContainerStarted","Data":"342c27b3a6bcd5363e52c8cadad03b14ac83d5623e6864ef938fdcb987e5d356"} Dec 05 12:11:57 crc kubenswrapper[4763]: I1205 12:11:57.945551 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-h96f9" event={"ID":"57820ba4-dfb8-40a5-be44-26fc6fe01967","Type":"ContainerStarted","Data":"93c63cdbc34e09e7e9ddd7320caa2e3d56e27ba691af3b7e735782f7c54bbb57"} Dec 05 12:11:57 crc kubenswrapper[4763]: I1205 12:11:57.969635 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"885001ef-6b9f-4d63-a690-fb3c78b7e037","Type":"ContainerStarted","Data":"d9778779680b20f9b1365ea78e6686acb587d64684baaa9d01a6d02418be42eb"} Dec 05 12:11:57 crc kubenswrapper[4763]: I1205 12:11:57.997146 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0","Type":"ContainerStarted","Data":"f44fd6654fe30634d3233f2eece391e2f90333b149d23800931ac7309ba0254e"} Dec 05 12:11:57 crc kubenswrapper[4763]: I1205 12:11:57.999098 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 12:11:57 crc kubenswrapper[4763]: I1205 12:11:57.999258 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 12:11:58 crc kubenswrapper[4763]: I1205 12:11:58.012123 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-h96f9" podStartSLOduration=4.012047983 podStartE2EDuration="4.012047983s" podCreationTimestamp="2025-12-05 12:11:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:11:57.969957651 +0000 UTC m=+1402.462672374" watchObservedRunningTime="2025-12-05 12:11:58.012047983 +0000 UTC m=+1402.504762726" Dec 05 12:11:59 crc kubenswrapper[4763]: I1205 12:11:59.025668 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d798ac3d-6708-49fb-8ba6-b635ee5b769e","Type":"ContainerStarted","Data":"775d120302eee3ff5c2f5a7d2c6e17c000e8bf42b825ad98f65bf871de3abfb4"} Dec 05 12:11:59 crc kubenswrapper[4763]: I1205 12:11:59.040157 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wxg7t" event={"ID":"4f5732ce-fb71-4847-a977-763074d671f6","Type":"ContainerStarted","Data":"8afaf76e12309847b9eb9ad71c6b9943189539680889d714ecd20a188921ccf8"} Dec 05 12:11:59 crc kubenswrapper[4763]: I1205 12:11:59.040206 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wxg7t" event={"ID":"4f5732ce-fb71-4847-a977-763074d671f6","Type":"ContainerStarted","Data":"fe4b923ab1242533d80f5ef49daa41f1798130272070abab5566cbba1d069298"} Dec 05 12:11:59 crc kubenswrapper[4763]: I1205 12:11:59.050665 4763 generic.go:334] "Generic (PLEG): container finished" podID="ce666e93-3a67-456a-ad40-fd30c7ed0f7f" containerID="c38fc411805a6cbf12a368fe8de438591becf9a4f78ff70b0f6443f3673d0de9" exitCode=0 Dec 05 12:11:59 crc 
kubenswrapper[4763]: I1205 12:11:59.051981 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-xwlmv" event={"ID":"ce666e93-3a67-456a-ad40-fd30c7ed0f7f","Type":"ContainerDied","Data":"c38fc411805a6cbf12a368fe8de438591becf9a4f78ff70b0f6443f3673d0de9"} Dec 05 12:11:59 crc kubenswrapper[4763]: I1205 12:11:59.098814 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-wxg7t" podStartSLOduration=3.098794443 podStartE2EDuration="3.098794443s" podCreationTimestamp="2025-12-05 12:11:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:11:59.065155303 +0000 UTC m=+1403.557870036" watchObservedRunningTime="2025-12-05 12:11:59.098794443 +0000 UTC m=+1403.591509156" Dec 05 12:12:00 crc kubenswrapper[4763]: I1205 12:12:00.030033 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 12:12:00 crc kubenswrapper[4763]: I1205 12:12:00.086290 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 12:12:00 crc kubenswrapper[4763]: I1205 12:12:00.086322 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 12:12:00 crc kubenswrapper[4763]: I1205 12:12:00.087526 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cqhd5" event={"ID":"91392e07-ce32-4e73-a80f-17e4749ab9da","Type":"ContainerStarted","Data":"6d9c631169d495a295bc5b310b9d1e6358c88c8948967f5cd9ad4a97caee5866"} Dec 05 12:12:00 crc kubenswrapper[4763]: I1205 12:12:00.167899 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 12:12:01 crc kubenswrapper[4763]: I1205 12:12:01.100272 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d798ac3d-6708-49fb-8ba6-b635ee5b769e","Type":"ContainerStarted","Data":"c0883ac1444536b8d3ba36a26cd33f9736d2e0ab69f56355ab790d1bc938b7e6"} Dec 05 12:12:01 crc kubenswrapper[4763]: I1205 12:12:01.104228 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-xwlmv" event={"ID":"ce666e93-3a67-456a-ad40-fd30c7ed0f7f","Type":"ContainerStarted","Data":"f465b792ba39e968bdb469de9f41655cbe572fec4021744abf1d55597bf6e932"} Dec 05 12:12:02 crc kubenswrapper[4763]: I1205 12:12:02.121687 4763 generic.go:334] "Generic (PLEG): container finished" podID="91392e07-ce32-4e73-a80f-17e4749ab9da" containerID="6d9c631169d495a295bc5b310b9d1e6358c88c8948967f5cd9ad4a97caee5866" exitCode=0 Dec 05 12:12:02 crc kubenswrapper[4763]: I1205 12:12:02.121750 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cqhd5" event={"ID":"91392e07-ce32-4e73-a80f-17e4749ab9da","Type":"ContainerDied","Data":"6d9c631169d495a295bc5b310b9d1e6358c88c8948967f5cd9ad4a97caee5866"} Dec 05 12:12:02 crc kubenswrapper[4763]: I1205 12:12:02.122239 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bccf8f775-xwlmv" Dec 05 12:12:02 crc kubenswrapper[4763]: I1205 12:12:02.145248 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bccf8f775-xwlmv" podStartSLOduration=7.145228823 podStartE2EDuration="7.145228823s" podCreationTimestamp="2025-12-05 12:11:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-05 12:12:01.132338165 +0000 UTC m=+1405.625052888" watchObservedRunningTime="2025-12-05 12:12:02.145228823 +0000 UTC m=+1406.637943546" Dec 05 12:12:02 crc kubenswrapper[4763]: I1205 12:12:02.631698 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 05 12:12:02 crc kubenswrapper[4763]: I1205 12:12:02.631834 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 12:12:02 crc kubenswrapper[4763]: I1205 12:12:02.725562 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 05 12:12:04 crc kubenswrapper[4763]: I1205 12:12:04.286562 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 12:12:04 crc kubenswrapper[4763]: I1205 12:12:04.309519 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 12:12:04 crc kubenswrapper[4763]: I1205 12:12:04.330308 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 12:12:04 crc kubenswrapper[4763]: I1205 12:12:04.330553 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="cfb41c12-dade-4239-b67b-742df1922c22" containerName="nova-cell0-conductor-conductor" containerID="cri-o://7741c6f62d667ab5fe3d8ba5897f2b55af40da80a5676c60121c34d0a2d911bf" gracePeriod=30 Dec 05 12:12:05 crc kubenswrapper[4763]: I1205 12:12:05.586371 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 12:12:05 crc kubenswrapper[4763]: I1205 12:12:05.737946 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bccf8f775-xwlmv" Dec 05 12:12:05 crc kubenswrapper[4763]: I1205 12:12:05.910036 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-zhrbr"] Dec 05 12:12:05 crc kubenswrapper[4763]: I1205 12:12:05.910818 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-zhrbr" podUID="3a2c6ddd-063a-4531-9458-9de82a61d9ed" containerName="dnsmasq-dns" containerID="cri-o://ee3c7990638dea49d81fd6e670f80b32d98bd160a63cb2edf36972cdb79ec714" gracePeriod=10 Dec 05 12:12:06 crc kubenswrapper[4763]: I1205 12:12:06.485251 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6578955fd5-zhrbr" podUID="3a2c6ddd-063a-4531-9458-9de82a61d9ed" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.177:5353: connect: connection refused" Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.141418 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-zhrbr"
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.243156 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0","Type":"ContainerStarted","Data":"8ddf9c94aa959b7cb0d1eee10c6feba8d26b8039de83de9e4cc6ce8ac1ceccc7"}
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.243370 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://8ddf9c94aa959b7cb0d1eee10c6feba8d26b8039de83de9e4cc6ce8ac1ceccc7" gracePeriod=30
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.263104 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d6f44a99-a146-4222-9f16-ee6903e84462","Type":"ContainerStarted","Data":"a4df2f72dfd3068cca31d756399b5d8a1b73764fb30750a8968a55abf8c69c67"}
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.263155 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d6f44a99-a146-4222-9f16-ee6903e84462","Type":"ContainerStarted","Data":"c9d8324306be45298634213edc3afa6e34da774f44e9dc80bf75566e1ef0dc72"}
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.263263 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d6f44a99-a146-4222-9f16-ee6903e84462" containerName="nova-metadata-log" containerID="cri-o://c9d8324306be45298634213edc3afa6e34da774f44e9dc80bf75566e1ef0dc72" gracePeriod=30
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.263510 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d6f44a99-a146-4222-9f16-ee6903e84462" containerName="nova-metadata-metadata" containerID="cri-o://a4df2f72dfd3068cca31d756399b5d8a1b73764fb30750a8968a55abf8c69c67" gracePeriod=30
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.292602 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ac9665c6-165e-463d-a82c-219519585c09","Type":"ContainerStarted","Data":"76d03331525ae53c36fba6e4967d37ce848c1aca270c26127a3f06478cc85ac9"}
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.292660 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ac9665c6-165e-463d-a82c-219519585c09","Type":"ContainerStarted","Data":"c448715aac5b93f79f80de6d05faca43cb0adb95535fedf5d0c2a3642261d137"}
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.293382 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ac9665c6-165e-463d-a82c-219519585c09" containerName="nova-api-log" containerID="cri-o://c448715aac5b93f79f80de6d05faca43cb0adb95535fedf5d0c2a3642261d137" gracePeriod=30
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.293683 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ac9665c6-165e-463d-a82c-219519585c09" containerName="nova-api-api" containerID="cri-o://76d03331525ae53c36fba6e4967d37ce848c1aca270c26127a3f06478cc85ac9" gracePeriod=30
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.299438 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=4.537354121 podStartE2EDuration="12.29940089s" podCreationTimestamp="2025-12-05 12:11:55 +0000 UTC" firstStartedPulling="2025-12-05 12:11:57.710847318 +0000 UTC m=+1402.203562041" lastFinishedPulling="2025-12-05 12:12:05.472894087 +0000 UTC m=+1409.965608810" observedRunningTime="2025-12-05 12:12:07.286068662 +0000 UTC m=+1411.778783385" watchObservedRunningTime="2025-12-05 12:12:07.29940089 +0000 UTC m=+1411.792115623"
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.314192 4763 generic.go:334] "Generic (PLEG): container finished" podID="3a2c6ddd-063a-4531-9458-9de82a61d9ed" containerID="ee3c7990638dea49d81fd6e670f80b32d98bd160a63cb2edf36972cdb79ec714" exitCode=0
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.314269 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-zhrbr" event={"ID":"3a2c6ddd-063a-4531-9458-9de82a61d9ed","Type":"ContainerDied","Data":"ee3c7990638dea49d81fd6e670f80b32d98bd160a63cb2edf36972cdb79ec714"}
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.314295 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-zhrbr" event={"ID":"3a2c6ddd-063a-4531-9458-9de82a61d9ed","Type":"ContainerDied","Data":"fdba6952113c6c55be16e4919e6e7ca3dcc7b9d95e1e783c3e151e7634c2c508"}
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.314312 4763 scope.go:117] "RemoveContainer" containerID="ee3c7990638dea49d81fd6e670f80b32d98bd160a63cb2edf36972cdb79ec714"
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.314440 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-zhrbr"
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.340544 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d798ac3d-6708-49fb-8ba6-b635ee5b769e","Type":"ContainerStarted","Data":"bf80e0c1d206586970153514ee7afd2d95d3cb0b99665cd25d7cdfdfdd5d88e6"}
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.340693 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a2c6ddd-063a-4531-9458-9de82a61d9ed-ovsdbserver-nb\") pod \"3a2c6ddd-063a-4531-9458-9de82a61d9ed\" (UID: \"3a2c6ddd-063a-4531-9458-9de82a61d9ed\") "
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.340749 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d798ac3d-6708-49fb-8ba6-b635ee5b769e" containerName="ceilometer-central-agent" containerID="cri-o://a15d864cf9d136d492cde4713454e84c0aa675f56903093bfb03fb34d39d1fae" gracePeriod=30
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.340893 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a2c6ddd-063a-4531-9458-9de82a61d9ed-ovsdbserver-sb\") pod \"3a2c6ddd-063a-4531-9458-9de82a61d9ed\" (UID: \"3a2c6ddd-063a-4531-9458-9de82a61d9ed\") "
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.340984 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gcmc\" (UniqueName: \"kubernetes.io/projected/3a2c6ddd-063a-4531-9458-9de82a61d9ed-kube-api-access-9gcmc\") pod \"3a2c6ddd-063a-4531-9458-9de82a61d9ed\" (UID: \"3a2c6ddd-063a-4531-9458-9de82a61d9ed\") "
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.341060 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.341100 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d798ac3d-6708-49fb-8ba6-b635ee5b769e" containerName="proxy-httpd" containerID="cri-o://bf80e0c1d206586970153514ee7afd2d95d3cb0b99665cd25d7cdfdfdd5d88e6" gracePeriod=30
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.341141 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d798ac3d-6708-49fb-8ba6-b635ee5b769e" containerName="sg-core" containerID="cri-o://c0883ac1444536b8d3ba36a26cd33f9736d2e0ab69f56355ab790d1bc938b7e6" gracePeriod=30
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.341197 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d798ac3d-6708-49fb-8ba6-b635ee5b769e" containerName="ceilometer-notification-agent" containerID="cri-o://775d120302eee3ff5c2f5a7d2c6e17c000e8bf42b825ad98f65bf871de3abfb4" gracePeriod=30
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.341254 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a2c6ddd-063a-4531-9458-9de82a61d9ed-config\") pod \"3a2c6ddd-063a-4531-9458-9de82a61d9ed\" (UID: \"3a2c6ddd-063a-4531-9458-9de82a61d9ed\") "
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.341390 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a2c6ddd-063a-4531-9458-9de82a61d9ed-dns-svc\") pod \"3a2c6ddd-063a-4531-9458-9de82a61d9ed\" (UID: \"3a2c6ddd-063a-4531-9458-9de82a61d9ed\") "
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.341474 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3a2c6ddd-063a-4531-9458-9de82a61d9ed-dns-swift-storage-0\") pod \"3a2c6ddd-063a-4531-9458-9de82a61d9ed\" (UID: \"3a2c6ddd-063a-4531-9458-9de82a61d9ed\") "
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.344300 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.761719926 podStartE2EDuration="12.344275982s" podCreationTimestamp="2025-12-05 12:11:55 +0000 UTC" firstStartedPulling="2025-12-05 12:11:56.921868174 +0000 UTC m=+1401.414582897" lastFinishedPulling="2025-12-05 12:12:05.50442423 +0000 UTC m=+1409.997138953" observedRunningTime="2025-12-05 12:12:07.313980649 +0000 UTC m=+1411.806695372" watchObservedRunningTime="2025-12-05 12:12:07.344275982 +0000 UTC m=+1411.836990705"
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.377989 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a2c6ddd-063a-4531-9458-9de82a61d9ed-kube-api-access-9gcmc" (OuterVolumeSpecName: "kube-api-access-9gcmc") pod "3a2c6ddd-063a-4531-9458-9de82a61d9ed" (UID: "3a2c6ddd-063a-4531-9458-9de82a61d9ed"). InnerVolumeSpecName "kube-api-access-9gcmc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.378211 4763 scope.go:117] "RemoveContainer" containerID="039aa352c03484d5653a10b36bb34e31a038004cd48c8c1420d430a25a4a4d42"
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.418380 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cqhd5" event={"ID":"91392e07-ce32-4e73-a80f-17e4749ab9da","Type":"ContainerStarted","Data":"e32edefe87d0c177b988f4e2df10bb07577e1ba9442bc3f26167851efe960d2a"}
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.444570 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a2c6ddd-063a-4531-9458-9de82a61d9ed-config" (OuterVolumeSpecName: "config") pod "3a2c6ddd-063a-4531-9458-9de82a61d9ed" (UID: "3a2c6ddd-063a-4531-9458-9de82a61d9ed"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.451200 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a2c6ddd-063a-4531-9458-9de82a61d9ed-config\") on node \"crc\" DevicePath \"\""
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.451227 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gcmc\" (UniqueName: \"kubernetes.io/projected/3a2c6ddd-063a-4531-9458-9de82a61d9ed-kube-api-access-9gcmc\") on node \"crc\" DevicePath \"\""
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.470162 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"885001ef-6b9f-4d63-a690-fb3c78b7e037","Type":"ContainerStarted","Data":"91b6126c344015cc9d980d66f232168f4d4b17c930f3b0425a8b3020a4790b74"}
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.470335 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="885001ef-6b9f-4d63-a690-fb3c78b7e037" containerName="nova-scheduler-scheduler" containerID="cri-o://91b6126c344015cc9d980d66f232168f4d4b17c930f3b0425a8b3020a4790b74" gracePeriod=30
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.487033 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.649053477 podStartE2EDuration="12.487007807s" podCreationTimestamp="2025-12-05 12:11:55 +0000 UTC" firstStartedPulling="2025-12-05 12:11:56.641974313 +0000 UTC m=+1401.134689036" lastFinishedPulling="2025-12-05 12:12:05.479928643 +0000 UTC m=+1409.972643366" observedRunningTime="2025-12-05 12:12:07.363248241 +0000 UTC m=+1411.855962984" watchObservedRunningTime="2025-12-05 12:12:07.487007807 +0000 UTC m=+1411.979722540"
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.517727 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.198222715 podStartE2EDuration="13.517704793s" podCreationTimestamp="2025-12-05 12:11:54 +0000 UTC" firstStartedPulling="2025-12-05 12:11:56.19546795 +0000 UTC m=+1400.688182673" lastFinishedPulling="2025-12-05 12:12:05.514950028 +0000 UTC m=+1410.007664751" observedRunningTime="2025-12-05 12:12:07.458269854 +0000 UTC m=+1411.950984587" watchObservedRunningTime="2025-12-05 12:12:07.517704793 +0000 UTC m=+1412.010419526"
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.520814 4763 scope.go:117] "RemoveContainer" containerID="ee3c7990638dea49d81fd6e670f80b32d98bd160a63cb2edf36972cdb79ec714"
Dec 05 12:12:07 crc kubenswrapper[4763]: E1205 12:12:07.521268 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee3c7990638dea49d81fd6e670f80b32d98bd160a63cb2edf36972cdb79ec714\": container with ID starting with ee3c7990638dea49d81fd6e670f80b32d98bd160a63cb2edf36972cdb79ec714 not found: ID does not exist" containerID="ee3c7990638dea49d81fd6e670f80b32d98bd160a63cb2edf36972cdb79ec714"
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.521303 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee3c7990638dea49d81fd6e670f80b32d98bd160a63cb2edf36972cdb79ec714"} err="failed to get container status \"ee3c7990638dea49d81fd6e670f80b32d98bd160a63cb2edf36972cdb79ec714\": rpc error: code = NotFound desc = could not find container \"ee3c7990638dea49d81fd6e670f80b32d98bd160a63cb2edf36972cdb79ec714\": container with ID starting with ee3c7990638dea49d81fd6e670f80b32d98bd160a63cb2edf36972cdb79ec714 not found: ID does not exist"
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.521329 4763 scope.go:117] "RemoveContainer" containerID="039aa352c03484d5653a10b36bb34e31a038004cd48c8c1420d430a25a4a4d42"
Dec 05 12:12:07 crc kubenswrapper[4763]: E1205 12:12:07.521884 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"039aa352c03484d5653a10b36bb34e31a038004cd48c8c1420d430a25a4a4d42\": container with ID starting with 039aa352c03484d5653a10b36bb34e31a038004cd48c8c1420d430a25a4a4d42 not found: ID does not exist" containerID="039aa352c03484d5653a10b36bb34e31a038004cd48c8c1420d430a25a4a4d42"
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.521906 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"039aa352c03484d5653a10b36bb34e31a038004cd48c8c1420d430a25a4a4d42"} err="failed to get container status \"039aa352c03484d5653a10b36bb34e31a038004cd48c8c1420d430a25a4a4d42\": rpc error: code = NotFound desc = could not find container \"039aa352c03484d5653a10b36bb34e31a038004cd48c8c1420d430a25a4a4d42\": container with ID starting with 039aa352c03484d5653a10b36bb34e31a038004cd48c8c1420d430a25a4a4d42 not found: ID does not exist"
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.535461 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a2c6ddd-063a-4531-9458-9de82a61d9ed-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3a2c6ddd-063a-4531-9458-9de82a61d9ed" (UID: "3a2c6ddd-063a-4531-9458-9de82a61d9ed"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.542465 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a2c6ddd-063a-4531-9458-9de82a61d9ed-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3a2c6ddd-063a-4531-9458-9de82a61d9ed" (UID: "3a2c6ddd-063a-4531-9458-9de82a61d9ed"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.543700 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.543753 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.543910 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpgln"
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.544838 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5c8f5e57fa75e813c5cdc2f19d0235194d315983bfff446fbbe3434d7a817539"} pod="openshift-machine-config-operator/machine-config-daemon-xpgln" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.544917 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" containerID="cri-o://5c8f5e57fa75e813c5cdc2f19d0235194d315983bfff446fbbe3434d7a817539" gracePeriod=600
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.554296 4763 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3a2c6ddd-063a-4531-9458-9de82a61d9ed-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.554342 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a2c6ddd-063a-4531-9458-9de82a61d9ed-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.558624 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cqhd5" podStartSLOduration=4.9473235209999995 podStartE2EDuration="12.558580536s" podCreationTimestamp="2025-12-05 12:11:55 +0000 UTC" firstStartedPulling="2025-12-05 12:11:57.905070917 +0000 UTC m=+1402.397785640" lastFinishedPulling="2025-12-05 12:12:05.516327932 +0000 UTC m=+1410.009042655" observedRunningTime="2025-12-05 12:12:07.510425329 +0000 UTC m=+1412.003140062" watchObservedRunningTime="2025-12-05 12:12:07.558580536 +0000 UTC m=+1412.051295279"
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.563262 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a2c6ddd-063a-4531-9458-9de82a61d9ed-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3a2c6ddd-063a-4531-9458-9de82a61d9ed" (UID: "3a2c6ddd-063a-4531-9458-9de82a61d9ed"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.597258 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=4.718602733 podStartE2EDuration="12.597234448s" podCreationTimestamp="2025-12-05 12:11:55 +0000 UTC" firstStartedPulling="2025-12-05 12:11:57.635035892 +0000 UTC m=+1402.127750615" lastFinishedPulling="2025-12-05 12:12:05.513667607 +0000 UTC m=+1410.006382330" observedRunningTime="2025-12-05 12:12:07.55527328 +0000 UTC m=+1412.047988003" watchObservedRunningTime="2025-12-05 12:12:07.597234448 +0000 UTC m=+1412.089949171"
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.604463 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a2c6ddd-063a-4531-9458-9de82a61d9ed-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3a2c6ddd-063a-4531-9458-9de82a61d9ed" (UID: "3a2c6ddd-063a-4531-9458-9de82a61d9ed"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.656550 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a2c6ddd-063a-4531-9458-9de82a61d9ed-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.656602 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a2c6ddd-063a-4531-9458-9de82a61d9ed-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 05 12:12:07 crc kubenswrapper[4763]: I1205 12:12:07.996929 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-zhrbr"]
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.006740 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-zhrbr"]
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.109197 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.277222 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac9665c6-165e-463d-a82c-219519585c09-logs\") pod \"ac9665c6-165e-463d-a82c-219519585c09\" (UID: \"ac9665c6-165e-463d-a82c-219519585c09\") "
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.277457 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac9665c6-165e-463d-a82c-219519585c09-combined-ca-bundle\") pod \"ac9665c6-165e-463d-a82c-219519585c09\" (UID: \"ac9665c6-165e-463d-a82c-219519585c09\") "
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.277516 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9kch\" (UniqueName: \"kubernetes.io/projected/ac9665c6-165e-463d-a82c-219519585c09-kube-api-access-j9kch\") pod \"ac9665c6-165e-463d-a82c-219519585c09\" (UID: \"ac9665c6-165e-463d-a82c-219519585c09\") "
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.277602 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac9665c6-165e-463d-a82c-219519585c09-config-data\") pod \"ac9665c6-165e-463d-a82c-219519585c09\" (UID: \"ac9665c6-165e-463d-a82c-219519585c09\") "
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.280266 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac9665c6-165e-463d-a82c-219519585c09-logs" (OuterVolumeSpecName: "logs") pod "ac9665c6-165e-463d-a82c-219519585c09" (UID: "ac9665c6-165e-463d-a82c-219519585c09"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.292520 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac9665c6-165e-463d-a82c-219519585c09-kube-api-access-j9kch" (OuterVolumeSpecName: "kube-api-access-j9kch") pod "ac9665c6-165e-463d-a82c-219519585c09" (UID: "ac9665c6-165e-463d-a82c-219519585c09"). InnerVolumeSpecName "kube-api-access-j9kch". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.336648 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac9665c6-165e-463d-a82c-219519585c09-config-data" (OuterVolumeSpecName: "config-data") pod "ac9665c6-165e-463d-a82c-219519585c09" (UID: "ac9665c6-165e-463d-a82c-219519585c09"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.360696 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac9665c6-165e-463d-a82c-219519585c09-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac9665c6-165e-463d-a82c-219519585c09" (UID: "ac9665c6-165e-463d-a82c-219519585c09"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.381794 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac9665c6-165e-463d-a82c-219519585c09-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.381838 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9kch\" (UniqueName: \"kubernetes.io/projected/ac9665c6-165e-463d-a82c-219519585c09-kube-api-access-j9kch\") on node \"crc\" DevicePath \"\""
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.381853 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac9665c6-165e-463d-a82c-219519585c09-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.381864 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac9665c6-165e-463d-a82c-219519585c09-logs\") on node \"crc\" DevicePath \"\""
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.401550 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.460832 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.490034 4763 generic.go:334] "Generic (PLEG): container finished" podID="d6f44a99-a146-4222-9f16-ee6903e84462" containerID="a4df2f72dfd3068cca31d756399b5d8a1b73764fb30750a8968a55abf8c69c67" exitCode=0
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.490087 4763 generic.go:334] "Generic (PLEG): container finished" podID="d6f44a99-a146-4222-9f16-ee6903e84462" containerID="c9d8324306be45298634213edc3afa6e34da774f44e9dc80bf75566e1ef0dc72" exitCode=143
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.490155 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d6f44a99-a146-4222-9f16-ee6903e84462","Type":"ContainerDied","Data":"a4df2f72dfd3068cca31d756399b5d8a1b73764fb30750a8968a55abf8c69c67"}
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.490193 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d6f44a99-a146-4222-9f16-ee6903e84462","Type":"ContainerDied","Data":"c9d8324306be45298634213edc3afa6e34da774f44e9dc80bf75566e1ef0dc72"}
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.490212 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d6f44a99-a146-4222-9f16-ee6903e84462","Type":"ContainerDied","Data":"365e0c3c83c18fb225b79a967874009f46bf1c6a51cce3f434412a5649294b84"}
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.490238 4763 scope.go:117] "RemoveContainer" containerID="a4df2f72dfd3068cca31d756399b5d8a1b73764fb30750a8968a55abf8c69c67"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.490426 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.505015 4763 generic.go:334] "Generic (PLEG): container finished" podID="96338136-6831-49d0-9eb9-77d1205c6afb" containerID="5c8f5e57fa75e813c5cdc2f19d0235194d315983bfff446fbbe3434d7a817539" exitCode=0
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.505120 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerDied","Data":"5c8f5e57fa75e813c5cdc2f19d0235194d315983bfff446fbbe3434d7a817539"}
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.505148 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerStarted","Data":"94256af3aa58cdf57218e8ce9dc9bf5828d72b63abcb945a060e96b526de29bf"}
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.509300 4763 generic.go:334] "Generic (PLEG): container finished" podID="ac9665c6-165e-463d-a82c-219519585c09" containerID="76d03331525ae53c36fba6e4967d37ce848c1aca270c26127a3f06478cc85ac9" exitCode=0
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.509339 4763 generic.go:334] "Generic (PLEG): container finished" podID="ac9665c6-165e-463d-a82c-219519585c09" containerID="c448715aac5b93f79f80de6d05faca43cb0adb95535fedf5d0c2a3642261d137" exitCode=143
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.509397 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ac9665c6-165e-463d-a82c-219519585c09","Type":"ContainerDied","Data":"76d03331525ae53c36fba6e4967d37ce848c1aca270c26127a3f06478cc85ac9"}
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.509429 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ac9665c6-165e-463d-a82c-219519585c09","Type":"ContainerDied","Data":"c448715aac5b93f79f80de6d05faca43cb0adb95535fedf5d0c2a3642261d137"}
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.509443 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ac9665c6-165e-463d-a82c-219519585c09","Type":"ContainerDied","Data":"a8bd5f339eb36a32903aa4adf76f4f4688eff68967f9293a4af7f5e37cb0300e"}
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.509513 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.527639 4763 generic.go:334] "Generic (PLEG): container finished" podID="d798ac3d-6708-49fb-8ba6-b635ee5b769e" containerID="bf80e0c1d206586970153514ee7afd2d95d3cb0b99665cd25d7cdfdfdd5d88e6" exitCode=0
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.527698 4763 generic.go:334] "Generic (PLEG): container finished" podID="d798ac3d-6708-49fb-8ba6-b635ee5b769e" containerID="c0883ac1444536b8d3ba36a26cd33f9736d2e0ab69f56355ab790d1bc938b7e6" exitCode=2
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.527709 4763 generic.go:334] "Generic (PLEG): container finished" podID="d798ac3d-6708-49fb-8ba6-b635ee5b769e" containerID="775d120302eee3ff5c2f5a7d2c6e17c000e8bf42b825ad98f65bf871de3abfb4" exitCode=0
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.527754 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d798ac3d-6708-49fb-8ba6-b635ee5b769e","Type":"ContainerDied","Data":"bf80e0c1d206586970153514ee7afd2d95d3cb0b99665cd25d7cdfdfdd5d88e6"}
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.527864 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d798ac3d-6708-49fb-8ba6-b635ee5b769e","Type":"ContainerDied","Data":"c0883ac1444536b8d3ba36a26cd33f9736d2e0ab69f56355ab790d1bc938b7e6"}
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.527881 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d798ac3d-6708-49fb-8ba6-b635ee5b769e","Type":"ContainerDied","Data":"775d120302eee3ff5c2f5a7d2c6e17c000e8bf42b825ad98f65bf871de3abfb4"}
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.538322 4763 generic.go:334] "Generic (PLEG): container finished" podID="cfb41c12-dade-4239-b67b-742df1922c22" containerID="7741c6f62d667ab5fe3d8ba5897f2b55af40da80a5676c60121c34d0a2d911bf" exitCode=0
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.538378 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"cfb41c12-dade-4239-b67b-742df1922c22","Type":"ContainerDied","Data":"7741c6f62d667ab5fe3d8ba5897f2b55af40da80a5676c60121c34d0a2d911bf"}
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.538410 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"cfb41c12-dade-4239-b67b-742df1922c22","Type":"ContainerDied","Data":"5f2d1cc02251f6c073ea3a85e88de9e769b4462572283892ec149e09e78273f8"}
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.538476 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.585093 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb41c12-dade-4239-b67b-742df1922c22-combined-ca-bundle\") pod \"cfb41c12-dade-4239-b67b-742df1922c22\" (UID: \"cfb41c12-dade-4239-b67b-742df1922c22\") "
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.585178 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6f44a99-a146-4222-9f16-ee6903e84462-config-data\") pod \"d6f44a99-a146-4222-9f16-ee6903e84462\" (UID: \"d6f44a99-a146-4222-9f16-ee6903e84462\") "
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.585233 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjll4\" (UniqueName: \"kubernetes.io/projected/cfb41c12-dade-4239-b67b-742df1922c22-kube-api-access-xjll4\") pod \"cfb41c12-dade-4239-b67b-742df1922c22\" (UID: \"cfb41c12-dade-4239-b67b-742df1922c22\") "
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.585263 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb41c12-dade-4239-b67b-742df1922c22-config-data\") pod \"cfb41c12-dade-4239-b67b-742df1922c22\" (UID: \"cfb41c12-dade-4239-b67b-742df1922c22\") "
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.585371 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7f9cw\" (UniqueName: \"kubernetes.io/projected/d6f44a99-a146-4222-9f16-ee6903e84462-kube-api-access-7f9cw\") pod \"d6f44a99-a146-4222-9f16-ee6903e84462\" (UID: \"d6f44a99-a146-4222-9f16-ee6903e84462\") "
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.585397 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6f44a99-a146-4222-9f16-ee6903e84462-logs\") pod \"d6f44a99-a146-4222-9f16-ee6903e84462\" (UID: \"d6f44a99-a146-4222-9f16-ee6903e84462\") "
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.585438 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6f44a99-a146-4222-9f16-ee6903e84462-combined-ca-bundle\") pod \"d6f44a99-a146-4222-9f16-ee6903e84462\" (UID: \"d6f44a99-a146-4222-9f16-ee6903e84462\") "
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.587298 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6f44a99-a146-4222-9f16-ee6903e84462-logs" (OuterVolumeSpecName: "logs") pod "d6f44a99-a146-4222-9f16-ee6903e84462" (UID: "d6f44a99-a146-4222-9f16-ee6903e84462"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.628903 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.631996 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfb41c12-dade-4239-b67b-742df1922c22-kube-api-access-xjll4" (OuterVolumeSpecName: "kube-api-access-xjll4") pod "cfb41c12-dade-4239-b67b-742df1922c22" (UID: "cfb41c12-dade-4239-b67b-742df1922c22"). InnerVolumeSpecName "kube-api-access-xjll4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.641805 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.643990 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6f44a99-a146-4222-9f16-ee6903e84462-kube-api-access-7f9cw" (OuterVolumeSpecName: "kube-api-access-7f9cw") pod "d6f44a99-a146-4222-9f16-ee6903e84462" (UID: "d6f44a99-a146-4222-9f16-ee6903e84462"). InnerVolumeSpecName "kube-api-access-7f9cw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.663855 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6f44a99-a146-4222-9f16-ee6903e84462-config-data" (OuterVolumeSpecName: "config-data") pod "d6f44a99-a146-4222-9f16-ee6903e84462" (UID: "d6f44a99-a146-4222-9f16-ee6903e84462"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.685556 4763 scope.go:117] "RemoveContainer" containerID="c9d8324306be45298634213edc3afa6e34da774f44e9dc80bf75566e1ef0dc72"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.687388 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjll4\" (UniqueName: \"kubernetes.io/projected/cfb41c12-dade-4239-b67b-742df1922c22-kube-api-access-xjll4\") on node \"crc\" DevicePath \"\""
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.687412 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7f9cw\" (UniqueName: \"kubernetes.io/projected/d6f44a99-a146-4222-9f16-ee6903e84462-kube-api-access-7f9cw\") on node \"crc\" DevicePath \"\""
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.687424 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6f44a99-a146-4222-9f16-ee6903e84462-logs\") on node \"crc\" DevicePath \"\""
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.687437 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6f44a99-a146-4222-9f16-ee6903e84462-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.730705 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Dec 05 12:12:08 crc kubenswrapper[4763]: E1205 12:12:08.731239 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6f44a99-a146-4222-9f16-ee6903e84462" containerName="nova-metadata-log"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.731267 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6f44a99-a146-4222-9f16-ee6903e84462" containerName="nova-metadata-log"
Dec 05 12:12:08 crc kubenswrapper[4763]: E1205 12:12:08.731286 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6f44a99-a146-4222-9f16-ee6903e84462" containerName="nova-metadata-metadata"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.731294 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6f44a99-a146-4222-9f16-ee6903e84462" containerName="nova-metadata-metadata"
Dec 05 12:12:08 crc kubenswrapper[4763]: E1205 12:12:08.731312 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a2c6ddd-063a-4531-9458-9de82a61d9ed" containerName="init"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.731321 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a2c6ddd-063a-4531-9458-9de82a61d9ed" containerName="init"
Dec 05 12:12:08 crc kubenswrapper[4763]: E1205 12:12:08.731340 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac9665c6-165e-463d-a82c-219519585c09" containerName="nova-api-api"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.731348 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac9665c6-165e-463d-a82c-219519585c09" containerName="nova-api-api"
Dec 05 12:12:08 crc kubenswrapper[4763]: E1205 12:12:08.731372 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfb41c12-dade-4239-b67b-742df1922c22" containerName="nova-cell0-conductor-conductor"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.731381 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfb41c12-dade-4239-b67b-742df1922c22" containerName="nova-cell0-conductor-conductor"
Dec 05 12:12:08 crc kubenswrapper[4763]: E1205 12:12:08.731393 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac9665c6-165e-463d-a82c-219519585c09" containerName="nova-api-log"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.731400 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac9665c6-165e-463d-a82c-219519585c09" containerName="nova-api-log"
Dec 05 12:12:08 crc kubenswrapper[4763]: E1205 12:12:08.731428 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a2c6ddd-063a-4531-9458-9de82a61d9ed" containerName="dnsmasq-dns"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.731435 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a2c6ddd-063a-4531-9458-9de82a61d9ed" containerName="dnsmasq-dns"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.731670 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a2c6ddd-063a-4531-9458-9de82a61d9ed" containerName="dnsmasq-dns"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.731699 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6f44a99-a146-4222-9f16-ee6903e84462" containerName="nova-metadata-log"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.731713 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6f44a99-a146-4222-9f16-ee6903e84462" containerName="nova-metadata-metadata"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.731724 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac9665c6-165e-463d-a82c-219519585c09" containerName="nova-api-log"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.731739 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfb41c12-dade-4239-b67b-742df1922c22" containerName="nova-cell0-conductor-conductor"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.731752 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac9665c6-165e-463d-a82c-219519585c09" containerName="nova-api-api"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.733011 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfb41c12-dade-4239-b67b-742df1922c22-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cfb41c12-dade-4239-b67b-742df1922c22" (UID: "cfb41c12-dade-4239-b67b-742df1922c22"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.733162 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.738556 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.754749 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.756737 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6f44a99-a146-4222-9f16-ee6903e84462-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6f44a99-a146-4222-9f16-ee6903e84462" (UID: "d6f44a99-a146-4222-9f16-ee6903e84462"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.769640 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfb41c12-dade-4239-b67b-742df1922c22-config-data" (OuterVolumeSpecName: "config-data") pod "cfb41c12-dade-4239-b67b-742df1922c22" (UID: "cfb41c12-dade-4239-b67b-742df1922c22"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.789636 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6f44a99-a146-4222-9f16-ee6903e84462-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.789663 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb41c12-dade-4239-b67b-742df1922c22-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.789673 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb41c12-dade-4239-b67b-742df1922c22-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.846451 4763 scope.go:117] "RemoveContainer" containerID="a4df2f72dfd3068cca31d756399b5d8a1b73764fb30750a8968a55abf8c69c67"
Dec 05 12:12:08 crc kubenswrapper[4763]: E1205 12:12:08.851845 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4df2f72dfd3068cca31d756399b5d8a1b73764fb30750a8968a55abf8c69c67\": container with ID starting with a4df2f72dfd3068cca31d756399b5d8a1b73764fb30750a8968a55abf8c69c67 not found: ID does not exist" containerID="a4df2f72dfd3068cca31d756399b5d8a1b73764fb30750a8968a55abf8c69c67"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.851886 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4df2f72dfd3068cca31d756399b5d8a1b73764fb30750a8968a55abf8c69c67"} err="failed to get container status \"a4df2f72dfd3068cca31d756399b5d8a1b73764fb30750a8968a55abf8c69c67\": rpc error: code = NotFound desc = could not find container \"a4df2f72dfd3068cca31d756399b5d8a1b73764fb30750a8968a55abf8c69c67\": container with ID starting with a4df2f72dfd3068cca31d756399b5d8a1b73764fb30750a8968a55abf8c69c67 not found: ID does not exist"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.851910 4763 scope.go:117] "RemoveContainer" containerID="c9d8324306be45298634213edc3afa6e34da774f44e9dc80bf75566e1ef0dc72"
Dec 05 12:12:08 crc kubenswrapper[4763]: E1205 12:12:08.853350 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9d8324306be45298634213edc3afa6e34da774f44e9dc80bf75566e1ef0dc72\": container with ID starting with c9d8324306be45298634213edc3afa6e34da774f44e9dc80bf75566e1ef0dc72 not found: ID does not exist" containerID="c9d8324306be45298634213edc3afa6e34da774f44e9dc80bf75566e1ef0dc72"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.853379 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9d8324306be45298634213edc3afa6e34da774f44e9dc80bf75566e1ef0dc72"} err="failed to get container status \"c9d8324306be45298634213edc3afa6e34da774f44e9dc80bf75566e1ef0dc72\": rpc error: code = NotFound desc = could not find container \"c9d8324306be45298634213edc3afa6e34da774f44e9dc80bf75566e1ef0dc72\": container with ID starting with c9d8324306be45298634213edc3afa6e34da774f44e9dc80bf75566e1ef0dc72 not found: ID does not exist"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.853399 4763 scope.go:117] "RemoveContainer" containerID="a4df2f72dfd3068cca31d756399b5d8a1b73764fb30750a8968a55abf8c69c67"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.856018 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4df2f72dfd3068cca31d756399b5d8a1b73764fb30750a8968a55abf8c69c67"} err="failed to get container status \"a4df2f72dfd3068cca31d756399b5d8a1b73764fb30750a8968a55abf8c69c67\": rpc error: code = NotFound desc = could not find container \"a4df2f72dfd3068cca31d756399b5d8a1b73764fb30750a8968a55abf8c69c67\": container with ID starting with a4df2f72dfd3068cca31d756399b5d8a1b73764fb30750a8968a55abf8c69c67 not found: ID does not exist"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.856056 4763 scope.go:117] "RemoveContainer" containerID="c9d8324306be45298634213edc3afa6e34da774f44e9dc80bf75566e1ef0dc72"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.858981 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9d8324306be45298634213edc3afa6e34da774f44e9dc80bf75566e1ef0dc72"} err="failed to get container status \"c9d8324306be45298634213edc3afa6e34da774f44e9dc80bf75566e1ef0dc72\": rpc error: code = NotFound desc = could not find container \"c9d8324306be45298634213edc3afa6e34da774f44e9dc80bf75566e1ef0dc72\": container with ID starting with c9d8324306be45298634213edc3afa6e34da774f44e9dc80bf75566e1ef0dc72 not found: ID does not exist"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.859009 4763 scope.go:117] "RemoveContainer" containerID="6e6e3cfeab8af452b7eac351a2125ef9c911ea4fcd52b1f8631b40c9322e72b2"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.885910 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.891730 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63404240-1608-4923-84ce-48d6686c2f2f-logs\") pod \"nova-api-0\" (UID: \"63404240-1608-4923-84ce-48d6686c2f2f\") " pod="openstack/nova-api-0"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.891927 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63404240-1608-4923-84ce-48d6686c2f2f-config-data\") pod \"nova-api-0\" (UID: \"63404240-1608-4923-84ce-48d6686c2f2f\") " pod="openstack/nova-api-0"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.892908 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63404240-1608-4923-84ce-48d6686c2f2f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"63404240-1608-4923-84ce-48d6686c2f2f\") " pod="openstack/nova-api-0"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.892950 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km8z8\" (UniqueName: \"kubernetes.io/projected/63404240-1608-4923-84ce-48d6686c2f2f-kube-api-access-km8z8\") pod \"nova-api-0\" (UID: \"63404240-1608-4923-84ce-48d6686c2f2f\") " pod="openstack/nova-api-0"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.902650 4763 scope.go:117] "RemoveContainer" containerID="76d03331525ae53c36fba6e4967d37ce848c1aca270c26127a3f06478cc85ac9"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.917850 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.945055 4763 scope.go:117] "RemoveContainer" containerID="c448715aac5b93f79f80de6d05faca43cb0adb95535fedf5d0c2a3642261d137"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.978949 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.984730 4763 scope.go:117] "RemoveContainer" containerID="76d03331525ae53c36fba6e4967d37ce848c1aca270c26127a3f06478cc85ac9"
Dec 05 12:12:08 crc kubenswrapper[4763]: E1205 12:12:08.985191 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76d03331525ae53c36fba6e4967d37ce848c1aca270c26127a3f06478cc85ac9\": container with ID starting with 76d03331525ae53c36fba6e4967d37ce848c1aca270c26127a3f06478cc85ac9 not found: ID does not exist" containerID="76d03331525ae53c36fba6e4967d37ce848c1aca270c26127a3f06478cc85ac9"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.985219 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76d03331525ae53c36fba6e4967d37ce848c1aca270c26127a3f06478cc85ac9"} err="failed to get container status \"76d03331525ae53c36fba6e4967d37ce848c1aca270c26127a3f06478cc85ac9\": rpc error: code = NotFound desc = could not find container \"76d03331525ae53c36fba6e4967d37ce848c1aca270c26127a3f06478cc85ac9\": container with ID starting with 76d03331525ae53c36fba6e4967d37ce848c1aca270c26127a3f06478cc85ac9 not found: ID does not exist"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.985237 4763 scope.go:117] "RemoveContainer" containerID="c448715aac5b93f79f80de6d05faca43cb0adb95535fedf5d0c2a3642261d137"
Dec 05 12:12:08 crc kubenswrapper[4763]: E1205 12:12:08.985462 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c448715aac5b93f79f80de6d05faca43cb0adb95535fedf5d0c2a3642261d137\": container with ID starting with c448715aac5b93f79f80de6d05faca43cb0adb95535fedf5d0c2a3642261d137 not found: ID does not exist" containerID="c448715aac5b93f79f80de6d05faca43cb0adb95535fedf5d0c2a3642261d137"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.985476 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c448715aac5b93f79f80de6d05faca43cb0adb95535fedf5d0c2a3642261d137"} err="failed to get container status \"c448715aac5b93f79f80de6d05faca43cb0adb95535fedf5d0c2a3642261d137\": rpc error: code = NotFound desc = could not find container \"c448715aac5b93f79f80de6d05faca43cb0adb95535fedf5d0c2a3642261d137\": container with ID starting with c448715aac5b93f79f80de6d05faca43cb0adb95535fedf5d0c2a3642261d137 not found: ID does not exist"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.985487 4763 scope.go:117] "RemoveContainer" containerID="76d03331525ae53c36fba6e4967d37ce848c1aca270c26127a3f06478cc85ac9"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.985730 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76d03331525ae53c36fba6e4967d37ce848c1aca270c26127a3f06478cc85ac9"} err="failed to get container status \"76d03331525ae53c36fba6e4967d37ce848c1aca270c26127a3f06478cc85ac9\": rpc error: code = NotFound desc = could not find container \"76d03331525ae53c36fba6e4967d37ce848c1aca270c26127a3f06478cc85ac9\": container with ID starting with 76d03331525ae53c36fba6e4967d37ce848c1aca270c26127a3f06478cc85ac9 not found: ID does not exist"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.985749 4763 scope.go:117] "RemoveContainer" containerID="c448715aac5b93f79f80de6d05faca43cb0adb95535fedf5d0c2a3642261d137"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.985943 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c448715aac5b93f79f80de6d05faca43cb0adb95535fedf5d0c2a3642261d137"} err="failed to get container status \"c448715aac5b93f79f80de6d05faca43cb0adb95535fedf5d0c2a3642261d137\": rpc error: code = NotFound desc = could not find container \"c448715aac5b93f79f80de6d05faca43cb0adb95535fedf5d0c2a3642261d137\": container with ID starting with c448715aac5b93f79f80de6d05faca43cb0adb95535fedf5d0c2a3642261d137 not found: ID does not exist"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.985959 4763 scope.go:117] "RemoveContainer" containerID="7741c6f62d667ab5fe3d8ba5897f2b55af40da80a5676c60121c34d0a2d911bf"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.994070 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63404240-1608-4923-84ce-48d6686c2f2f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"63404240-1608-4923-84ce-48d6686c2f2f\") " pod="openstack/nova-api-0"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.994105 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km8z8\" (UniqueName: \"kubernetes.io/projected/63404240-1608-4923-84ce-48d6686c2f2f-kube-api-access-km8z8\") pod \"nova-api-0\" (UID: \"63404240-1608-4923-84ce-48d6686c2f2f\") " pod="openstack/nova-api-0"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.994173 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63404240-1608-4923-84ce-48d6686c2f2f-logs\") pod \"nova-api-0\" (UID: \"63404240-1608-4923-84ce-48d6686c2f2f\") " pod="openstack/nova-api-0"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.994193 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63404240-1608-4923-84ce-48d6686c2f2f-config-data\") pod \"nova-api-0\" (UID: \"63404240-1608-4923-84ce-48d6686c2f2f\") " pod="openstack/nova-api-0"
Dec 05 12:12:08 crc kubenswrapper[4763]: I1205 12:12:08.995495 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63404240-1608-4923-84ce-48d6686c2f2f-logs\") pod \"nova-api-0\" (UID: \"63404240-1608-4923-84ce-48d6686c2f2f\") " pod="openstack/nova-api-0"
Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.003422 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63404240-1608-4923-84ce-48d6686c2f2f-config-data\") pod \"nova-api-0\" (UID: \"63404240-1608-4923-84ce-48d6686c2f2f\") " pod="openstack/nova-api-0"
Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.007407 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63404240-1608-4923-84ce-48d6686c2f2f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"63404240-1608-4923-84ce-48d6686c2f2f\") " pod="openstack/nova-api-0"
Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.016963 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km8z8\" (UniqueName: \"kubernetes.io/projected/63404240-1608-4923-84ce-48d6686c2f2f-kube-api-access-km8z8\") pod \"nova-api-0\" (UID: \"63404240-1608-4923-84ce-48d6686c2f2f\") " pod="openstack/nova-api-0"
Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.035311 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.036699 4763 scope.go:117] "RemoveContainer" containerID="7741c6f62d667ab5fe3d8ba5897f2b55af40da80a5676c60121c34d0a2d911bf"
Dec 05 12:12:09 crc kubenswrapper[4763]: E1205 12:12:09.038149 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7741c6f62d667ab5fe3d8ba5897f2b55af40da80a5676c60121c34d0a2d911bf\": container with ID starting with 7741c6f62d667ab5fe3d8ba5897f2b55af40da80a5676c60121c34d0a2d911bf not found: ID does not exist" containerID="7741c6f62d667ab5fe3d8ba5897f2b55af40da80a5676c60121c34d0a2d911bf"
Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.038179 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7741c6f62d667ab5fe3d8ba5897f2b55af40da80a5676c60121c34d0a2d911bf"} err="failed to get container status \"7741c6f62d667ab5fe3d8ba5897f2b55af40da80a5676c60121c34d0a2d911bf\": rpc error: code = NotFound desc = could not find container \"7741c6f62d667ab5fe3d8ba5897f2b55af40da80a5676c60121c34d0a2d911bf\": container with ID starting with 7741c6f62d667ab5fe3d8ba5897f2b55af40da80a5676c60121c34d0a2d911bf not found: ID does not exist"
Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.044255 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.046055 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.048497 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.048650 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.066561 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.075835 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.077621 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.080098 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.081918 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.151306 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.197781 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfk2d\" (UniqueName: \"kubernetes.io/projected/4319bf0f-65c6-401b-96dd-53e10a73c011-kube-api-access-rfk2d\") pod \"nova-cell0-conductor-0\" (UID: \"4319bf0f-65c6-401b-96dd-53e10a73c011\") " pod="openstack/nova-cell0-conductor-0"
Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.197939 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4319bf0f-65c6-401b-96dd-53e10a73c011-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4319bf0f-65c6-401b-96dd-53e10a73c011\") " pod="openstack/nova-cell0-conductor-0"
Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.197974 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6552af9b-be77-4433-b5b1-023b3f13ca28-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6552af9b-be77-4433-b5b1-023b3f13ca28\") " pod="openstack/nova-metadata-0"
Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.198021 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4319bf0f-65c6-401b-96dd-53e10a73c011-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4319bf0f-65c6-401b-96dd-53e10a73c011\") " pod="openstack/nova-cell0-conductor-0"
Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.198049 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6552af9b-be77-4433-b5b1-023b3f13ca28-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6552af9b-be77-4433-b5b1-023b3f13ca28\") " pod="openstack/nova-metadata-0"
Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.198095 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kks2m\" (UniqueName: \"kubernetes.io/projected/6552af9b-be77-4433-b5b1-023b3f13ca28-kube-api-access-kks2m\") pod \"nova-metadata-0\" (UID: \"6552af9b-be77-4433-b5b1-023b3f13ca28\") " pod="openstack/nova-metadata-0"
Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.198170 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6552af9b-be77-4433-b5b1-023b3f13ca28-logs\") pod \"nova-metadata-0\" (UID: \"6552af9b-be77-4433-b5b1-023b3f13ca28\") " pod="openstack/nova-metadata-0"
Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.198214 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6552af9b-be77-4433-b5b1-023b3f13ca28-config-data\") pod \"nova-metadata-0\" (UID: \"6552af9b-be77-4433-b5b1-023b3f13ca28\") " pod="openstack/nova-metadata-0"
Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.302272 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfk2d\" (UniqueName: \"kubernetes.io/projected/4319bf0f-65c6-401b-96dd-53e10a73c011-kube-api-access-rfk2d\") pod \"nova-cell0-conductor-0\" (UID: \"4319bf0f-65c6-401b-96dd-53e10a73c011\") " pod="openstack/nova-cell0-conductor-0"
Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.302392 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4319bf0f-65c6-401b-96dd-53e10a73c011-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4319bf0f-65c6-401b-96dd-53e10a73c011\") " pod="openstack/nova-cell0-conductor-0"
Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.302416 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6552af9b-be77-4433-b5b1-023b3f13ca28-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6552af9b-be77-4433-b5b1-023b3f13ca28\") " pod="openstack/nova-metadata-0"
Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.302444 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4319bf0f-65c6-401b-96dd-53e10a73c011-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4319bf0f-65c6-401b-96dd-53e10a73c011\") " pod="openstack/nova-cell0-conductor-0"
Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.302464 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6552af9b-be77-4433-b5b1-023b3f13ca28-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6552af9b-be77-4433-b5b1-023b3f13ca28\") " pod="openstack/nova-metadata-0"
Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.302491 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kks2m\" (UniqueName: \"kubernetes.io/projected/6552af9b-be77-4433-b5b1-023b3f13ca28-kube-api-access-kks2m\") pod \"nova-metadata-0\" (UID: \"6552af9b-be77-4433-b5b1-023b3f13ca28\") " pod="openstack/nova-metadata-0"
Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.302536 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6552af9b-be77-4433-b5b1-023b3f13ca28-logs\") pod \"nova-metadata-0\" (UID: \"6552af9b-be77-4433-b5b1-023b3f13ca28\") "
pod="openstack/nova-metadata-0" Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.304976 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6552af9b-be77-4433-b5b1-023b3f13ca28-config-data\") pod \"nova-metadata-0\" (UID: \"6552af9b-be77-4433-b5b1-023b3f13ca28\") " pod="openstack/nova-metadata-0" Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.306158 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6552af9b-be77-4433-b5b1-023b3f13ca28-logs\") pod \"nova-metadata-0\" (UID: \"6552af9b-be77-4433-b5b1-023b3f13ca28\") " pod="openstack/nova-metadata-0" Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.310258 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6552af9b-be77-4433-b5b1-023b3f13ca28-config-data\") pod \"nova-metadata-0\" (UID: \"6552af9b-be77-4433-b5b1-023b3f13ca28\") " pod="openstack/nova-metadata-0" Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.317512 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4319bf0f-65c6-401b-96dd-53e10a73c011-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4319bf0f-65c6-401b-96dd-53e10a73c011\") " pod="openstack/nova-cell0-conductor-0" Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.322088 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6552af9b-be77-4433-b5b1-023b3f13ca28-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6552af9b-be77-4433-b5b1-023b3f13ca28\") " pod="openstack/nova-metadata-0" Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.322724 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4319bf0f-65c6-401b-96dd-53e10a73c011-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4319bf0f-65c6-401b-96dd-53e10a73c011\") " pod="openstack/nova-cell0-conductor-0" Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.323088 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6552af9b-be77-4433-b5b1-023b3f13ca28-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6552af9b-be77-4433-b5b1-023b3f13ca28\") " pod="openstack/nova-metadata-0" Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.330448 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfk2d\" (UniqueName: \"kubernetes.io/projected/4319bf0f-65c6-401b-96dd-53e10a73c011-kube-api-access-rfk2d\") pod \"nova-cell0-conductor-0\" (UID: \"4319bf0f-65c6-401b-96dd-53e10a73c011\") " pod="openstack/nova-cell0-conductor-0" Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.363518 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kks2m\" (UniqueName: \"kubernetes.io/projected/6552af9b-be77-4433-b5b1-023b3f13ca28-kube-api-access-kks2m\") pod \"nova-metadata-0\" (UID: \"6552af9b-be77-4433-b5b1-023b3f13ca28\") " pod="openstack/nova-metadata-0" Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.367973 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.405722 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.633374 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.830026 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a2c6ddd-063a-4531-9458-9de82a61d9ed" path="/var/lib/kubelet/pods/3a2c6ddd-063a-4531-9458-9de82a61d9ed/volumes" Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.832671 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac9665c6-165e-463d-a82c-219519585c09" path="/var/lib/kubelet/pods/ac9665c6-165e-463d-a82c-219519585c09/volumes" Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.836260 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfb41c12-dade-4239-b67b-742df1922c22" path="/var/lib/kubelet/pods/cfb41c12-dade-4239-b67b-742df1922c22/volumes" Dec 05 12:12:09 crc kubenswrapper[4763]: I1205 12:12:09.839618 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6f44a99-a146-4222-9f16-ee6903e84462" path="/var/lib/kubelet/pods/d6f44a99-a146-4222-9f16-ee6903e84462/volumes" Dec 05 12:12:10 crc kubenswrapper[4763]: I1205 12:12:10.257965 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 12:12:10 crc kubenswrapper[4763]: I1205 12:12:10.369768 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 12:12:10 crc kubenswrapper[4763]: I1205 12:12:10.664869 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6552af9b-be77-4433-b5b1-023b3f13ca28","Type":"ContainerStarted","Data":"f8b8d9db9733148167c27c0e8f4d4b296c012c3671c23b675a2d8e14d739f273"} Dec 05 12:12:10 crc kubenswrapper[4763]: I1205 12:12:10.678256 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"63404240-1608-4923-84ce-48d6686c2f2f","Type":"ContainerStarted","Data":"659abd9031db0295b2f733bde346ceceaa5b96670cba55f261a8d9167a837969"} Dec 05 12:12:10 crc kubenswrapper[4763]: I1205 12:12:10.678299 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"63404240-1608-4923-84ce-48d6686c2f2f","Type":"ContainerStarted","Data":"21fdb623b96907e9f782b5715ebe1e9932223bcaf33ab8cfb75a4a8e93e322c1"} Dec 05 12:12:10 crc kubenswrapper[4763]: I1205 12:12:10.679772 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4319bf0f-65c6-401b-96dd-53e10a73c011","Type":"ContainerStarted","Data":"97868eea85e480395489d0e3015ec3b0da1e2bea879b13157df720e8c8f16b91"} Dec 05 12:12:11 crc kubenswrapper[4763]: I1205 12:12:11.065457 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 05 12:12:11 crc kubenswrapper[4763]: I1205 12:12:11.078727 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 05 12:12:11 crc kubenswrapper[4763]: I1205 12:12:11.692528 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6552af9b-be77-4433-b5b1-023b3f13ca28","Type":"ContainerStarted","Data":"dedb105b7d3e7a64015a99a176c96ae0c2bcd4db58d01be988bb393c3b5c4f22"} Dec 05 12:12:11 crc 
kubenswrapper[4763]: I1205 12:12:11.692593 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6552af9b-be77-4433-b5b1-023b3f13ca28","Type":"ContainerStarted","Data":"cda7895ca37b96582922ae39cb62499ece34806a2e14872549911232c0a08fbe"} Dec 05 12:12:11 crc kubenswrapper[4763]: I1205 12:12:11.694525 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"63404240-1608-4923-84ce-48d6686c2f2f","Type":"ContainerStarted","Data":"284e44420ca6a4ce40b83f02fb4713c58599921194aa9130fa02ac3dbad4a921"} Dec 05 12:12:11 crc kubenswrapper[4763]: I1205 12:12:11.696647 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4319bf0f-65c6-401b-96dd-53e10a73c011","Type":"ContainerStarted","Data":"b267286671292212fa17e49d2b43cbd2e42d7590ededa233eba2fdf78351e4e1"} Dec 05 12:12:11 crc kubenswrapper[4763]: I1205 12:12:11.696862 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 05 12:12:11 crc kubenswrapper[4763]: I1205 12:12:11.719107 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.719089303 podStartE2EDuration="3.719089303s" podCreationTimestamp="2025-12-05 12:12:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:12:11.712033727 +0000 UTC m=+1416.204748440" watchObservedRunningTime="2025-12-05 12:12:11.719089303 +0000 UTC m=+1416.211804026" Dec 05 12:12:11 crc kubenswrapper[4763]: I1205 12:12:11.741393 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=3.741371719 podStartE2EDuration="3.741371719s" podCreationTimestamp="2025-12-05 12:12:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:12:11.737555896 +0000 UTC m=+1416.230270619" watchObservedRunningTime="2025-12-05 12:12:11.741371719 +0000 UTC m=+1416.234086442" Dec 05 12:12:11 crc kubenswrapper[4763]: I1205 12:12:11.769974 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.769943947 podStartE2EDuration="3.769943947s" podCreationTimestamp="2025-12-05 12:12:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:12:11.757101164 +0000 UTC m=+1416.249815897" watchObservedRunningTime="2025-12-05 12:12:11.769943947 +0000 UTC m=+1416.262658670" Dec 05 12:12:12 crc kubenswrapper[4763]: I1205 12:12:12.708659 4763 generic.go:334] "Generic (PLEG): container finished" podID="4f5732ce-fb71-4847-a977-763074d671f6" containerID="8afaf76e12309847b9eb9ad71c6b9943189539680889d714ecd20a188921ccf8" exitCode=0 Dec 05 12:12:12 crc kubenswrapper[4763]: I1205 12:12:12.709908 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wxg7t" event={"ID":"4f5732ce-fb71-4847-a977-763074d671f6","Type":"ContainerDied","Data":"8afaf76e12309847b9eb9ad71c6b9943189539680889d714ecd20a188921ccf8"} Dec 05 12:12:13 crc kubenswrapper[4763]: I1205 12:12:13.724903 4763 generic.go:334] "Generic (PLEG): container finished" podID="57820ba4-dfb8-40a5-be44-26fc6fe01967" containerID="93c63cdbc34e09e7e9ddd7320caa2e3d56e27ba691af3b7e735782f7c54bbb57" 
exitCode=0 Dec 05 12:12:13 crc kubenswrapper[4763]: I1205 12:12:13.725013 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-h96f9" event={"ID":"57820ba4-dfb8-40a5-be44-26fc6fe01967","Type":"ContainerDied","Data":"93c63cdbc34e09e7e9ddd7320caa2e3d56e27ba691af3b7e735782f7c54bbb57"} Dec 05 12:12:14 crc kubenswrapper[4763]: I1205 12:12:14.160732 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wxg7t" Dec 05 12:12:14 crc kubenswrapper[4763]: I1205 12:12:14.240873 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f5732ce-fb71-4847-a977-763074d671f6-scripts\") pod \"4f5732ce-fb71-4847-a977-763074d671f6\" (UID: \"4f5732ce-fb71-4847-a977-763074d671f6\") " Dec 05 12:12:14 crc kubenswrapper[4763]: I1205 12:12:14.241003 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f5732ce-fb71-4847-a977-763074d671f6-combined-ca-bundle\") pod \"4f5732ce-fb71-4847-a977-763074d671f6\" (UID: \"4f5732ce-fb71-4847-a977-763074d671f6\") " Dec 05 12:12:14 crc kubenswrapper[4763]: I1205 12:12:14.241096 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f5732ce-fb71-4847-a977-763074d671f6-config-data\") pod \"4f5732ce-fb71-4847-a977-763074d671f6\" (UID: \"4f5732ce-fb71-4847-a977-763074d671f6\") " Dec 05 12:12:14 crc kubenswrapper[4763]: I1205 12:12:14.241258 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5nhh\" (UniqueName: \"kubernetes.io/projected/4f5732ce-fb71-4847-a977-763074d671f6-kube-api-access-g5nhh\") pod \"4f5732ce-fb71-4847-a977-763074d671f6\" (UID: \"4f5732ce-fb71-4847-a977-763074d671f6\") " Dec 05 12:12:14 crc kubenswrapper[4763]: I1205 12:12:14.251326 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f5732ce-fb71-4847-a977-763074d671f6-scripts" (OuterVolumeSpecName: "scripts") pod "4f5732ce-fb71-4847-a977-763074d671f6" (UID: "4f5732ce-fb71-4847-a977-763074d671f6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:12:14 crc kubenswrapper[4763]: I1205 12:12:14.253940 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f5732ce-fb71-4847-a977-763074d671f6-kube-api-access-g5nhh" (OuterVolumeSpecName: "kube-api-access-g5nhh") pod "4f5732ce-fb71-4847-a977-763074d671f6" (UID: "4f5732ce-fb71-4847-a977-763074d671f6"). InnerVolumeSpecName "kube-api-access-g5nhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:12:14 crc kubenswrapper[4763]: I1205 12:12:14.280192 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f5732ce-fb71-4847-a977-763074d671f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f5732ce-fb71-4847-a977-763074d671f6" (UID: "4f5732ce-fb71-4847-a977-763074d671f6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:12:14 crc kubenswrapper[4763]: I1205 12:12:14.301894 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f5732ce-fb71-4847-a977-763074d671f6-config-data" (OuterVolumeSpecName: "config-data") pod "4f5732ce-fb71-4847-a977-763074d671f6" (UID: "4f5732ce-fb71-4847-a977-763074d671f6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:12:14 crc kubenswrapper[4763]: I1205 12:12:14.345099 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5nhh\" (UniqueName: \"kubernetes.io/projected/4f5732ce-fb71-4847-a977-763074d671f6-kube-api-access-g5nhh\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:14 crc kubenswrapper[4763]: I1205 12:12:14.345143 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f5732ce-fb71-4847-a977-763074d671f6-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:14 crc kubenswrapper[4763]: I1205 12:12:14.345154 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f5732ce-fb71-4847-a977-763074d671f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:14 crc kubenswrapper[4763]: I1205 12:12:14.345162 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f5732ce-fb71-4847-a977-763074d671f6-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:14 crc kubenswrapper[4763]: I1205 12:12:14.369446 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 12:12:14 crc kubenswrapper[4763]: I1205 12:12:14.369579 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 12:12:14 crc kubenswrapper[4763]: I1205 12:12:14.735393 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wxg7t" Dec 05 12:12:14 crc kubenswrapper[4763]: I1205 12:12:14.739045 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wxg7t" event={"ID":"4f5732ce-fb71-4847-a977-763074d671f6","Type":"ContainerDied","Data":"fe4b923ab1242533d80f5ef49daa41f1798130272070abab5566cbba1d069298"} Dec 05 12:12:14 crc kubenswrapper[4763]: I1205 12:12:14.739110 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe4b923ab1242533d80f5ef49daa41f1798130272070abab5566cbba1d069298" Dec 05 12:12:14 crc kubenswrapper[4763]: I1205 12:12:14.818345 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 12:12:14 crc kubenswrapper[4763]: E1205 12:12:14.818961 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f5732ce-fb71-4847-a977-763074d671f6" containerName="nova-cell1-conductor-db-sync" Dec 05 12:12:14 crc kubenswrapper[4763]: I1205 12:12:14.819031 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f5732ce-fb71-4847-a977-763074d671f6" containerName="nova-cell1-conductor-db-sync" Dec 05 12:12:14 crc kubenswrapper[4763]: I1205 12:12:14.819262 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f5732ce-fb71-4847-a977-763074d671f6" containerName="nova-cell1-conductor-db-sync" Dec 05 12:12:14 crc kubenswrapper[4763]: I1205 12:12:14.820050 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 05 12:12:14 crc kubenswrapper[4763]: I1205 12:12:14.822930 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 05 12:12:14 crc kubenswrapper[4763]: I1205 12:12:14.831562 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 12:12:14 crc kubenswrapper[4763]: I1205 12:12:14.862110 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f31e12e-2f94-40a3-a522-1aa44cb1cdbf-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0f31e12e-2f94-40a3-a522-1aa44cb1cdbf\") " pod="openstack/nova-cell1-conductor-0" Dec 05 12:12:14 crc kubenswrapper[4763]: I1205 12:12:14.862313 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd2lk\" (UniqueName: \"kubernetes.io/projected/0f31e12e-2f94-40a3-a522-1aa44cb1cdbf-kube-api-access-cd2lk\") pod \"nova-cell1-conductor-0\" (UID: \"0f31e12e-2f94-40a3-a522-1aa44cb1cdbf\") " pod="openstack/nova-cell1-conductor-0" Dec 05 12:12:14 crc kubenswrapper[4763]: I1205 12:12:14.862641 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f31e12e-2f94-40a3-a522-1aa44cb1cdbf-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0f31e12e-2f94-40a3-a522-1aa44cb1cdbf\") " pod="openstack/nova-cell1-conductor-0" Dec 05 12:12:14 crc kubenswrapper[4763]: I1205 12:12:14.964230 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f31e12e-2f94-40a3-a522-1aa44cb1cdbf-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0f31e12e-2f94-40a3-a522-1aa44cb1cdbf\") " pod="openstack/nova-cell1-conductor-0" Dec 05 12:12:14 crc kubenswrapper[4763]: I1205 12:12:14.964414 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f31e12e-2f94-40a3-a522-1aa44cb1cdbf-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0f31e12e-2f94-40a3-a522-1aa44cb1cdbf\") " pod="openstack/nova-cell1-conductor-0" Dec 05 12:12:14 crc kubenswrapper[4763]: I1205 12:12:14.964455 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cd2lk\" (UniqueName: \"kubernetes.io/projected/0f31e12e-2f94-40a3-a522-1aa44cb1cdbf-kube-api-access-cd2lk\") pod \"nova-cell1-conductor-0\" (UID: \"0f31e12e-2f94-40a3-a522-1aa44cb1cdbf\") " pod="openstack/nova-cell1-conductor-0" Dec 05 12:12:14 crc kubenswrapper[4763]: I1205 12:12:14.972092 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f31e12e-2f94-40a3-a522-1aa44cb1cdbf-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0f31e12e-2f94-40a3-a522-1aa44cb1cdbf\") " pod="openstack/nova-cell1-conductor-0" Dec 05 12:12:14 crc kubenswrapper[4763]: I1205 12:12:14.975246 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f31e12e-2f94-40a3-a522-1aa44cb1cdbf-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0f31e12e-2f94-40a3-a522-1aa44cb1cdbf\") " pod="openstack/nova-cell1-conductor-0" Dec 05 12:12:14 crc kubenswrapper[4763]: I1205 12:12:14.986363 4763 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd2lk\" (UniqueName: \"kubernetes.io/projected/0f31e12e-2f94-40a3-a522-1aa44cb1cdbf-kube-api-access-cd2lk\") pod \"nova-cell1-conductor-0\" (UID: \"0f31e12e-2f94-40a3-a522-1aa44cb1cdbf\") " pod="openstack/nova-cell1-conductor-0" Dec 05 12:12:15 crc kubenswrapper[4763]: I1205 12:12:15.138381 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 05 12:12:15 crc kubenswrapper[4763]: I1205 12:12:15.151389 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-h96f9" Dec 05 12:12:15 crc kubenswrapper[4763]: I1205 12:12:15.283694 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57820ba4-dfb8-40a5-be44-26fc6fe01967-combined-ca-bundle\") pod \"57820ba4-dfb8-40a5-be44-26fc6fe01967\" (UID: \"57820ba4-dfb8-40a5-be44-26fc6fe01967\") " Dec 05 12:12:15 crc kubenswrapper[4763]: I1205 12:12:15.283972 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tssrd\" (UniqueName: \"kubernetes.io/projected/57820ba4-dfb8-40a5-be44-26fc6fe01967-kube-api-access-tssrd\") pod \"57820ba4-dfb8-40a5-be44-26fc6fe01967\" (UID: \"57820ba4-dfb8-40a5-be44-26fc6fe01967\") " Dec 05 12:12:15 crc kubenswrapper[4763]: I1205 12:12:15.284043 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57820ba4-dfb8-40a5-be44-26fc6fe01967-scripts\") pod \"57820ba4-dfb8-40a5-be44-26fc6fe01967\" (UID: \"57820ba4-dfb8-40a5-be44-26fc6fe01967\") " Dec 05 12:12:15 crc kubenswrapper[4763]: I1205 12:12:15.284078 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57820ba4-dfb8-40a5-be44-26fc6fe01967-config-data\") pod \"57820ba4-dfb8-40a5-be44-26fc6fe01967\" (UID: \"57820ba4-dfb8-40a5-be44-26fc6fe01967\") " Dec 05 12:12:15 crc kubenswrapper[4763]: I1205 12:12:15.292889 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57820ba4-dfb8-40a5-be44-26fc6fe01967-scripts" (OuterVolumeSpecName: "scripts") pod "57820ba4-dfb8-40a5-be44-26fc6fe01967" (UID: "57820ba4-dfb8-40a5-be44-26fc6fe01967"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:12:15 crc kubenswrapper[4763]: I1205 12:12:15.293684 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57820ba4-dfb8-40a5-be44-26fc6fe01967-kube-api-access-tssrd" (OuterVolumeSpecName: "kube-api-access-tssrd") pod "57820ba4-dfb8-40a5-be44-26fc6fe01967" (UID: "57820ba4-dfb8-40a5-be44-26fc6fe01967"). InnerVolumeSpecName "kube-api-access-tssrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:12:15 crc kubenswrapper[4763]: I1205 12:12:15.335067 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57820ba4-dfb8-40a5-be44-26fc6fe01967-config-data" (OuterVolumeSpecName: "config-data") pod "57820ba4-dfb8-40a5-be44-26fc6fe01967" (UID: "57820ba4-dfb8-40a5-be44-26fc6fe01967"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:12:15 crc kubenswrapper[4763]: I1205 12:12:15.342430 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57820ba4-dfb8-40a5-be44-26fc6fe01967-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57820ba4-dfb8-40a5-be44-26fc6fe01967" (UID: "57820ba4-dfb8-40a5-be44-26fc6fe01967"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:12:15 crc kubenswrapper[4763]: I1205 12:12:15.387426 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57820ba4-dfb8-40a5-be44-26fc6fe01967-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:15 crc kubenswrapper[4763]: I1205 12:12:15.387459 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tssrd\" (UniqueName: \"kubernetes.io/projected/57820ba4-dfb8-40a5-be44-26fc6fe01967-kube-api-access-tssrd\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:15 crc kubenswrapper[4763]: I1205 12:12:15.387515 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57820ba4-dfb8-40a5-be44-26fc6fe01967-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:15 crc kubenswrapper[4763]: I1205 12:12:15.387528 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57820ba4-dfb8-40a5-be44-26fc6fe01967-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:15 crc kubenswrapper[4763]: I1205 12:12:15.596289 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cqhd5" Dec 05 12:12:15 crc kubenswrapper[4763]: I1205 12:12:15.596678 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cqhd5" Dec 05 12:12:15 crc kubenswrapper[4763]: I1205 12:12:15.643484 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 12:12:15 crc kubenswrapper[4763]: W1205 12:12:15.652508 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f31e12e_2f94_40a3_a522_1aa44cb1cdbf.slice/crio-a1266191f7dcc17287b59249b11d84a55f5b6397544870e05ee02ade0913763b WatchSource:0}: Error finding container a1266191f7dcc17287b59249b11d84a55f5b6397544870e05ee02ade0913763b: Status 404 returned error can't find the container with id a1266191f7dcc17287b59249b11d84a55f5b6397544870e05ee02ade0913763b Dec 05 12:12:15 crc kubenswrapper[4763]: I1205 12:12:15.751543 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-h96f9" Dec 05 12:12:15 crc kubenswrapper[4763]: I1205 12:12:15.751599 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-h96f9" event={"ID":"57820ba4-dfb8-40a5-be44-26fc6fe01967","Type":"ContainerDied","Data":"638d7fd69b08ed888fc03240e0d7e6e18e109da92c8fa6411dd97305f6d77c5d"} Dec 05 12:12:15 crc kubenswrapper[4763]: I1205 12:12:15.751646 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="638d7fd69b08ed888fc03240e0d7e6e18e109da92c8fa6411dd97305f6d77c5d" Dec 05 12:12:15 crc kubenswrapper[4763]: I1205 12:12:15.753628 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"0f31e12e-2f94-40a3-a522-1aa44cb1cdbf","Type":"ContainerStarted","Data":"a1266191f7dcc17287b59249b11d84a55f5b6397544870e05ee02ade0913763b"} Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.438691 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.533870 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d798ac3d-6708-49fb-8ba6-b635ee5b769e-combined-ca-bundle\") pod \"d798ac3d-6708-49fb-8ba6-b635ee5b769e\" (UID: \"d798ac3d-6708-49fb-8ba6-b635ee5b769e\") " Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.534035 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d798ac3d-6708-49fb-8ba6-b635ee5b769e-ceilometer-tls-certs\") pod \"d798ac3d-6708-49fb-8ba6-b635ee5b769e\" (UID: \"d798ac3d-6708-49fb-8ba6-b635ee5b769e\") " Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.534081 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d798ac3d-6708-49fb-8ba6-b635ee5b769e-config-data\") pod \"d798ac3d-6708-49fb-8ba6-b635ee5b769e\" (UID: \"d798ac3d-6708-49fb-8ba6-b635ee5b769e\") " Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.534218 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d798ac3d-6708-49fb-8ba6-b635ee5b769e-scripts\") pod \"d798ac3d-6708-49fb-8ba6-b635ee5b769e\" (UID: \"d798ac3d-6708-49fb-8ba6-b635ee5b769e\") " Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.534307 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d798ac3d-6708-49fb-8ba6-b635ee5b769e-log-httpd\") pod \"d798ac3d-6708-49fb-8ba6-b635ee5b769e\" (UID: \"d798ac3d-6708-49fb-8ba6-b635ee5b769e\") " Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.534336 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d798ac3d-6708-49fb-8ba6-b635ee5b769e-sg-core-conf-yaml\") pod \"d798ac3d-6708-49fb-8ba6-b635ee5b769e\" (UID: \"d798ac3d-6708-49fb-8ba6-b635ee5b769e\") " Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.534360 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npjjp\" (UniqueName: \"kubernetes.io/projected/d798ac3d-6708-49fb-8ba6-b635ee5b769e-kube-api-access-npjjp\") pod \"d798ac3d-6708-49fb-8ba6-b635ee5b769e\" (UID: 
\"d798ac3d-6708-49fb-8ba6-b635ee5b769e\") " Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.534412 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d798ac3d-6708-49fb-8ba6-b635ee5b769e-run-httpd\") pod \"d798ac3d-6708-49fb-8ba6-b635ee5b769e\" (UID: \"d798ac3d-6708-49fb-8ba6-b635ee5b769e\") " Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.535024 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d798ac3d-6708-49fb-8ba6-b635ee5b769e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d798ac3d-6708-49fb-8ba6-b635ee5b769e" (UID: "d798ac3d-6708-49fb-8ba6-b635ee5b769e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.535265 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d798ac3d-6708-49fb-8ba6-b635ee5b769e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d798ac3d-6708-49fb-8ba6-b635ee5b769e" (UID: "d798ac3d-6708-49fb-8ba6-b635ee5b769e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.541387 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d798ac3d-6708-49fb-8ba6-b635ee5b769e-kube-api-access-npjjp" (OuterVolumeSpecName: "kube-api-access-npjjp") pod "d798ac3d-6708-49fb-8ba6-b635ee5b769e" (UID: "d798ac3d-6708-49fb-8ba6-b635ee5b769e"). InnerVolumeSpecName "kube-api-access-npjjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.551268 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d798ac3d-6708-49fb-8ba6-b635ee5b769e-scripts" (OuterVolumeSpecName: "scripts") pod "d798ac3d-6708-49fb-8ba6-b635ee5b769e" (UID: "d798ac3d-6708-49fb-8ba6-b635ee5b769e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.577648 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d798ac3d-6708-49fb-8ba6-b635ee5b769e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d798ac3d-6708-49fb-8ba6-b635ee5b769e" (UID: "d798ac3d-6708-49fb-8ba6-b635ee5b769e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.605280 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d798ac3d-6708-49fb-8ba6-b635ee5b769e-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "d798ac3d-6708-49fb-8ba6-b635ee5b769e" (UID: "d798ac3d-6708-49fb-8ba6-b635ee5b769e"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.634644 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d798ac3d-6708-49fb-8ba6-b635ee5b769e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d798ac3d-6708-49fb-8ba6-b635ee5b769e" (UID: "d798ac3d-6708-49fb-8ba6-b635ee5b769e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.637059 4763 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d798ac3d-6708-49fb-8ba6-b635ee5b769e-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.637098 4763 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d798ac3d-6708-49fb-8ba6-b635ee5b769e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.637114 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npjjp\" (UniqueName: \"kubernetes.io/projected/d798ac3d-6708-49fb-8ba6-b635ee5b769e-kube-api-access-npjjp\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.637127 4763 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d798ac3d-6708-49fb-8ba6-b635ee5b769e-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.637138 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d798ac3d-6708-49fb-8ba6-b635ee5b769e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.637149 4763 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d798ac3d-6708-49fb-8ba6-b635ee5b769e-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.637160 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d798ac3d-6708-49fb-8ba6-b635ee5b769e-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.662777 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cqhd5" podUID="91392e07-ce32-4e73-a80f-17e4749ab9da" containerName="registry-server" probeResult="failure" output=< Dec 05 12:12:16 crc kubenswrapper[4763]: timeout: failed to connect service ":50051" within 1s Dec 05 12:12:16 crc kubenswrapper[4763]: > Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.701742 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d798ac3d-6708-49fb-8ba6-b635ee5b769e-config-data" (OuterVolumeSpecName: "config-data") pod "d798ac3d-6708-49fb-8ba6-b635ee5b769e" (UID: "d798ac3d-6708-49fb-8ba6-b635ee5b769e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.738370 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d798ac3d-6708-49fb-8ba6-b635ee5b769e-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.764167 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"0f31e12e-2f94-40a3-a522-1aa44cb1cdbf","Type":"ContainerStarted","Data":"dfb0660c34fdb41e687944bfae6b1d4e714c5e1319221983b98346682fab3ae7"} Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.764356 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.766520 4763 generic.go:334] "Generic (PLEG): container finished" podID="d798ac3d-6708-49fb-8ba6-b635ee5b769e" containerID="a15d864cf9d136d492cde4713454e84c0aa675f56903093bfb03fb34d39d1fae" exitCode=0 Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.766564 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d798ac3d-6708-49fb-8ba6-b635ee5b769e","Type":"ContainerDied","Data":"a15d864cf9d136d492cde4713454e84c0aa675f56903093bfb03fb34d39d1fae"} Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.766719 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d798ac3d-6708-49fb-8ba6-b635ee5b769e","Type":"ContainerDied","Data":"60b692fdf536497150e24a537d970d95f6bdae69857aca490c4d63d3addf8631"} Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.766794 4763 scope.go:117] "RemoveContainer" containerID="bf80e0c1d206586970153514ee7afd2d95d3cb0b99665cd25d7cdfdfdd5d88e6" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.766588 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.789127 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.789112858 podStartE2EDuration="2.789112858s" podCreationTimestamp="2025-12-05 12:12:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:12:16.778924631 +0000 UTC m=+1421.271639354" watchObservedRunningTime="2025-12-05 12:12:16.789112858 +0000 UTC m=+1421.281827581" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.800030 4763 scope.go:117] "RemoveContainer" containerID="c0883ac1444536b8d3ba36a26cd33f9736d2e0ab69f56355ab790d1bc938b7e6" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.813260 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.826094 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.852936 4763 scope.go:117] "RemoveContainer" containerID="775d120302eee3ff5c2f5a7d2c6e17c000e8bf42b825ad98f65bf871de3abfb4" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.858630 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 12:12:16 crc kubenswrapper[4763]: E1205 12:12:16.859044 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d798ac3d-6708-49fb-8ba6-b635ee5b769e" containerName="ceilometer-central-agent" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.859061 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d798ac3d-6708-49fb-8ba6-b635ee5b769e" containerName="ceilometer-central-agent" Dec 05 12:12:16 crc kubenswrapper[4763]: E1205 12:12:16.859079 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d798ac3d-6708-49fb-8ba6-b635ee5b769e" containerName="proxy-httpd" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.859085 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d798ac3d-6708-49fb-8ba6-b635ee5b769e" containerName="proxy-httpd" Dec 05 12:12:16 crc kubenswrapper[4763]: E1205 12:12:16.859094 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d798ac3d-6708-49fb-8ba6-b635ee5b769e" containerName="ceilometer-notification-agent" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.859101 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d798ac3d-6708-49fb-8ba6-b635ee5b769e" containerName="ceilometer-notification-agent" Dec 05 12:12:16 crc kubenswrapper[4763]: E1205 12:12:16.859114 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57820ba4-dfb8-40a5-be44-26fc6fe01967" containerName="nova-manage" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.859120 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="57820ba4-dfb8-40a5-be44-26fc6fe01967" containerName="nova-manage" Dec 05 12:12:16 crc kubenswrapper[4763]: E1205 12:12:16.859152 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d798ac3d-6708-49fb-8ba6-b635ee5b769e" containerName="sg-core" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.859158 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d798ac3d-6708-49fb-8ba6-b635ee5b769e" containerName="sg-core" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.859310 4763 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d798ac3d-6708-49fb-8ba6-b635ee5b769e" containerName="ceilometer-notification-agent" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.859319 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d798ac3d-6708-49fb-8ba6-b635ee5b769e" containerName="ceilometer-central-agent" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.859333 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d798ac3d-6708-49fb-8ba6-b635ee5b769e" containerName="sg-core" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.859352 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d798ac3d-6708-49fb-8ba6-b635ee5b769e" containerName="proxy-httpd" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.859362 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="57820ba4-dfb8-40a5-be44-26fc6fe01967" containerName="nova-manage" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.860983 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.863474 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.863564 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.863643 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.883238 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.918870 4763 scope.go:117] "RemoveContainer" containerID="a15d864cf9d136d492cde4713454e84c0aa675f56903093bfb03fb34d39d1fae" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.937779 4763 scope.go:117] "RemoveContainer" containerID="bf80e0c1d206586970153514ee7afd2d95d3cb0b99665cd25d7cdfdfdd5d88e6" Dec 05 12:12:16 crc kubenswrapper[4763]: E1205 12:12:16.938251 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf80e0c1d206586970153514ee7afd2d95d3cb0b99665cd25d7cdfdfdd5d88e6\": container with ID starting with bf80e0c1d206586970153514ee7afd2d95d3cb0b99665cd25d7cdfdfdd5d88e6 not found: ID does not exist" containerID="bf80e0c1d206586970153514ee7afd2d95d3cb0b99665cd25d7cdfdfdd5d88e6" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.938294 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf80e0c1d206586970153514ee7afd2d95d3cb0b99665cd25d7cdfdfdd5d88e6"} err="failed to get container status \"bf80e0c1d206586970153514ee7afd2d95d3cb0b99665cd25d7cdfdfdd5d88e6\": rpc error: code = NotFound desc = could not find container \"bf80e0c1d206586970153514ee7afd2d95d3cb0b99665cd25d7cdfdfdd5d88e6\": container with ID starting with bf80e0c1d206586970153514ee7afd2d95d3cb0b99665cd25d7cdfdfdd5d88e6 not found: ID does not exist" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.938321 4763 scope.go:117] "RemoveContainer" containerID="c0883ac1444536b8d3ba36a26cd33f9736d2e0ab69f56355ab790d1bc938b7e6" Dec 05 12:12:16 crc kubenswrapper[4763]: E1205 12:12:16.938594 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c0883ac1444536b8d3ba36a26cd33f9736d2e0ab69f56355ab790d1bc938b7e6\": container with ID starting with c0883ac1444536b8d3ba36a26cd33f9736d2e0ab69f56355ab790d1bc938b7e6 not found: ID does not exist" containerID="c0883ac1444536b8d3ba36a26cd33f9736d2e0ab69f56355ab790d1bc938b7e6" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.938628 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0883ac1444536b8d3ba36a26cd33f9736d2e0ab69f56355ab790d1bc938b7e6"} err="failed to get container status \"c0883ac1444536b8d3ba36a26cd33f9736d2e0ab69f56355ab790d1bc938b7e6\": rpc error: code = NotFound desc = could not find container \"c0883ac1444536b8d3ba36a26cd33f9736d2e0ab69f56355ab790d1bc938b7e6\": container with ID starting with c0883ac1444536b8d3ba36a26cd33f9736d2e0ab69f56355ab790d1bc938b7e6 not found: ID does not exist" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.938666 4763 scope.go:117] "RemoveContainer" containerID="775d120302eee3ff5c2f5a7d2c6e17c000e8bf42b825ad98f65bf871de3abfb4" Dec 05 12:12:16 crc kubenswrapper[4763]: E1205 12:12:16.938910 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"775d120302eee3ff5c2f5a7d2c6e17c000e8bf42b825ad98f65bf871de3abfb4\": container with ID starting with 775d120302eee3ff5c2f5a7d2c6e17c000e8bf42b825ad98f65bf871de3abfb4 not found: ID does not exist" containerID="775d120302eee3ff5c2f5a7d2c6e17c000e8bf42b825ad98f65bf871de3abfb4" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.938937 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"775d120302eee3ff5c2f5a7d2c6e17c000e8bf42b825ad98f65bf871de3abfb4"} err="failed to get container status \"775d120302eee3ff5c2f5a7d2c6e17c000e8bf42b825ad98f65bf871de3abfb4\": rpc error: code = NotFound desc = could not find container \"775d120302eee3ff5c2f5a7d2c6e17c000e8bf42b825ad98f65bf871de3abfb4\": container with ID starting with 775d120302eee3ff5c2f5a7d2c6e17c000e8bf42b825ad98f65bf871de3abfb4 not found: ID does not exist" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.938956 4763 scope.go:117] "RemoveContainer" containerID="a15d864cf9d136d492cde4713454e84c0aa675f56903093bfb03fb34d39d1fae" Dec 05 12:12:16 crc kubenswrapper[4763]: E1205 12:12:16.939166 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a15d864cf9d136d492cde4713454e84c0aa675f56903093bfb03fb34d39d1fae\": container with ID starting with a15d864cf9d136d492cde4713454e84c0aa675f56903093bfb03fb34d39d1fae not found: ID does not exist" containerID="a15d864cf9d136d492cde4713454e84c0aa675f56903093bfb03fb34d39d1fae" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.939192 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a15d864cf9d136d492cde4713454e84c0aa675f56903093bfb03fb34d39d1fae"} err="failed to get container status \"a15d864cf9d136d492cde4713454e84c0aa675f56903093bfb03fb34d39d1fae\": rpc error: code = NotFound desc = could not find container \"a15d864cf9d136d492cde4713454e84c0aa675f56903093bfb03fb34d39d1fae\": container with ID starting with a15d864cf9d136d492cde4713454e84c0aa675f56903093bfb03fb34d39d1fae not found: ID does not exist" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.942495 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5726fc7e-7c45-4714-a0d5-65b124f7d692-config-data\") pod \"ceilometer-0\" (UID: \"5726fc7e-7c45-4714-a0d5-65b124f7d692\") " pod="openstack/ceilometer-0" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.942546 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5726fc7e-7c45-4714-a0d5-65b124f7d692-log-httpd\") pod \"ceilometer-0\" (UID: \"5726fc7e-7c45-4714-a0d5-65b124f7d692\") " pod="openstack/ceilometer-0" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.942572 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57p69\" (UniqueName: \"kubernetes.io/projected/5726fc7e-7c45-4714-a0d5-65b124f7d692-kube-api-access-57p69\") pod \"ceilometer-0\" (UID: \"5726fc7e-7c45-4714-a0d5-65b124f7d692\") " pod="openstack/ceilometer-0" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.942809 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5726fc7e-7c45-4714-a0d5-65b124f7d692-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5726fc7e-7c45-4714-a0d5-65b124f7d692\") " pod="openstack/ceilometer-0" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.942854 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5726fc7e-7c45-4714-a0d5-65b124f7d692-run-httpd\") pod \"ceilometer-0\" (UID: \"5726fc7e-7c45-4714-a0d5-65b124f7d692\") " pod="openstack/ceilometer-0" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.942884 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5726fc7e-7c45-4714-a0d5-65b124f7d692-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5726fc7e-7c45-4714-a0d5-65b124f7d692\") " pod="openstack/ceilometer-0" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.943145 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5726fc7e-7c45-4714-a0d5-65b124f7d692-scripts\") pod \"ceilometer-0\" (UID: \"5726fc7e-7c45-4714-a0d5-65b124f7d692\") " pod="openstack/ceilometer-0" Dec 05 12:12:16 crc kubenswrapper[4763]: I1205 12:12:16.943202 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5726fc7e-7c45-4714-a0d5-65b124f7d692-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5726fc7e-7c45-4714-a0d5-65b124f7d692\") " pod="openstack/ceilometer-0" Dec 05 12:12:17 crc kubenswrapper[4763]: I1205 12:12:17.044810 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5726fc7e-7c45-4714-a0d5-65b124f7d692-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5726fc7e-7c45-4714-a0d5-65b124f7d692\") " pod="openstack/ceilometer-0" Dec 05 12:12:17 crc kubenswrapper[4763]: I1205 12:12:17.045156 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5726fc7e-7c45-4714-a0d5-65b124f7d692-config-data\") pod \"ceilometer-0\" (UID: \"5726fc7e-7c45-4714-a0d5-65b124f7d692\") " pod="openstack/ceilometer-0" Dec 05 12:12:17 crc kubenswrapper[4763]: 
I1205 12:12:17.045199 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5726fc7e-7c45-4714-a0d5-65b124f7d692-log-httpd\") pod \"ceilometer-0\" (UID: \"5726fc7e-7c45-4714-a0d5-65b124f7d692\") " pod="openstack/ceilometer-0" Dec 05 12:12:17 crc kubenswrapper[4763]: I1205 12:12:17.045229 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57p69\" (UniqueName: \"kubernetes.io/projected/5726fc7e-7c45-4714-a0d5-65b124f7d692-kube-api-access-57p69\") pod \"ceilometer-0\" (UID: \"5726fc7e-7c45-4714-a0d5-65b124f7d692\") " pod="openstack/ceilometer-0" Dec 05 12:12:17 crc kubenswrapper[4763]: I1205 12:12:17.045334 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5726fc7e-7c45-4714-a0d5-65b124f7d692-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5726fc7e-7c45-4714-a0d5-65b124f7d692\") " pod="openstack/ceilometer-0" Dec 05 12:12:17 crc kubenswrapper[4763]: I1205 12:12:17.045381 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5726fc7e-7c45-4714-a0d5-65b124f7d692-run-httpd\") pod \"ceilometer-0\" (UID: \"5726fc7e-7c45-4714-a0d5-65b124f7d692\") " pod="openstack/ceilometer-0" Dec 05 12:12:17 crc kubenswrapper[4763]: I1205 12:12:17.045415 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5726fc7e-7c45-4714-a0d5-65b124f7d692-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5726fc7e-7c45-4714-a0d5-65b124f7d692\") " pod="openstack/ceilometer-0" Dec 05 12:12:17 crc kubenswrapper[4763]: I1205 12:12:17.045441 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5726fc7e-7c45-4714-a0d5-65b124f7d692-scripts\") pod \"ceilometer-0\" (UID: \"5726fc7e-7c45-4714-a0d5-65b124f7d692\") " pod="openstack/ceilometer-0" Dec 05 12:12:17 crc kubenswrapper[4763]: I1205 12:12:17.046608 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5726fc7e-7c45-4714-a0d5-65b124f7d692-run-httpd\") pod \"ceilometer-0\" (UID: \"5726fc7e-7c45-4714-a0d5-65b124f7d692\") " pod="openstack/ceilometer-0" Dec 05 12:12:17 crc kubenswrapper[4763]: I1205 12:12:17.047570 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5726fc7e-7c45-4714-a0d5-65b124f7d692-log-httpd\") pod \"ceilometer-0\" (UID: \"5726fc7e-7c45-4714-a0d5-65b124f7d692\") " pod="openstack/ceilometer-0" Dec 05 12:12:17 crc kubenswrapper[4763]: I1205 12:12:17.050995 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5726fc7e-7c45-4714-a0d5-65b124f7d692-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5726fc7e-7c45-4714-a0d5-65b124f7d692\") " pod="openstack/ceilometer-0" Dec 05 12:12:17 crc kubenswrapper[4763]: I1205 12:12:17.051867 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5726fc7e-7c45-4714-a0d5-65b124f7d692-config-data\") pod \"ceilometer-0\" (UID: \"5726fc7e-7c45-4714-a0d5-65b124f7d692\") " pod="openstack/ceilometer-0" Dec 05 12:12:17 crc kubenswrapper[4763]: I1205 12:12:17.051942 4763 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5726fc7e-7c45-4714-a0d5-65b124f7d692-scripts\") pod \"ceilometer-0\" (UID: \"5726fc7e-7c45-4714-a0d5-65b124f7d692\") " pod="openstack/ceilometer-0" Dec 05 12:12:17 crc kubenswrapper[4763]: I1205 12:12:17.052719 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5726fc7e-7c45-4714-a0d5-65b124f7d692-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5726fc7e-7c45-4714-a0d5-65b124f7d692\") " pod="openstack/ceilometer-0" Dec 05 12:12:17 crc kubenswrapper[4763]: I1205 12:12:17.059858 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5726fc7e-7c45-4714-a0d5-65b124f7d692-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5726fc7e-7c45-4714-a0d5-65b124f7d692\") " pod="openstack/ceilometer-0" Dec 05 12:12:17 crc kubenswrapper[4763]: I1205 12:12:17.062509 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57p69\" (UniqueName: \"kubernetes.io/projected/5726fc7e-7c45-4714-a0d5-65b124f7d692-kube-api-access-57p69\") pod \"ceilometer-0\" (UID: \"5726fc7e-7c45-4714-a0d5-65b124f7d692\") " pod="openstack/ceilometer-0" Dec 05 12:12:17 crc kubenswrapper[4763]: I1205 12:12:17.187980 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 12:12:17 crc kubenswrapper[4763]: W1205 12:12:17.697721 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5726fc7e_7c45_4714_a0d5_65b124f7d692.slice/crio-36a1ae5bbfa9f29d69947287bdcbcc63d2abe6af558d934a41b738ac91d731f5 WatchSource:0}: Error finding container 36a1ae5bbfa9f29d69947287bdcbcc63d2abe6af558d934a41b738ac91d731f5: Status 404 returned error can't find the container with id 36a1ae5bbfa9f29d69947287bdcbcc63d2abe6af558d934a41b738ac91d731f5 Dec 05 12:12:17 crc kubenswrapper[4763]: I1205 12:12:17.699590 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 12:12:17 crc kubenswrapper[4763]: I1205 12:12:17.801548 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d798ac3d-6708-49fb-8ba6-b635ee5b769e" path="/var/lib/kubelet/pods/d798ac3d-6708-49fb-8ba6-b635ee5b769e/volumes" Dec 05 12:12:17 crc kubenswrapper[4763]: I1205 12:12:17.803596 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5726fc7e-7c45-4714-a0d5-65b124f7d692","Type":"ContainerStarted","Data":"36a1ae5bbfa9f29d69947287bdcbcc63d2abe6af558d934a41b738ac91d731f5"} Dec 05 12:12:18 crc kubenswrapper[4763]: I1205 12:12:18.803620 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5726fc7e-7c45-4714-a0d5-65b124f7d692","Type":"ContainerStarted","Data":"070b741db7d8658d18c20fa993edf281929be0b3578aecbd744c33e26b835c89"} Dec 05 12:12:19 crc kubenswrapper[4763]: I1205 12:12:19.152175 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 12:12:19 crc kubenswrapper[4763]: I1205 12:12:19.153681 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 12:12:19 crc kubenswrapper[4763]: I1205 12:12:19.369543 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 
12:12:19 crc kubenswrapper[4763]: I1205 12:12:19.369893 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 12:12:19 crc kubenswrapper[4763]: I1205 12:12:19.439451 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 05 12:12:19 crc kubenswrapper[4763]: I1205 12:12:19.827362 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5726fc7e-7c45-4714-a0d5-65b124f7d692","Type":"ContainerStarted","Data":"87afc456c686737b618f84d1e2b3a90a9a7089d716f466d836746470dfb5b56a"} Dec 05 12:12:19 crc kubenswrapper[4763]: I1205 12:12:19.934559 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 12:12:19 crc kubenswrapper[4763]: I1205 12:12:19.973018 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 12:12:19 crc kubenswrapper[4763]: I1205 12:12:19.973523 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6552af9b-be77-4433-b5b1-023b3f13ca28" containerName="nova-metadata-log" containerID="cri-o://cda7895ca37b96582922ae39cb62499ece34806a2e14872549911232c0a08fbe" gracePeriod=30 Dec 05 12:12:19 crc kubenswrapper[4763]: I1205 12:12:19.973611 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6552af9b-be77-4433-b5b1-023b3f13ca28" containerName="nova-metadata-metadata" containerID="cri-o://dedb105b7d3e7a64015a99a176c96ae0c2bcd4db58d01be988bb393c3b5c4f22" gracePeriod=30 Dec 05 12:12:19 crc kubenswrapper[4763]: I1205 12:12:19.993406 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6552af9b-be77-4433-b5b1-023b3f13ca28" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": EOF" Dec 05 12:12:19 crc kubenswrapper[4763]: I1205 12:12:19.993507 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6552af9b-be77-4433-b5b1-023b3f13ca28" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": EOF" Dec 05 12:12:20 crc kubenswrapper[4763]: I1205 12:12:20.175707 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 05 12:12:20 crc kubenswrapper[4763]: I1205 12:12:20.235989 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="63404240-1608-4923-84ce-48d6686c2f2f" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.207:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 12:12:20 crc kubenswrapper[4763]: I1205 12:12:20.236082 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="63404240-1608-4923-84ce-48d6686c2f2f" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.207:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 12:12:20 crc kubenswrapper[4763]: I1205 12:12:20.838214 4763 generic.go:334] "Generic (PLEG): container finished" podID="6552af9b-be77-4433-b5b1-023b3f13ca28" containerID="cda7895ca37b96582922ae39cb62499ece34806a2e14872549911232c0a08fbe" exitCode=143 Dec 05 12:12:20 crc kubenswrapper[4763]: I1205 12:12:20.838285 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"6552af9b-be77-4433-b5b1-023b3f13ca28","Type":"ContainerDied","Data":"cda7895ca37b96582922ae39cb62499ece34806a2e14872549911232c0a08fbe"} Dec 05 12:12:20 crc kubenswrapper[4763]: I1205 12:12:20.841302 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="63404240-1608-4923-84ce-48d6686c2f2f" containerName="nova-api-log" containerID="cri-o://659abd9031db0295b2f733bde346ceceaa5b96670cba55f261a8d9167a837969" gracePeriod=30 Dec 05 12:12:20 crc kubenswrapper[4763]: I1205 12:12:20.841587 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5726fc7e-7c45-4714-a0d5-65b124f7d692","Type":"ContainerStarted","Data":"bf79196b78560ffce260527941cf671183756e3cf6b0d1745159886679290871"} Dec 05 12:12:20 crc kubenswrapper[4763]: I1205 12:12:20.841973 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="63404240-1608-4923-84ce-48d6686c2f2f" containerName="nova-api-api" containerID="cri-o://284e44420ca6a4ce40b83f02fb4713c58599921194aa9130fa02ac3dbad4a921" gracePeriod=30 Dec 05 12:12:21 crc kubenswrapper[4763]: I1205 12:12:21.877542 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5726fc7e-7c45-4714-a0d5-65b124f7d692","Type":"ContainerStarted","Data":"09ac658728be057cb3fc01ec6c02d96e933089a6ea5ae8ef63465bf8f93e8c67"} Dec 05 12:12:21 crc kubenswrapper[4763]: I1205 12:12:21.878860 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 12:12:21 crc kubenswrapper[4763]: I1205 12:12:21.884917 4763 generic.go:334] "Generic (PLEG): container finished" podID="63404240-1608-4923-84ce-48d6686c2f2f" containerID="659abd9031db0295b2f733bde346ceceaa5b96670cba55f261a8d9167a837969" exitCode=143 Dec 05 12:12:21 crc kubenswrapper[4763]: I1205 12:12:21.884965 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"63404240-1608-4923-84ce-48d6686c2f2f","Type":"ContainerDied","Data":"659abd9031db0295b2f733bde346ceceaa5b96670cba55f261a8d9167a837969"} Dec 05 12:12:21 crc kubenswrapper[4763]: I1205 12:12:21.915025 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.691411736 podStartE2EDuration="5.915005297s" podCreationTimestamp="2025-12-05 12:12:16 +0000 UTC" firstStartedPulling="2025-12-05 12:12:17.705495555 +0000 UTC m=+1422.198210278" lastFinishedPulling="2025-12-05 12:12:20.929089116 +0000 UTC m=+1425.421803839" observedRunningTime="2025-12-05 12:12:21.913936953 +0000 UTC m=+1426.406651676" watchObservedRunningTime="2025-12-05 12:12:21.915005297 +0000 UTC m=+1426.407720020" Dec 05 12:12:25 crc kubenswrapper[4763]: I1205 12:12:25.660322 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cqhd5" Dec 05 12:12:25 crc kubenswrapper[4763]: I1205 12:12:25.705665 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cqhd5" Dec 05 12:12:26 crc kubenswrapper[4763]: I1205 12:12:26.285700 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cqhd5"] Dec 05 12:12:26 crc kubenswrapper[4763]: I1205 12:12:26.731398 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 12:12:26 crc kubenswrapper[4763]: I1205 12:12:26.888381 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 12:12:26 crc kubenswrapper[4763]: I1205 12:12:26.951248 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63404240-1608-4923-84ce-48d6686c2f2f-config-data\") pod \"63404240-1608-4923-84ce-48d6686c2f2f\" (UID: \"63404240-1608-4923-84ce-48d6686c2f2f\") " Dec 05 12:12:26 crc kubenswrapper[4763]: I1205 12:12:26.951297 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km8z8\" (UniqueName: \"kubernetes.io/projected/63404240-1608-4923-84ce-48d6686c2f2f-kube-api-access-km8z8\") pod \"63404240-1608-4923-84ce-48d6686c2f2f\" (UID: \"63404240-1608-4923-84ce-48d6686c2f2f\") " Dec 05 12:12:26 crc kubenswrapper[4763]: I1205 12:12:26.951328 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6552af9b-be77-4433-b5b1-023b3f13ca28-nova-metadata-tls-certs\") pod \"6552af9b-be77-4433-b5b1-023b3f13ca28\" (UID: \"6552af9b-be77-4433-b5b1-023b3f13ca28\") " Dec 05 12:12:26 crc kubenswrapper[4763]: I1205 12:12:26.951350 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6552af9b-be77-4433-b5b1-023b3f13ca28-config-data\") pod \"6552af9b-be77-4433-b5b1-023b3f13ca28\" (UID: \"6552af9b-be77-4433-b5b1-023b3f13ca28\") " Dec 05 12:12:26 crc kubenswrapper[4763]: I1205 12:12:26.951393 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63404240-1608-4923-84ce-48d6686c2f2f-combined-ca-bundle\") pod \"63404240-1608-4923-84ce-48d6686c2f2f\" (UID: \"63404240-1608-4923-84ce-48d6686c2f2f\") " Dec 05 12:12:26 crc kubenswrapper[4763]: I1205 12:12:26.951422 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6552af9b-be77-4433-b5b1-023b3f13ca28-combined-ca-bundle\") pod \"6552af9b-be77-4433-b5b1-023b3f13ca28\" (UID: \"6552af9b-be77-4433-b5b1-023b3f13ca28\") " Dec 05 12:12:26 crc kubenswrapper[4763]: I1205 12:12:26.951481 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63404240-1608-4923-84ce-48d6686c2f2f-logs\") pod \"63404240-1608-4923-84ce-48d6686c2f2f\" (UID: \"63404240-1608-4923-84ce-48d6686c2f2f\") " Dec 05 12:12:26 crc kubenswrapper[4763]: I1205 12:12:26.951516 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kks2m\" (UniqueName: \"kubernetes.io/projected/6552af9b-be77-4433-b5b1-023b3f13ca28-kube-api-access-kks2m\") pod \"6552af9b-be77-4433-b5b1-023b3f13ca28\" (UID: \"6552af9b-be77-4433-b5b1-023b3f13ca28\") " Dec 05 12:12:26 crc kubenswrapper[4763]: I1205 12:12:26.951591 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6552af9b-be77-4433-b5b1-023b3f13ca28-logs\") pod \"6552af9b-be77-4433-b5b1-023b3f13ca28\" (UID: \"6552af9b-be77-4433-b5b1-023b3f13ca28\") " Dec 05 12:12:26 crc kubenswrapper[4763]: I1205 12:12:26.955665 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/6552af9b-be77-4433-b5b1-023b3f13ca28-logs" (OuterVolumeSpecName: "logs") pod "6552af9b-be77-4433-b5b1-023b3f13ca28" (UID: "6552af9b-be77-4433-b5b1-023b3f13ca28"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:12:26 crc kubenswrapper[4763]: I1205 12:12:26.956845 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63404240-1608-4923-84ce-48d6686c2f2f-logs" (OuterVolumeSpecName: "logs") pod "63404240-1608-4923-84ce-48d6686c2f2f" (UID: "63404240-1608-4923-84ce-48d6686c2f2f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:12:26 crc kubenswrapper[4763]: I1205 12:12:26.957960 4763 generic.go:334] "Generic (PLEG): container finished" podID="6552af9b-be77-4433-b5b1-023b3f13ca28" containerID="dedb105b7d3e7a64015a99a176c96ae0c2bcd4db58d01be988bb393c3b5c4f22" exitCode=0 Dec 05 12:12:26 crc kubenswrapper[4763]: I1205 12:12:26.958105 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6552af9b-be77-4433-b5b1-023b3f13ca28","Type":"ContainerDied","Data":"dedb105b7d3e7a64015a99a176c96ae0c2bcd4db58d01be988bb393c3b5c4f22"} Dec 05 12:12:26 crc kubenswrapper[4763]: I1205 12:12:26.958159 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 12:12:26 crc kubenswrapper[4763]: I1205 12:12:26.958159 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63404240-1608-4923-84ce-48d6686c2f2f-kube-api-access-km8z8" (OuterVolumeSpecName: "kube-api-access-km8z8") pod "63404240-1608-4923-84ce-48d6686c2f2f" (UID: "63404240-1608-4923-84ce-48d6686c2f2f"). InnerVolumeSpecName "kube-api-access-km8z8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:12:26 crc kubenswrapper[4763]: I1205 12:12:26.958277 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6552af9b-be77-4433-b5b1-023b3f13ca28","Type":"ContainerDied","Data":"f8b8d9db9733148167c27c0e8f4d4b296c012c3671c23b675a2d8e14d739f273"} Dec 05 12:12:26 crc kubenswrapper[4763]: I1205 12:12:26.958285 4763 scope.go:117] "RemoveContainer" containerID="dedb105b7d3e7a64015a99a176c96ae0c2bcd4db58d01be988bb393c3b5c4f22" Dec 05 12:12:26 crc kubenswrapper[4763]: I1205 12:12:26.964841 4763 generic.go:334] "Generic (PLEG): container finished" podID="63404240-1608-4923-84ce-48d6686c2f2f" containerID="284e44420ca6a4ce40b83f02fb4713c58599921194aa9130fa02ac3dbad4a921" exitCode=0 Dec 05 12:12:26 crc kubenswrapper[4763]: I1205 12:12:26.964915 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"63404240-1608-4923-84ce-48d6686c2f2f","Type":"ContainerDied","Data":"284e44420ca6a4ce40b83f02fb4713c58599921194aa9130fa02ac3dbad4a921"} Dec 05 12:12:26 crc kubenswrapper[4763]: I1205 12:12:26.964944 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 12:12:26 crc kubenswrapper[4763]: I1205 12:12:26.964964 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"63404240-1608-4923-84ce-48d6686c2f2f","Type":"ContainerDied","Data":"21fdb623b96907e9f782b5715ebe1e9932223bcaf33ab8cfb75a4a8e93e322c1"} Dec 05 12:12:26 crc kubenswrapper[4763]: I1205 12:12:26.965026 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cqhd5" podUID="91392e07-ce32-4e73-a80f-17e4749ab9da" containerName="registry-server" containerID="cri-o://e32edefe87d0c177b988f4e2df10bb07577e1ba9442bc3f26167851efe960d2a" gracePeriod=2 Dec 05 12:12:26 crc kubenswrapper[4763]: I1205 12:12:26.966426 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6552af9b-be77-4433-b5b1-023b3f13ca28-kube-api-access-kks2m" (OuterVolumeSpecName: "kube-api-access-kks2m") pod "6552af9b-be77-4433-b5b1-023b3f13ca28" (UID: "6552af9b-be77-4433-b5b1-023b3f13ca28"). InnerVolumeSpecName "kube-api-access-kks2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:12:26 crc kubenswrapper[4763]: I1205 12:12:26.992586 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6552af9b-be77-4433-b5b1-023b3f13ca28-config-data" (OuterVolumeSpecName: "config-data") pod "6552af9b-be77-4433-b5b1-023b3f13ca28" (UID: "6552af9b-be77-4433-b5b1-023b3f13ca28"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:12:26 crc kubenswrapper[4763]: I1205 12:12:26.992667 4763 scope.go:117] "RemoveContainer" containerID="cda7895ca37b96582922ae39cb62499ece34806a2e14872549911232c0a08fbe" Dec 05 12:12:26 crc kubenswrapper[4763]: I1205 12:12:26.998295 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63404240-1608-4923-84ce-48d6686c2f2f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63404240-1608-4923-84ce-48d6686c2f2f" (UID: "63404240-1608-4923-84ce-48d6686c2f2f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.008165 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6552af9b-be77-4433-b5b1-023b3f13ca28-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6552af9b-be77-4433-b5b1-023b3f13ca28" (UID: "6552af9b-be77-4433-b5b1-023b3f13ca28"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.028542 4763 scope.go:117] "RemoveContainer" containerID="dedb105b7d3e7a64015a99a176c96ae0c2bcd4db58d01be988bb393c3b5c4f22" Dec 05 12:12:27 crc kubenswrapper[4763]: E1205 12:12:27.029265 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dedb105b7d3e7a64015a99a176c96ae0c2bcd4db58d01be988bb393c3b5c4f22\": container with ID starting with dedb105b7d3e7a64015a99a176c96ae0c2bcd4db58d01be988bb393c3b5c4f22 not found: ID does not exist" containerID="dedb105b7d3e7a64015a99a176c96ae0c2bcd4db58d01be988bb393c3b5c4f22" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.029503 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dedb105b7d3e7a64015a99a176c96ae0c2bcd4db58d01be988bb393c3b5c4f22"} err="failed to get container status \"dedb105b7d3e7a64015a99a176c96ae0c2bcd4db58d01be988bb393c3b5c4f22\": rpc error: code = NotFound desc = could not find container \"dedb105b7d3e7a64015a99a176c96ae0c2bcd4db58d01be988bb393c3b5c4f22\": container with ID starting with dedb105b7d3e7a64015a99a176c96ae0c2bcd4db58d01be988bb393c3b5c4f22 not found: ID does not exist" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.029534 4763 scope.go:117] "RemoveContainer" containerID="cda7895ca37b96582922ae39cb62499ece34806a2e14872549911232c0a08fbe" Dec 05 12:12:27 crc kubenswrapper[4763]: E1205 12:12:27.030019 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cda7895ca37b96582922ae39cb62499ece34806a2e14872549911232c0a08fbe\": container with ID starting with cda7895ca37b96582922ae39cb62499ece34806a2e14872549911232c0a08fbe not found: ID does not exist" containerID="cda7895ca37b96582922ae39cb62499ece34806a2e14872549911232c0a08fbe" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.030046 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cda7895ca37b96582922ae39cb62499ece34806a2e14872549911232c0a08fbe"} err="failed to get container status \"cda7895ca37b96582922ae39cb62499ece34806a2e14872549911232c0a08fbe\": rpc error: code = NotFound desc = could not find container \"cda7895ca37b96582922ae39cb62499ece34806a2e14872549911232c0a08fbe\": container with ID starting with cda7895ca37b96582922ae39cb62499ece34806a2e14872549911232c0a08fbe not found: ID does not exist" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.030067 4763 scope.go:117] "RemoveContainer" containerID="284e44420ca6a4ce40b83f02fb4713c58599921194aa9130fa02ac3dbad4a921" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.034968 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63404240-1608-4923-84ce-48d6686c2f2f-config-data" (OuterVolumeSpecName: "config-data") pod "63404240-1608-4923-84ce-48d6686c2f2f" (UID: "63404240-1608-4923-84ce-48d6686c2f2f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.053057 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6552af9b-be77-4433-b5b1-023b3f13ca28-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "6552af9b-be77-4433-b5b1-023b3f13ca28" (UID: "6552af9b-be77-4433-b5b1-023b3f13ca28"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.053935 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km8z8\" (UniqueName: \"kubernetes.io/projected/63404240-1608-4923-84ce-48d6686c2f2f-kube-api-access-km8z8\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.053966 4763 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6552af9b-be77-4433-b5b1-023b3f13ca28-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.053980 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6552af9b-be77-4433-b5b1-023b3f13ca28-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.053993 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63404240-1608-4923-84ce-48d6686c2f2f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.054011 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6552af9b-be77-4433-b5b1-023b3f13ca28-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.054023 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63404240-1608-4923-84ce-48d6686c2f2f-logs\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.054033 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kks2m\" (UniqueName: \"kubernetes.io/projected/6552af9b-be77-4433-b5b1-023b3f13ca28-kube-api-access-kks2m\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.054046 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6552af9b-be77-4433-b5b1-023b3f13ca28-logs\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.054056 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63404240-1608-4923-84ce-48d6686c2f2f-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.094165 4763 scope.go:117] "RemoveContainer" containerID="659abd9031db0295b2f733bde346ceceaa5b96670cba55f261a8d9167a837969" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.117357 4763 scope.go:117] "RemoveContainer" containerID="284e44420ca6a4ce40b83f02fb4713c58599921194aa9130fa02ac3dbad4a921" Dec 05 12:12:27 crc kubenswrapper[4763]: E1205 12:12:27.118238 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"284e44420ca6a4ce40b83f02fb4713c58599921194aa9130fa02ac3dbad4a921\": container with ID starting with 284e44420ca6a4ce40b83f02fb4713c58599921194aa9130fa02ac3dbad4a921 not found: ID does not exist" containerID="284e44420ca6a4ce40b83f02fb4713c58599921194aa9130fa02ac3dbad4a921" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.118287 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"284e44420ca6a4ce40b83f02fb4713c58599921194aa9130fa02ac3dbad4a921"} err="failed to get container status 
\"284e44420ca6a4ce40b83f02fb4713c58599921194aa9130fa02ac3dbad4a921\": rpc error: code = NotFound desc = could not find container \"284e44420ca6a4ce40b83f02fb4713c58599921194aa9130fa02ac3dbad4a921\": container with ID starting with 284e44420ca6a4ce40b83f02fb4713c58599921194aa9130fa02ac3dbad4a921 not found: ID does not exist" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.118348 4763 scope.go:117] "RemoveContainer" containerID="659abd9031db0295b2f733bde346ceceaa5b96670cba55f261a8d9167a837969" Dec 05 12:12:27 crc kubenswrapper[4763]: E1205 12:12:27.118727 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"659abd9031db0295b2f733bde346ceceaa5b96670cba55f261a8d9167a837969\": container with ID starting with 659abd9031db0295b2f733bde346ceceaa5b96670cba55f261a8d9167a837969 not found: ID does not exist" containerID="659abd9031db0295b2f733bde346ceceaa5b96670cba55f261a8d9167a837969" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.118795 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"659abd9031db0295b2f733bde346ceceaa5b96670cba55f261a8d9167a837969"} err="failed to get container status \"659abd9031db0295b2f733bde346ceceaa5b96670cba55f261a8d9167a837969\": rpc error: code = NotFound desc = could not find container \"659abd9031db0295b2f733bde346ceceaa5b96670cba55f261a8d9167a837969\": container with ID starting with 659abd9031db0295b2f733bde346ceceaa5b96670cba55f261a8d9167a837969 not found: ID does not exist" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.366808 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.388848 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.396758 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.410903 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.423348 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 05 12:12:27 crc kubenswrapper[4763]: E1205 12:12:27.423860 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63404240-1608-4923-84ce-48d6686c2f2f" containerName="nova-api-api" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.423884 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="63404240-1608-4923-84ce-48d6686c2f2f" containerName="nova-api-api" Dec 05 12:12:27 crc kubenswrapper[4763]: E1205 12:12:27.423902 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63404240-1608-4923-84ce-48d6686c2f2f" containerName="nova-api-log" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.423912 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="63404240-1608-4923-84ce-48d6686c2f2f" containerName="nova-api-log" Dec 05 12:12:27 crc kubenswrapper[4763]: E1205 12:12:27.423937 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6552af9b-be77-4433-b5b1-023b3f13ca28" containerName="nova-metadata-metadata" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.423943 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6552af9b-be77-4433-b5b1-023b3f13ca28" containerName="nova-metadata-metadata" Dec 05 12:12:27 crc kubenswrapper[4763]: E1205 
12:12:27.423958 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6552af9b-be77-4433-b5b1-023b3f13ca28" containerName="nova-metadata-log" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.423964 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6552af9b-be77-4433-b5b1-023b3f13ca28" containerName="nova-metadata-log" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.424183 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="6552af9b-be77-4433-b5b1-023b3f13ca28" containerName="nova-metadata-metadata" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.424206 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="63404240-1608-4923-84ce-48d6686c2f2f" containerName="nova-api-api" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.424217 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="6552af9b-be77-4433-b5b1-023b3f13ca28" containerName="nova-metadata-log" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.424233 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="63404240-1608-4923-84ce-48d6686c2f2f" containerName="nova-api-log" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.425447 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.427518 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.427688 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.467535 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.469197 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.471138 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.479844 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.492516 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.578362 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e519d490-70c9-43d1-87a3-8b559eb60f16-config-data\") pod \"nova-metadata-0\" (UID: \"e519d490-70c9-43d1-87a3-8b559eb60f16\") " pod="openstack/nova-metadata-0" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.578543 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a204f6d6-ff29-4689-bbb1-110757d005b6-config-data\") pod \"nova-api-0\" (UID: \"a204f6d6-ff29-4689-bbb1-110757d005b6\") " pod="openstack/nova-api-0" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.578645 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e519d490-70c9-43d1-87a3-8b559eb60f16-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e519d490-70c9-43d1-87a3-8b559eb60f16\") " pod="openstack/nova-metadata-0" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.578711 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np8jq\" (UniqueName: \"kubernetes.io/projected/a204f6d6-ff29-4689-bbb1-110757d005b6-kube-api-access-np8jq\") pod \"nova-api-0\" (UID: \"a204f6d6-ff29-4689-bbb1-110757d005b6\") " pod="openstack/nova-api-0" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.578812 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e519d490-70c9-43d1-87a3-8b559eb60f16-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e519d490-70c9-43d1-87a3-8b559eb60f16\") " pod="openstack/nova-metadata-0" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.579299 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a204f6d6-ff29-4689-bbb1-110757d005b6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a204f6d6-ff29-4689-bbb1-110757d005b6\") " pod="openstack/nova-api-0" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.579427 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e519d490-70c9-43d1-87a3-8b559eb60f16-logs\") pod \"nova-metadata-0\" (UID: \"e519d490-70c9-43d1-87a3-8b559eb60f16\") " pod="openstack/nova-metadata-0" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.579523 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp5cn\" (UniqueName: \"kubernetes.io/projected/e519d490-70c9-43d1-87a3-8b559eb60f16-kube-api-access-sp5cn\") pod \"nova-metadata-0\" (UID: \"e519d490-70c9-43d1-87a3-8b559eb60f16\") " pod="openstack/nova-metadata-0" 
Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.579548 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a204f6d6-ff29-4689-bbb1-110757d005b6-logs\") pod \"nova-api-0\" (UID: \"a204f6d6-ff29-4689-bbb1-110757d005b6\") " pod="openstack/nova-api-0" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.582184 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cqhd5" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.681332 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a204f6d6-ff29-4689-bbb1-110757d005b6-config-data\") pod \"nova-api-0\" (UID: \"a204f6d6-ff29-4689-bbb1-110757d005b6\") " pod="openstack/nova-api-0" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.681427 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e519d490-70c9-43d1-87a3-8b559eb60f16-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e519d490-70c9-43d1-87a3-8b559eb60f16\") " pod="openstack/nova-metadata-0" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.681454 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np8jq\" (UniqueName: \"kubernetes.io/projected/a204f6d6-ff29-4689-bbb1-110757d005b6-kube-api-access-np8jq\") pod \"nova-api-0\" (UID: \"a204f6d6-ff29-4689-bbb1-110757d005b6\") " pod="openstack/nova-api-0" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.681480 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e519d490-70c9-43d1-87a3-8b559eb60f16-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e519d490-70c9-43d1-87a3-8b559eb60f16\") " pod="openstack/nova-metadata-0" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.681532 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a204f6d6-ff29-4689-bbb1-110757d005b6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a204f6d6-ff29-4689-bbb1-110757d005b6\") " pod="openstack/nova-api-0" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.681594 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e519d490-70c9-43d1-87a3-8b559eb60f16-logs\") pod \"nova-metadata-0\" (UID: \"e519d490-70c9-43d1-87a3-8b559eb60f16\") " pod="openstack/nova-metadata-0" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.681653 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp5cn\" (UniqueName: \"kubernetes.io/projected/e519d490-70c9-43d1-87a3-8b559eb60f16-kube-api-access-sp5cn\") pod \"nova-metadata-0\" (UID: \"e519d490-70c9-43d1-87a3-8b559eb60f16\") " pod="openstack/nova-metadata-0" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.681679 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a204f6d6-ff29-4689-bbb1-110757d005b6-logs\") pod \"nova-api-0\" (UID: \"a204f6d6-ff29-4689-bbb1-110757d005b6\") " pod="openstack/nova-api-0" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.681717 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e519d490-70c9-43d1-87a3-8b559eb60f16-config-data\") pod \"nova-metadata-0\" (UID: \"e519d490-70c9-43d1-87a3-8b559eb60f16\") " pod="openstack/nova-metadata-0" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.682219 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e519d490-70c9-43d1-87a3-8b559eb60f16-logs\") pod \"nova-metadata-0\" (UID: \"e519d490-70c9-43d1-87a3-8b559eb60f16\") " pod="openstack/nova-metadata-0" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.682504 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a204f6d6-ff29-4689-bbb1-110757d005b6-logs\") pod \"nova-api-0\" (UID: \"a204f6d6-ff29-4689-bbb1-110757d005b6\") " pod="openstack/nova-api-0" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.686323 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e519d490-70c9-43d1-87a3-8b559eb60f16-config-data\") pod \"nova-metadata-0\" (UID: \"e519d490-70c9-43d1-87a3-8b559eb60f16\") " pod="openstack/nova-metadata-0" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.686705 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a204f6d6-ff29-4689-bbb1-110757d005b6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a204f6d6-ff29-4689-bbb1-110757d005b6\") " pod="openstack/nova-api-0" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.687072 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e519d490-70c9-43d1-87a3-8b559eb60f16-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e519d490-70c9-43d1-87a3-8b559eb60f16\") " pod="openstack/nova-metadata-0" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.687164 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e519d490-70c9-43d1-87a3-8b559eb60f16-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e519d490-70c9-43d1-87a3-8b559eb60f16\") " pod="openstack/nova-metadata-0" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.688419 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a204f6d6-ff29-4689-bbb1-110757d005b6-config-data\") pod \"nova-api-0\" (UID: \"a204f6d6-ff29-4689-bbb1-110757d005b6\") " pod="openstack/nova-api-0" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.699642 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp5cn\" (UniqueName: \"kubernetes.io/projected/e519d490-70c9-43d1-87a3-8b559eb60f16-kube-api-access-sp5cn\") pod \"nova-metadata-0\" (UID: \"e519d490-70c9-43d1-87a3-8b559eb60f16\") " pod="openstack/nova-metadata-0" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.700877 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np8jq\" (UniqueName: \"kubernetes.io/projected/a204f6d6-ff29-4689-bbb1-110757d005b6-kube-api-access-np8jq\") pod \"nova-api-0\" (UID: \"a204f6d6-ff29-4689-bbb1-110757d005b6\") " pod="openstack/nova-api-0" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.783186 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/91392e07-ce32-4e73-a80f-17e4749ab9da-utilities\") pod \"91392e07-ce32-4e73-a80f-17e4749ab9da\" (UID: \"91392e07-ce32-4e73-a80f-17e4749ab9da\") " Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.783290 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9g6q\" (UniqueName: \"kubernetes.io/projected/91392e07-ce32-4e73-a80f-17e4749ab9da-kube-api-access-c9g6q\") pod \"91392e07-ce32-4e73-a80f-17e4749ab9da\" (UID: \"91392e07-ce32-4e73-a80f-17e4749ab9da\") " Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.783441 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91392e07-ce32-4e73-a80f-17e4749ab9da-catalog-content\") pod \"91392e07-ce32-4e73-a80f-17e4749ab9da\" (UID: \"91392e07-ce32-4e73-a80f-17e4749ab9da\") " Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.784144 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91392e07-ce32-4e73-a80f-17e4749ab9da-utilities" (OuterVolumeSpecName: "utilities") pod "91392e07-ce32-4e73-a80f-17e4749ab9da" (UID: "91392e07-ce32-4e73-a80f-17e4749ab9da"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.787627 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91392e07-ce32-4e73-a80f-17e4749ab9da-kube-api-access-c9g6q" (OuterVolumeSpecName: "kube-api-access-c9g6q") pod "91392e07-ce32-4e73-a80f-17e4749ab9da" (UID: "91392e07-ce32-4e73-a80f-17e4749ab9da"). InnerVolumeSpecName "kube-api-access-c9g6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.806023 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63404240-1608-4923-84ce-48d6686c2f2f" path="/var/lib/kubelet/pods/63404240-1608-4923-84ce-48d6686c2f2f/volumes" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.806799 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6552af9b-be77-4433-b5b1-023b3f13ca28" path="/var/lib/kubelet/pods/6552af9b-be77-4433-b5b1-023b3f13ca28/volumes" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.878884 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.886430 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91392e07-ce32-4e73-a80f-17e4749ab9da-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.886698 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9g6q\" (UniqueName: \"kubernetes.io/projected/91392e07-ce32-4e73-a80f-17e4749ab9da-kube-api-access-c9g6q\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.896288 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.904897 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91392e07-ce32-4e73-a80f-17e4749ab9da-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "91392e07-ce32-4e73-a80f-17e4749ab9da" (UID: "91392e07-ce32-4e73-a80f-17e4749ab9da"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:12:27 crc kubenswrapper[4763]: I1205 12:12:27.989633 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91392e07-ce32-4e73-a80f-17e4749ab9da-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:28 crc kubenswrapper[4763]: I1205 12:12:28.006431 4763 generic.go:334] "Generic (PLEG): container finished" podID="91392e07-ce32-4e73-a80f-17e4749ab9da" containerID="e32edefe87d0c177b988f4e2df10bb07577e1ba9442bc3f26167851efe960d2a" exitCode=0 Dec 05 12:12:28 crc kubenswrapper[4763]: I1205 12:12:28.006600 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cqhd5" Dec 05 12:12:28 crc kubenswrapper[4763]: I1205 12:12:28.006877 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cqhd5" event={"ID":"91392e07-ce32-4e73-a80f-17e4749ab9da","Type":"ContainerDied","Data":"e32edefe87d0c177b988f4e2df10bb07577e1ba9442bc3f26167851efe960d2a"} Dec 05 12:12:28 crc kubenswrapper[4763]: I1205 12:12:28.006929 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cqhd5" event={"ID":"91392e07-ce32-4e73-a80f-17e4749ab9da","Type":"ContainerDied","Data":"342c27b3a6bcd5363e52c8cadad03b14ac83d5623e6864ef938fdcb987e5d356"} Dec 05 12:12:28 crc kubenswrapper[4763]: I1205 12:12:28.006953 4763 scope.go:117] "RemoveContainer" containerID="e32edefe87d0c177b988f4e2df10bb07577e1ba9442bc3f26167851efe960d2a" Dec 05 12:12:28 crc kubenswrapper[4763]: I1205 12:12:28.043311 4763 scope.go:117] "RemoveContainer" containerID="6d9c631169d495a295bc5b310b9d1e6358c88c8948967f5cd9ad4a97caee5866" Dec 05 12:12:28 crc kubenswrapper[4763]: I1205 12:12:28.060269 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cqhd5"] Dec 05 12:12:28 crc kubenswrapper[4763]: I1205 12:12:28.069822 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cqhd5"] Dec 05 12:12:28 crc kubenswrapper[4763]: I1205 12:12:28.100803 4763 scope.go:117] "RemoveContainer" containerID="119cc38239dcc038b759369b0b938bcb54d77517d6c5a97615646cb92c146a26" Dec 05 12:12:28 crc kubenswrapper[4763]: I1205 12:12:28.127371 4763 scope.go:117] "RemoveContainer" containerID="e32edefe87d0c177b988f4e2df10bb07577e1ba9442bc3f26167851efe960d2a" Dec 05 12:12:28 crc kubenswrapper[4763]: E1205 12:12:28.128085 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e32edefe87d0c177b988f4e2df10bb07577e1ba9442bc3f26167851efe960d2a\": container with ID starting with e32edefe87d0c177b988f4e2df10bb07577e1ba9442bc3f26167851efe960d2a not found: ID does not exist" containerID="e32edefe87d0c177b988f4e2df10bb07577e1ba9442bc3f26167851efe960d2a" Dec 05 12:12:28 crc kubenswrapper[4763]: I1205 12:12:28.128131 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e32edefe87d0c177b988f4e2df10bb07577e1ba9442bc3f26167851efe960d2a"} err="failed to get container status \"e32edefe87d0c177b988f4e2df10bb07577e1ba9442bc3f26167851efe960d2a\": rpc error: code = NotFound desc = could not find container \"e32edefe87d0c177b988f4e2df10bb07577e1ba9442bc3f26167851efe960d2a\": container with ID starting with e32edefe87d0c177b988f4e2df10bb07577e1ba9442bc3f26167851efe960d2a not found: ID does not exist" Dec 05 12:12:28 crc 
kubenswrapper[4763]: I1205 12:12:28.128159 4763 scope.go:117] "RemoveContainer" containerID="6d9c631169d495a295bc5b310b9d1e6358c88c8948967f5cd9ad4a97caee5866" Dec 05 12:12:28 crc kubenswrapper[4763]: E1205 12:12:28.128639 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d9c631169d495a295bc5b310b9d1e6358c88c8948967f5cd9ad4a97caee5866\": container with ID starting with 6d9c631169d495a295bc5b310b9d1e6358c88c8948967f5cd9ad4a97caee5866 not found: ID does not exist" containerID="6d9c631169d495a295bc5b310b9d1e6358c88c8948967f5cd9ad4a97caee5866" Dec 05 12:12:28 crc kubenswrapper[4763]: I1205 12:12:28.128681 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d9c631169d495a295bc5b310b9d1e6358c88c8948967f5cd9ad4a97caee5866"} err="failed to get container status \"6d9c631169d495a295bc5b310b9d1e6358c88c8948967f5cd9ad4a97caee5866\": rpc error: code = NotFound desc = could not find container \"6d9c631169d495a295bc5b310b9d1e6358c88c8948967f5cd9ad4a97caee5866\": container with ID starting with 6d9c631169d495a295bc5b310b9d1e6358c88c8948967f5cd9ad4a97caee5866 not found: ID does not exist" Dec 05 12:12:28 crc kubenswrapper[4763]: I1205 12:12:28.128707 4763 scope.go:117] "RemoveContainer" containerID="119cc38239dcc038b759369b0b938bcb54d77517d6c5a97615646cb92c146a26" Dec 05 12:12:28 crc kubenswrapper[4763]: E1205 12:12:28.129165 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"119cc38239dcc038b759369b0b938bcb54d77517d6c5a97615646cb92c146a26\": container with ID starting with 119cc38239dcc038b759369b0b938bcb54d77517d6c5a97615646cb92c146a26 not found: ID does not exist" containerID="119cc38239dcc038b759369b0b938bcb54d77517d6c5a97615646cb92c146a26" Dec 05 12:12:28 crc kubenswrapper[4763]: I1205 12:12:28.129190 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"119cc38239dcc038b759369b0b938bcb54d77517d6c5a97615646cb92c146a26"} err="failed to get container status \"119cc38239dcc038b759369b0b938bcb54d77517d6c5a97615646cb92c146a26\": rpc error: code = NotFound desc = could not find container \"119cc38239dcc038b759369b0b938bcb54d77517d6c5a97615646cb92c146a26\": container with ID starting with 119cc38239dcc038b759369b0b938bcb54d77517d6c5a97615646cb92c146a26 not found: ID does not exist" Dec 05 12:12:28 crc kubenswrapper[4763]: I1205 12:12:28.371287 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 12:12:28 crc kubenswrapper[4763]: W1205 12:12:28.375443 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode519d490_70c9_43d1_87a3_8b559eb60f16.slice/crio-78d9938f6a8c8ad846b748b7757776dcb16a8828cbad43b0ae4fb26d1bf6fe5c WatchSource:0}: Error finding container 78d9938f6a8c8ad846b748b7757776dcb16a8828cbad43b0ae4fb26d1bf6fe5c: Status 404 returned error can't find the container with id 78d9938f6a8c8ad846b748b7757776dcb16a8828cbad43b0ae4fb26d1bf6fe5c Dec 05 12:12:28 crc kubenswrapper[4763]: I1205 12:12:28.448129 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 12:12:28 crc kubenswrapper[4763]: W1205 12:12:28.453655 4763 manager.go:1169] Failed to process watch event {EventType:0 
Dec 05 12:12:29 crc kubenswrapper[4763]: I1205 12:12:29.025118 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a204f6d6-ff29-4689-bbb1-110757d005b6","Type":"ContainerStarted","Data":"96977af08a5295dd413de22cdb7acea9ed93d9267ad7a73ffc0280733aa91eb0"}
Dec 05 12:12:29 crc kubenswrapper[4763]: I1205 12:12:29.026618 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a204f6d6-ff29-4689-bbb1-110757d005b6","Type":"ContainerStarted","Data":"662dab9c6847115eab966d3cfd75d28eec15ab0c0e5622aa0383703f5badbd13"}
Dec 05 12:12:29 crc kubenswrapper[4763]: I1205 12:12:29.026742 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a204f6d6-ff29-4689-bbb1-110757d005b6","Type":"ContainerStarted","Data":"8182f388d803ac0ff164e81262c83b8ca3b4211198c7673d49bbd4067783d81f"}
Dec 05 12:12:29 crc kubenswrapper[4763]: I1205 12:12:29.026858 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e519d490-70c9-43d1-87a3-8b559eb60f16","Type":"ContainerStarted","Data":"f544fce27c107d4c023d6841bea473b4de5912c8841f9b0822b2134fb0c4271d"}
Dec 05 12:12:29 crc kubenswrapper[4763]: I1205 12:12:29.026969 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e519d490-70c9-43d1-87a3-8b559eb60f16","Type":"ContainerStarted","Data":"155194e26749779913f9654665839d02c729c47910f33980acacb654c4b898f8"}
Dec 05 12:12:29 crc kubenswrapper[4763]: I1205 12:12:29.027061 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e519d490-70c9-43d1-87a3-8b559eb60f16","Type":"ContainerStarted","Data":"78d9938f6a8c8ad846b748b7757776dcb16a8828cbad43b0ae4fb26d1bf6fe5c"}
Dec 05 12:12:29 crc kubenswrapper[4763]: I1205 12:12:29.047001 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.046980798 podStartE2EDuration="2.046980798s" podCreationTimestamp="2025-12-05 12:12:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:12:29.043675971 +0000 UTC m=+1433.536390694" watchObservedRunningTime="2025-12-05 12:12:29.046980798 +0000 UTC m=+1433.539695521"
Dec 05 12:12:29 crc kubenswrapper[4763]: I1205 12:12:29.067038 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.067012181 podStartE2EDuration="2.067012181s" podCreationTimestamp="2025-12-05 12:12:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:12:29.061640059 +0000 UTC m=+1433.554354802" watchObservedRunningTime="2025-12-05 12:12:29.067012181 +0000 UTC m=+1433.559726924"
Dec 05 12:12:29 crc kubenswrapper[4763]: I1205 12:12:29.797810 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91392e07-ce32-4e73-a80f-17e4749ab9da" path="/var/lib/kubelet/pods/91392e07-ce32-4e73-a80f-17e4749ab9da/volumes"
path="/var/lib/kubelet/pods/91392e07-ce32-4e73-a80f-17e4749ab9da/volumes" Dec 05 12:12:32 crc kubenswrapper[4763]: I1205 12:12:32.879096 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 12:12:32 crc kubenswrapper[4763]: I1205 12:12:32.879479 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 12:12:37 crc kubenswrapper[4763]: I1205 12:12:37.707715 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 12:12:37 crc kubenswrapper[4763]: I1205 12:12:37.879448 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 12:12:37 crc kubenswrapper[4763]: I1205 12:12:37.879532 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 12:12:37 crc kubenswrapper[4763]: I1205 12:12:37.880401 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j89n7\" (UniqueName: \"kubernetes.io/projected/1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0-kube-api-access-j89n7\") pod \"1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0\" (UID: \"1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0\") " Dec 05 12:12:37 crc kubenswrapper[4763]: I1205 12:12:37.880488 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0-config-data\") pod \"1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0\" (UID: \"1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0\") " Dec 05 12:12:37 crc kubenswrapper[4763]: I1205 12:12:37.880586 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0-combined-ca-bundle\") pod \"1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0\" (UID: \"1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0\") " Dec 05 12:12:37 crc kubenswrapper[4763]: I1205 12:12:37.886751 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0-kube-api-access-j89n7" (OuterVolumeSpecName: "kube-api-access-j89n7") pod "1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0" (UID: "1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0"). InnerVolumeSpecName "kube-api-access-j89n7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:12:37 crc kubenswrapper[4763]: I1205 12:12:37.898610 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 12:12:37 crc kubenswrapper[4763]: I1205 12:12:37.898670 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 12:12:37 crc kubenswrapper[4763]: I1205 12:12:37.910955 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0-config-data" (OuterVolumeSpecName: "config-data") pod "1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0" (UID: "1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:12:37 crc kubenswrapper[4763]: I1205 12:12:37.916328 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0" (UID: "1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:12:37 crc kubenswrapper[4763]: I1205 12:12:37.960710 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 12:12:37 crc kubenswrapper[4763]: I1205 12:12:37.982685 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j89n7\" (UniqueName: \"kubernetes.io/projected/1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0-kube-api-access-j89n7\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:37 crc kubenswrapper[4763]: I1205 12:12:37.982731 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:37 crc kubenswrapper[4763]: I1205 12:12:37.982744 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.084116 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/885001ef-6b9f-4d63-a690-fb3c78b7e037-config-data\") pod \"885001ef-6b9f-4d63-a690-fb3c78b7e037\" (UID: \"885001ef-6b9f-4d63-a690-fb3c78b7e037\") " Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.085033 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4l96m\" (UniqueName: \"kubernetes.io/projected/885001ef-6b9f-4d63-a690-fb3c78b7e037-kube-api-access-4l96m\") pod \"885001ef-6b9f-4d63-a690-fb3c78b7e037\" (UID: \"885001ef-6b9f-4d63-a690-fb3c78b7e037\") " Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.085321 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/885001ef-6b9f-4d63-a690-fb3c78b7e037-combined-ca-bundle\") pod \"885001ef-6b9f-4d63-a690-fb3c78b7e037\" (UID: \"885001ef-6b9f-4d63-a690-fb3c78b7e037\") " Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.092132 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/885001ef-6b9f-4d63-a690-fb3c78b7e037-kube-api-access-4l96m" (OuterVolumeSpecName: "kube-api-access-4l96m") pod "885001ef-6b9f-4d63-a690-fb3c78b7e037" (UID: "885001ef-6b9f-4d63-a690-fb3c78b7e037"). InnerVolumeSpecName "kube-api-access-4l96m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.113576 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/885001ef-6b9f-4d63-a690-fb3c78b7e037-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "885001ef-6b9f-4d63-a690-fb3c78b7e037" (UID: "885001ef-6b9f-4d63-a690-fb3c78b7e037"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.114422 4763 generic.go:334] "Generic (PLEG): container finished" podID="885001ef-6b9f-4d63-a690-fb3c78b7e037" containerID="91b6126c344015cc9d980d66f232168f4d4b17c930f3b0425a8b3020a4790b74" exitCode=137 Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.114491 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.114483 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"885001ef-6b9f-4d63-a690-fb3c78b7e037","Type":"ContainerDied","Data":"91b6126c344015cc9d980d66f232168f4d4b17c930f3b0425a8b3020a4790b74"} Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.114554 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"885001ef-6b9f-4d63-a690-fb3c78b7e037","Type":"ContainerDied","Data":"d9778779680b20f9b1365ea78e6686acb587d64684baaa9d01a6d02418be42eb"} Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.114584 4763 scope.go:117] "RemoveContainer" containerID="91b6126c344015cc9d980d66f232168f4d4b17c930f3b0425a8b3020a4790b74" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.117885 4763 generic.go:334] "Generic (PLEG): container finished" podID="1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0" containerID="8ddf9c94aa959b7cb0d1eee10c6feba8d26b8039de83de9e4cc6ce8ac1ceccc7" exitCode=137 Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.117926 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0","Type":"ContainerDied","Data":"8ddf9c94aa959b7cb0d1eee10c6feba8d26b8039de83de9e4cc6ce8ac1ceccc7"} Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.117947 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0","Type":"ContainerDied","Data":"f44fd6654fe30634d3233f2eece391e2f90333b149d23800931ac7309ba0254e"} Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.117990 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.134489 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/885001ef-6b9f-4d63-a690-fb3c78b7e037-config-data" (OuterVolumeSpecName: "config-data") pod "885001ef-6b9f-4d63-a690-fb3c78b7e037" (UID: "885001ef-6b9f-4d63-a690-fb3c78b7e037"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.151989 4763 scope.go:117] "RemoveContainer" containerID="91b6126c344015cc9d980d66f232168f4d4b17c930f3b0425a8b3020a4790b74" Dec 05 12:12:38 crc kubenswrapper[4763]: E1205 12:12:38.156728 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91b6126c344015cc9d980d66f232168f4d4b17c930f3b0425a8b3020a4790b74\": container with ID starting with 91b6126c344015cc9d980d66f232168f4d4b17c930f3b0425a8b3020a4790b74 not found: ID does not exist" containerID="91b6126c344015cc9d980d66f232168f4d4b17c930f3b0425a8b3020a4790b74" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.156805 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91b6126c344015cc9d980d66f232168f4d4b17c930f3b0425a8b3020a4790b74"} err="failed to get container status \"91b6126c344015cc9d980d66f232168f4d4b17c930f3b0425a8b3020a4790b74\": rpc error: code = NotFound desc = could not find container \"91b6126c344015cc9d980d66f232168f4d4b17c930f3b0425a8b3020a4790b74\": container with ID starting with 91b6126c344015cc9d980d66f232168f4d4b17c930f3b0425a8b3020a4790b74 not found: ID does not exist" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.156836 4763 scope.go:117] "RemoveContainer" containerID="8ddf9c94aa959b7cb0d1eee10c6feba8d26b8039de83de9e4cc6ce8ac1ceccc7" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.167851 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.188467 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4l96m\" (UniqueName: \"kubernetes.io/projected/885001ef-6b9f-4d63-a690-fb3c78b7e037-kube-api-access-4l96m\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.188528 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/885001ef-6b9f-4d63-a690-fb3c78b7e037-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.188548 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/885001ef-6b9f-4d63-a690-fb3c78b7e037-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.196889 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.208235 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 12:12:38 crc kubenswrapper[4763]: E1205 12:12:38.209029 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91392e07-ce32-4e73-a80f-17e4749ab9da" containerName="extract-utilities" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.209072 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="91392e07-ce32-4e73-a80f-17e4749ab9da" containerName="extract-utilities" Dec 05 12:12:38 crc kubenswrapper[4763]: E1205 12:12:38.209110 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91392e07-ce32-4e73-a80f-17e4749ab9da" containerName="registry-server" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.209118 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="91392e07-ce32-4e73-a80f-17e4749ab9da" containerName="registry-server" 
Dec 05 12:12:38 crc kubenswrapper[4763]: E1205 12:12:38.209154 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91392e07-ce32-4e73-a80f-17e4749ab9da" containerName="extract-content"
Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.209166 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="91392e07-ce32-4e73-a80f-17e4749ab9da" containerName="extract-content"
Dec 05 12:12:38 crc kubenswrapper[4763]: E1205 12:12:38.209181 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="885001ef-6b9f-4d63-a690-fb3c78b7e037" containerName="nova-scheduler-scheduler"
Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.209186 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="885001ef-6b9f-4d63-a690-fb3c78b7e037" containerName="nova-scheduler-scheduler"
Dec 05 12:12:38 crc kubenswrapper[4763]: E1205 12:12:38.209206 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0" containerName="nova-cell1-novncproxy-novncproxy"
Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.209212 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0" containerName="nova-cell1-novncproxy-novncproxy"
Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.209401 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="91392e07-ce32-4e73-a80f-17e4749ab9da" containerName="registry-server"
Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.209431 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="885001ef-6b9f-4d63-a690-fb3c78b7e037" containerName="nova-scheduler-scheduler"
Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.209466 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0" containerName="nova-cell1-novncproxy-novncproxy"
Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.210178 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.214037 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.215092 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.215436 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.220077 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.228867 4763 scope.go:117] "RemoveContainer" containerID="8ddf9c94aa959b7cb0d1eee10c6feba8d26b8039de83de9e4cc6ce8ac1ceccc7" Dec 05 12:12:38 crc kubenswrapper[4763]: E1205 12:12:38.236963 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ddf9c94aa959b7cb0d1eee10c6feba8d26b8039de83de9e4cc6ce8ac1ceccc7\": container with ID starting with 8ddf9c94aa959b7cb0d1eee10c6feba8d26b8039de83de9e4cc6ce8ac1ceccc7 not found: ID does not exist" containerID="8ddf9c94aa959b7cb0d1eee10c6feba8d26b8039de83de9e4cc6ce8ac1ceccc7" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.237028 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ddf9c94aa959b7cb0d1eee10c6feba8d26b8039de83de9e4cc6ce8ac1ceccc7"} err="failed to get container status \"8ddf9c94aa959b7cb0d1eee10c6feba8d26b8039de83de9e4cc6ce8ac1ceccc7\": rpc error: code = NotFound desc = could not find container \"8ddf9c94aa959b7cb0d1eee10c6feba8d26b8039de83de9e4cc6ce8ac1ceccc7\": container with ID starting with 8ddf9c94aa959b7cb0d1eee10c6feba8d26b8039de83de9e4cc6ce8ac1ceccc7 not found: ID does not exist" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.392237 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3577198-f0dd-4145-a9c5-d29a0d18d212-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c3577198-f0dd-4145-a9c5-d29a0d18d212\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.392566 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3577198-f0dd-4145-a9c5-d29a0d18d212-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c3577198-f0dd-4145-a9c5-d29a0d18d212\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.392670 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmhxg\" (UniqueName: \"kubernetes.io/projected/c3577198-f0dd-4145-a9c5-d29a0d18d212-kube-api-access-mmhxg\") pod \"nova-cell1-novncproxy-0\" (UID: \"c3577198-f0dd-4145-a9c5-d29a0d18d212\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.392705 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3577198-f0dd-4145-a9c5-d29a0d18d212-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"c3577198-f0dd-4145-a9c5-d29a0d18d212\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.392726 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3577198-f0dd-4145-a9c5-d29a0d18d212-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c3577198-f0dd-4145-a9c5-d29a0d18d212\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.469007 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.495756 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.503583 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3577198-f0dd-4145-a9c5-d29a0d18d212-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c3577198-f0dd-4145-a9c5-d29a0d18d212\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.503671 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3577198-f0dd-4145-a9c5-d29a0d18d212-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c3577198-f0dd-4145-a9c5-d29a0d18d212\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.504531 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3577198-f0dd-4145-a9c5-d29a0d18d212-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c3577198-f0dd-4145-a9c5-d29a0d18d212\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.505502 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3577198-f0dd-4145-a9c5-d29a0d18d212-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c3577198-f0dd-4145-a9c5-d29a0d18d212\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.505783 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmhxg\" (UniqueName: \"kubernetes.io/projected/c3577198-f0dd-4145-a9c5-d29a0d18d212-kube-api-access-mmhxg\") pod \"nova-cell1-novncproxy-0\" (UID: \"c3577198-f0dd-4145-a9c5-d29a0d18d212\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.543640 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3577198-f0dd-4145-a9c5-d29a0d18d212-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c3577198-f0dd-4145-a9c5-d29a0d18d212\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.544116 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3577198-f0dd-4145-a9c5-d29a0d18d212-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c3577198-f0dd-4145-a9c5-d29a0d18d212\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 
Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.544718 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3577198-f0dd-4145-a9c5-d29a0d18d212-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c3577198-f0dd-4145-a9c5-d29a0d18d212\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.547592 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.548429 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmhxg\" (UniqueName: \"kubernetes.io/projected/c3577198-f0dd-4145-a9c5-d29a0d18d212-kube-api-access-mmhxg\") pod \"nova-cell1-novncproxy-0\" (UID: \"c3577198-f0dd-4145-a9c5-d29a0d18d212\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.552607 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.556369 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.561222 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.710465 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx66f\" (UniqueName: \"kubernetes.io/projected/a08ac027-eaef-4ad0-8af6-45110d1c49e2-kube-api-access-fx66f\") pod \"nova-scheduler-0\" (UID: \"a08ac027-eaef-4ad0-8af6-45110d1c49e2\") " pod="openstack/nova-scheduler-0"
Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.710693 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a08ac027-eaef-4ad0-8af6-45110d1c49e2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a08ac027-eaef-4ad0-8af6-45110d1c49e2\") " pod="openstack/nova-scheduler-0"
Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.710738 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a08ac027-eaef-4ad0-8af6-45110d1c49e2-config-data\") pod \"nova-scheduler-0\" (UID: \"a08ac027-eaef-4ad0-8af6-45110d1c49e2\") " pod="openstack/nova-scheduler-0"
Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.812028 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx66f\" (UniqueName: \"kubernetes.io/projected/a08ac027-eaef-4ad0-8af6-45110d1c49e2-kube-api-access-fx66f\") pod \"nova-scheduler-0\" (UID: \"a08ac027-eaef-4ad0-8af6-45110d1c49e2\") " pod="openstack/nova-scheduler-0"
Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.812173 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a08ac027-eaef-4ad0-8af6-45110d1c49e2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a08ac027-eaef-4ad0-8af6-45110d1c49e2\") " pod="openstack/nova-scheduler-0"
\"a08ac027-eaef-4ad0-8af6-45110d1c49e2\") " pod="openstack/nova-scheduler-0" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.812202 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a08ac027-eaef-4ad0-8af6-45110d1c49e2-config-data\") pod \"nova-scheduler-0\" (UID: \"a08ac027-eaef-4ad0-8af6-45110d1c49e2\") " pod="openstack/nova-scheduler-0" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.815705 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a08ac027-eaef-4ad0-8af6-45110d1c49e2-config-data\") pod \"nova-scheduler-0\" (UID: \"a08ac027-eaef-4ad0-8af6-45110d1c49e2\") " pod="openstack/nova-scheduler-0" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.816020 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a08ac027-eaef-4ad0-8af6-45110d1c49e2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a08ac027-eaef-4ad0-8af6-45110d1c49e2\") " pod="openstack/nova-scheduler-0" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.837974 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.848120 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx66f\" (UniqueName: \"kubernetes.io/projected/a08ac027-eaef-4ad0-8af6-45110d1c49e2-kube-api-access-fx66f\") pod \"nova-scheduler-0\" (UID: \"a08ac027-eaef-4ad0-8af6-45110d1c49e2\") " pod="openstack/nova-scheduler-0" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.902911 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e519d490-70c9-43d1-87a3-8b559eb60f16" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.212:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.902949 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e519d490-70c9-43d1-87a3-8b559eb60f16" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.212:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.958831 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.992450 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a204f6d6-ff29-4689-bbb1-110757d005b6" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.213:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 12:12:38 crc kubenswrapper[4763]: I1205 12:12:38.992786 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a204f6d6-ff29-4689-bbb1-110757d005b6" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.213:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 12:12:39 crc kubenswrapper[4763]: I1205 12:12:39.357312 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 12:12:39 crc kubenswrapper[4763]: I1205 12:12:39.553681 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 12:12:39 crc kubenswrapper[4763]: I1205 12:12:39.803713 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0" path="/var/lib/kubelet/pods/1e6f4f31-3bc9-4c4d-86c5-da50af2fa5e0/volumes" Dec 05 12:12:39 crc kubenswrapper[4763]: I1205 12:12:39.804424 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="885001ef-6b9f-4d63-a690-fb3c78b7e037" path="/var/lib/kubelet/pods/885001ef-6b9f-4d63-a690-fb3c78b7e037/volumes" Dec 05 12:12:40 crc kubenswrapper[4763]: I1205 12:12:40.158970 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a08ac027-eaef-4ad0-8af6-45110d1c49e2","Type":"ContainerStarted","Data":"a6e87089becd202adebcc2419b7b154dc094b7c9d8cea46d6522ca5bc12f2f6d"} Dec 05 12:12:40 crc kubenswrapper[4763]: I1205 12:12:40.159530 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a08ac027-eaef-4ad0-8af6-45110d1c49e2","Type":"ContainerStarted","Data":"847ccdcd56c03e9421d015395acf76fce4e94def7de8ac837ad59e0ac93601fa"} Dec 05 12:12:40 crc kubenswrapper[4763]: I1205 12:12:40.161749 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c3577198-f0dd-4145-a9c5-d29a0d18d212","Type":"ContainerStarted","Data":"3567f84072f55bf7c1bcd6c1d6d063b3b75f7139255782afcf6646c70d7b2e90"} Dec 05 12:12:40 crc kubenswrapper[4763]: I1205 12:12:40.161808 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c3577198-f0dd-4145-a9c5-d29a0d18d212","Type":"ContainerStarted","Data":"efe61e1066d5a908405ffddf900f1ef3e103b5c024b4a8d2a08bceb805e0bdb1"} Dec 05 12:12:40 crc kubenswrapper[4763]: I1205 12:12:40.183389 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.183373692 podStartE2EDuration="2.183373692s" podCreationTimestamp="2025-12-05 12:12:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:12:40.181345827 +0000 UTC m=+1444.674060570" watchObservedRunningTime="2025-12-05 12:12:40.183373692 +0000 UTC m=+1444.676088415" Dec 05 12:12:40 crc kubenswrapper[4763]: I1205 12:12:40.222536 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" 
Dec 05 12:12:43 crc kubenswrapper[4763]: I1205 12:12:43.838413 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Dec 05 12:12:43 crc kubenswrapper[4763]: I1205 12:12:43.960315 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Dec 05 12:12:47 crc kubenswrapper[4763]: I1205 12:12:47.202375 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Dec 05 12:12:47 crc kubenswrapper[4763]: I1205 12:12:47.884466 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Dec 05 12:12:47 crc kubenswrapper[4763]: I1205 12:12:47.886148 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Dec 05 12:12:47 crc kubenswrapper[4763]: I1205 12:12:47.898558 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Dec 05 12:12:47 crc kubenswrapper[4763]: I1205 12:12:47.905661 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Dec 05 12:12:47 crc kubenswrapper[4763]: I1205 12:12:47.905740 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Dec 05 12:12:47 crc kubenswrapper[4763]: I1205 12:12:47.906190 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Dec 05 12:12:47 crc kubenswrapper[4763]: I1205 12:12:47.906534 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Dec 05 12:12:47 crc kubenswrapper[4763]: I1205 12:12:47.909982 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Dec 05 12:12:47 crc kubenswrapper[4763]: I1205 12:12:47.910038 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Dec 05 12:12:48 crc kubenswrapper[4763]: I1205 12:12:48.112841 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-mg9lr"]
Dec 05 12:12:48 crc kubenswrapper[4763]: I1205 12:12:48.114589 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-mg9lr"
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-mg9lr" Dec 05 12:12:48 crc kubenswrapper[4763]: I1205 12:12:48.133320 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-mg9lr"] Dec 05 12:12:48 crc kubenswrapper[4763]: I1205 12:12:48.222492 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7f9c67e-8dbf-4604-9407-ba5199add7e2-config\") pod \"dnsmasq-dns-cd5cbd7b9-mg9lr\" (UID: \"c7f9c67e-8dbf-4604-9407-ba5199add7e2\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mg9lr" Dec 05 12:12:48 crc kubenswrapper[4763]: I1205 12:12:48.222581 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9qzw\" (UniqueName: \"kubernetes.io/projected/c7f9c67e-8dbf-4604-9407-ba5199add7e2-kube-api-access-q9qzw\") pod \"dnsmasq-dns-cd5cbd7b9-mg9lr\" (UID: \"c7f9c67e-8dbf-4604-9407-ba5199add7e2\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mg9lr" Dec 05 12:12:48 crc kubenswrapper[4763]: I1205 12:12:48.222739 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7f9c67e-8dbf-4604-9407-ba5199add7e2-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-mg9lr\" (UID: \"c7f9c67e-8dbf-4604-9407-ba5199add7e2\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mg9lr" Dec 05 12:12:48 crc kubenswrapper[4763]: I1205 12:12:48.222800 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c7f9c67e-8dbf-4604-9407-ba5199add7e2-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-mg9lr\" (UID: \"c7f9c67e-8dbf-4604-9407-ba5199add7e2\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mg9lr" Dec 05 12:12:48 crc kubenswrapper[4763]: I1205 12:12:48.222880 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7f9c67e-8dbf-4604-9407-ba5199add7e2-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-mg9lr\" (UID: \"c7f9c67e-8dbf-4604-9407-ba5199add7e2\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mg9lr" Dec 05 12:12:48 crc kubenswrapper[4763]: I1205 12:12:48.222929 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7f9c67e-8dbf-4604-9407-ba5199add7e2-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-mg9lr\" (UID: \"c7f9c67e-8dbf-4604-9407-ba5199add7e2\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mg9lr" Dec 05 12:12:48 crc kubenswrapper[4763]: I1205 12:12:48.274214 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 05 12:12:48 crc kubenswrapper[4763]: I1205 12:12:48.326442 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7f9c67e-8dbf-4604-9407-ba5199add7e2-config\") pod \"dnsmasq-dns-cd5cbd7b9-mg9lr\" (UID: \"c7f9c67e-8dbf-4604-9407-ba5199add7e2\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mg9lr" Dec 05 12:12:48 crc kubenswrapper[4763]: I1205 12:12:48.327708 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9qzw\" (UniqueName: \"kubernetes.io/projected/c7f9c67e-8dbf-4604-9407-ba5199add7e2-kube-api-access-q9qzw\") pod \"dnsmasq-dns-cd5cbd7b9-mg9lr\" (UID: 
\"c7f9c67e-8dbf-4604-9407-ba5199add7e2\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mg9lr" Dec 05 12:12:48 crc kubenswrapper[4763]: I1205 12:12:48.327838 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7f9c67e-8dbf-4604-9407-ba5199add7e2-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-mg9lr\" (UID: \"c7f9c67e-8dbf-4604-9407-ba5199add7e2\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mg9lr" Dec 05 12:12:48 crc kubenswrapper[4763]: I1205 12:12:48.327877 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c7f9c67e-8dbf-4604-9407-ba5199add7e2-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-mg9lr\" (UID: \"c7f9c67e-8dbf-4604-9407-ba5199add7e2\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mg9lr" Dec 05 12:12:48 crc kubenswrapper[4763]: I1205 12:12:48.327940 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7f9c67e-8dbf-4604-9407-ba5199add7e2-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-mg9lr\" (UID: \"c7f9c67e-8dbf-4604-9407-ba5199add7e2\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mg9lr" Dec 05 12:12:48 crc kubenswrapper[4763]: I1205 12:12:48.327987 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7f9c67e-8dbf-4604-9407-ba5199add7e2-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-mg9lr\" (UID: \"c7f9c67e-8dbf-4604-9407-ba5199add7e2\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mg9lr" Dec 05 12:12:48 crc kubenswrapper[4763]: I1205 12:12:48.327398 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7f9c67e-8dbf-4604-9407-ba5199add7e2-config\") pod \"dnsmasq-dns-cd5cbd7b9-mg9lr\" (UID: \"c7f9c67e-8dbf-4604-9407-ba5199add7e2\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mg9lr" Dec 05 12:12:48 crc kubenswrapper[4763]: I1205 12:12:48.329377 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7f9c67e-8dbf-4604-9407-ba5199add7e2-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-mg9lr\" (UID: \"c7f9c67e-8dbf-4604-9407-ba5199add7e2\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mg9lr" Dec 05 12:12:48 crc kubenswrapper[4763]: I1205 12:12:48.330132 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c7f9c67e-8dbf-4604-9407-ba5199add7e2-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-mg9lr\" (UID: \"c7f9c67e-8dbf-4604-9407-ba5199add7e2\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mg9lr" Dec 05 12:12:48 crc kubenswrapper[4763]: I1205 12:12:48.330804 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7f9c67e-8dbf-4604-9407-ba5199add7e2-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-mg9lr\" (UID: \"c7f9c67e-8dbf-4604-9407-ba5199add7e2\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mg9lr" Dec 05 12:12:48 crc kubenswrapper[4763]: I1205 12:12:48.331261 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7f9c67e-8dbf-4604-9407-ba5199add7e2-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-mg9lr\" (UID: \"c7f9c67e-8dbf-4604-9407-ba5199add7e2\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mg9lr" Dec 05 12:12:48 crc kubenswrapper[4763]: I1205 
Dec 05 12:12:48 crc kubenswrapper[4763]: I1205 12:12:48.439323 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-mg9lr"
Dec 05 12:12:48 crc kubenswrapper[4763]: I1205 12:12:48.838617 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Dec 05 12:12:48 crc kubenswrapper[4763]: I1205 12:12:48.949664 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Dec 05 12:12:48 crc kubenswrapper[4763]: I1205 12:12:48.960377 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Dec 05 12:12:48 crc kubenswrapper[4763]: I1205 12:12:48.996202 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-mg9lr"]
Dec 05 12:12:49 crc kubenswrapper[4763]: W1205 12:12:49.000549 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7f9c67e_8dbf_4604_9407_ba5199add7e2.slice/crio-641d46d2499535761a09210beced70e169ba66f6214c6a4e4149fd56efe05984 WatchSource:0}: Error finding container 641d46d2499535761a09210beced70e169ba66f6214c6a4e4149fd56efe05984: Status 404 returned error can't find the container with id 641d46d2499535761a09210beced70e169ba66f6214c6a4e4149fd56efe05984
Dec 05 12:12:49 crc kubenswrapper[4763]: I1205 12:12:49.011977 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Dec 05 12:12:49 crc kubenswrapper[4763]: I1205 12:12:49.276966 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-mg9lr" event={"ID":"c7f9c67e-8dbf-4604-9407-ba5199add7e2","Type":"ContainerStarted","Data":"641d46d2499535761a09210beced70e169ba66f6214c6a4e4149fd56efe05984"}
Dec 05 12:12:49 crc kubenswrapper[4763]: I1205 12:12:49.314173 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Dec 05 12:12:49 crc kubenswrapper[4763]: I1205 12:12:49.338155 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Dec 05 12:12:49 crc kubenswrapper[4763]: I1205 12:12:49.520651 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-qjz64"]
Dec 05 12:12:49 crc kubenswrapper[4763]: I1205 12:12:49.522140 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-qjz64"
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-qjz64" Dec 05 12:12:49 crc kubenswrapper[4763]: I1205 12:12:49.524580 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 05 12:12:49 crc kubenswrapper[4763]: I1205 12:12:49.525463 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 05 12:12:49 crc kubenswrapper[4763]: I1205 12:12:49.548108 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-qjz64"] Dec 05 12:12:49 crc kubenswrapper[4763]: I1205 12:12:49.658529 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31abbaf4-1fc8-4f73-b549-ec6e262a08d0-config-data\") pod \"nova-cell1-cell-mapping-qjz64\" (UID: \"31abbaf4-1fc8-4f73-b549-ec6e262a08d0\") " pod="openstack/nova-cell1-cell-mapping-qjz64" Dec 05 12:12:49 crc kubenswrapper[4763]: I1205 12:12:49.658591 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31abbaf4-1fc8-4f73-b549-ec6e262a08d0-scripts\") pod \"nova-cell1-cell-mapping-qjz64\" (UID: \"31abbaf4-1fc8-4f73-b549-ec6e262a08d0\") " pod="openstack/nova-cell1-cell-mapping-qjz64" Dec 05 12:12:49 crc kubenswrapper[4763]: I1205 12:12:49.658678 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31abbaf4-1fc8-4f73-b549-ec6e262a08d0-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-qjz64\" (UID: \"31abbaf4-1fc8-4f73-b549-ec6e262a08d0\") " pod="openstack/nova-cell1-cell-mapping-qjz64" Dec 05 12:12:49 crc kubenswrapper[4763]: I1205 12:12:49.658754 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhtwq\" (UniqueName: \"kubernetes.io/projected/31abbaf4-1fc8-4f73-b549-ec6e262a08d0-kube-api-access-hhtwq\") pod \"nova-cell1-cell-mapping-qjz64\" (UID: \"31abbaf4-1fc8-4f73-b549-ec6e262a08d0\") " pod="openstack/nova-cell1-cell-mapping-qjz64" Dec 05 12:12:49 crc kubenswrapper[4763]: I1205 12:12:49.760286 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31abbaf4-1fc8-4f73-b549-ec6e262a08d0-config-data\") pod \"nova-cell1-cell-mapping-qjz64\" (UID: \"31abbaf4-1fc8-4f73-b549-ec6e262a08d0\") " pod="openstack/nova-cell1-cell-mapping-qjz64" Dec 05 12:12:49 crc kubenswrapper[4763]: I1205 12:12:49.760394 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31abbaf4-1fc8-4f73-b549-ec6e262a08d0-scripts\") pod \"nova-cell1-cell-mapping-qjz64\" (UID: \"31abbaf4-1fc8-4f73-b549-ec6e262a08d0\") " pod="openstack/nova-cell1-cell-mapping-qjz64" Dec 05 12:12:49 crc kubenswrapper[4763]: I1205 12:12:49.760469 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31abbaf4-1fc8-4f73-b549-ec6e262a08d0-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-qjz64\" (UID: \"31abbaf4-1fc8-4f73-b549-ec6e262a08d0\") " pod="openstack/nova-cell1-cell-mapping-qjz64" Dec 05 12:12:49 crc kubenswrapper[4763]: I1205 12:12:49.760506 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhtwq\" (UniqueName: 
\"kubernetes.io/projected/31abbaf4-1fc8-4f73-b549-ec6e262a08d0-kube-api-access-hhtwq\") pod \"nova-cell1-cell-mapping-qjz64\" (UID: \"31abbaf4-1fc8-4f73-b549-ec6e262a08d0\") " pod="openstack/nova-cell1-cell-mapping-qjz64" Dec 05 12:12:49 crc kubenswrapper[4763]: I1205 12:12:49.766453 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31abbaf4-1fc8-4f73-b549-ec6e262a08d0-scripts\") pod \"nova-cell1-cell-mapping-qjz64\" (UID: \"31abbaf4-1fc8-4f73-b549-ec6e262a08d0\") " pod="openstack/nova-cell1-cell-mapping-qjz64" Dec 05 12:12:49 crc kubenswrapper[4763]: I1205 12:12:49.767264 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31abbaf4-1fc8-4f73-b549-ec6e262a08d0-config-data\") pod \"nova-cell1-cell-mapping-qjz64\" (UID: \"31abbaf4-1fc8-4f73-b549-ec6e262a08d0\") " pod="openstack/nova-cell1-cell-mapping-qjz64" Dec 05 12:12:49 crc kubenswrapper[4763]: I1205 12:12:49.767296 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31abbaf4-1fc8-4f73-b549-ec6e262a08d0-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-qjz64\" (UID: \"31abbaf4-1fc8-4f73-b549-ec6e262a08d0\") " pod="openstack/nova-cell1-cell-mapping-qjz64" Dec 05 12:12:49 crc kubenswrapper[4763]: I1205 12:12:49.793476 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhtwq\" (UniqueName: \"kubernetes.io/projected/31abbaf4-1fc8-4f73-b549-ec6e262a08d0-kube-api-access-hhtwq\") pod \"nova-cell1-cell-mapping-qjz64\" (UID: \"31abbaf4-1fc8-4f73-b549-ec6e262a08d0\") " pod="openstack/nova-cell1-cell-mapping-qjz64" Dec 05 12:12:49 crc kubenswrapper[4763]: I1205 12:12:49.914086 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-qjz64" Dec 05 12:12:50 crc kubenswrapper[4763]: I1205 12:12:50.265128 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 12:12:50 crc kubenswrapper[4763]: I1205 12:12:50.266009 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5726fc7e-7c45-4714-a0d5-65b124f7d692" containerName="ceilometer-central-agent" containerID="cri-o://070b741db7d8658d18c20fa993edf281929be0b3578aecbd744c33e26b835c89" gracePeriod=30 Dec 05 12:12:50 crc kubenswrapper[4763]: I1205 12:12:50.266366 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5726fc7e-7c45-4714-a0d5-65b124f7d692" containerName="proxy-httpd" containerID="cri-o://09ac658728be057cb3fc01ec6c02d96e933089a6ea5ae8ef63465bf8f93e8c67" gracePeriod=30 Dec 05 12:12:50 crc kubenswrapper[4763]: I1205 12:12:50.266432 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5726fc7e-7c45-4714-a0d5-65b124f7d692" containerName="sg-core" containerID="cri-o://bf79196b78560ffce260527941cf671183756e3cf6b0d1745159886679290871" gracePeriod=30 Dec 05 12:12:50 crc kubenswrapper[4763]: I1205 12:12:50.266444 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5726fc7e-7c45-4714-a0d5-65b124f7d692" containerName="ceilometer-notification-agent" containerID="cri-o://87afc456c686737b618f84d1e2b3a90a9a7089d716f466d836746470dfb5b56a" gracePeriod=30 Dec 05 12:12:50 crc kubenswrapper[4763]: I1205 12:12:50.290053 4763 generic.go:334] "Generic (PLEG): container finished" podID="c7f9c67e-8dbf-4604-9407-ba5199add7e2" containerID="82897be6d18262c29fcdd8b02d568388c988635a35ba7ab874032fb7df8a9557" exitCode=0 Dec 05 12:12:50 crc kubenswrapper[4763]: I1205 12:12:50.290290 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-mg9lr" event={"ID":"c7f9c67e-8dbf-4604-9407-ba5199add7e2","Type":"ContainerDied","Data":"82897be6d18262c29fcdd8b02d568388c988635a35ba7ab874032fb7df8a9557"} Dec 05 12:12:50 crc kubenswrapper[4763]: I1205 12:12:50.454661 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-qjz64"] Dec 05 12:12:50 crc kubenswrapper[4763]: W1205 12:12:50.466719 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31abbaf4_1fc8_4f73_b549_ec6e262a08d0.slice/crio-6854bae82c306d4b4007ad06d3a2af37d0290341c94cb04d78a60e80a797af5b WatchSource:0}: Error finding container 6854bae82c306d4b4007ad06d3a2af37d0290341c94cb04d78a60e80a797af5b: Status 404 returned error can't find the container with id 6854bae82c306d4b4007ad06d3a2af37d0290341c94cb04d78a60e80a797af5b Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.317810 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-qjz64" event={"ID":"31abbaf4-1fc8-4f73-b549-ec6e262a08d0","Type":"ContainerStarted","Data":"f9da4deeab88d6c2e18623166d865cac62c2ed2c875a0585837ca15e1a656541"} Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.318187 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-qjz64" event={"ID":"31abbaf4-1fc8-4f73-b549-ec6e262a08d0","Type":"ContainerStarted","Data":"6854bae82c306d4b4007ad06d3a2af37d0290341c94cb04d78a60e80a797af5b"} Dec 05 12:12:51 crc 
Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.322379 4763 generic.go:334] "Generic (PLEG): container finished" podID="5726fc7e-7c45-4714-a0d5-65b124f7d692" containerID="bf79196b78560ffce260527941cf671183756e3cf6b0d1745159886679290871" exitCode=2
Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.322390 4763 generic.go:334] "Generic (PLEG): container finished" podID="5726fc7e-7c45-4714-a0d5-65b124f7d692" containerID="87afc456c686737b618f84d1e2b3a90a9a7089d716f466d836746470dfb5b56a" exitCode=0
Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.322399 4763 generic.go:334] "Generic (PLEG): container finished" podID="5726fc7e-7c45-4714-a0d5-65b124f7d692" containerID="070b741db7d8658d18c20fa993edf281929be0b3578aecbd744c33e26b835c89" exitCode=0
Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.322459 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5726fc7e-7c45-4714-a0d5-65b124f7d692","Type":"ContainerDied","Data":"09ac658728be057cb3fc01ec6c02d96e933089a6ea5ae8ef63465bf8f93e8c67"}
Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.322500 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5726fc7e-7c45-4714-a0d5-65b124f7d692","Type":"ContainerDied","Data":"bf79196b78560ffce260527941cf671183756e3cf6b0d1745159886679290871"}
Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.322517 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5726fc7e-7c45-4714-a0d5-65b124f7d692","Type":"ContainerDied","Data":"87afc456c686737b618f84d1e2b3a90a9a7089d716f466d836746470dfb5b56a"}
Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.322532 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5726fc7e-7c45-4714-a0d5-65b124f7d692","Type":"ContainerDied","Data":"070b741db7d8658d18c20fa993edf281929be0b3578aecbd744c33e26b835c89"}
Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.337483 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.337893 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a204f6d6-ff29-4689-bbb1-110757d005b6" containerName="nova-api-api" containerID="cri-o://96977af08a5295dd413de22cdb7acea9ed93d9267ad7a73ffc0280733aa91eb0" gracePeriod=30
Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.338084 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a204f6d6-ff29-4689-bbb1-110757d005b6" containerName="nova-api-log" containerID="cri-o://662dab9c6847115eab966d3cfd75d28eec15ab0c0e5622aa0383703f5badbd13" gracePeriod=30
Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.346349 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-mg9lr" event={"ID":"c7f9c67e-8dbf-4604-9407-ba5199add7e2","Type":"ContainerStarted","Data":"28aeb4201c45082678488c5377a01647a0e08013901f0158bd0fe9ca3e689a51"}
Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.351022 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cd5cbd7b9-mg9lr"
Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.360376 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-qjz64" podStartSLOduration=2.36035154 podStartE2EDuration="2.36035154s" podCreationTimestamp="2025-12-05 12:12:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:12:51.351249997 +0000 UTC m=+1455.843964730" watchObservedRunningTime="2025-12-05 12:12:51.36035154 +0000 UTC m=+1455.853066283"
Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.360376 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-qjz64" podStartSLOduration=2.36035154 podStartE2EDuration="2.36035154s" podCreationTimestamp="2025-12-05 12:12:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:12:51.351249997 +0000 UTC m=+1455.843964730" watchObservedRunningTime="2025-12-05 12:12:51.36035154 +0000 UTC m=+1455.853066283"
Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.398937 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cd5cbd7b9-mg9lr" podStartSLOduration=3.398920348 podStartE2EDuration="3.398920348s" podCreationTimestamp="2025-12-05 12:12:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:12:51.372844911 +0000 UTC m=+1455.865559654" watchObservedRunningTime="2025-12-05 12:12:51.398920348 +0000 UTC m=+1455.891635071"
Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.545903 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.614072 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57p69\" (UniqueName: \"kubernetes.io/projected/5726fc7e-7c45-4714-a0d5-65b124f7d692-kube-api-access-57p69\") pod \"5726fc7e-7c45-4714-a0d5-65b124f7d692\" (UID: \"5726fc7e-7c45-4714-a0d5-65b124f7d692\") "
Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.614188 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5726fc7e-7c45-4714-a0d5-65b124f7d692-config-data\") pod \"5726fc7e-7c45-4714-a0d5-65b124f7d692\" (UID: \"5726fc7e-7c45-4714-a0d5-65b124f7d692\") "
Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.614219 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5726fc7e-7c45-4714-a0d5-65b124f7d692-ceilometer-tls-certs\") pod \"5726fc7e-7c45-4714-a0d5-65b124f7d692\" (UID: \"5726fc7e-7c45-4714-a0d5-65b124f7d692\") "
Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.614276 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5726fc7e-7c45-4714-a0d5-65b124f7d692-run-httpd\") pod \"5726fc7e-7c45-4714-a0d5-65b124f7d692\" (UID: \"5726fc7e-7c45-4714-a0d5-65b124f7d692\") "
Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.614313 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5726fc7e-7c45-4714-a0d5-65b124f7d692-combined-ca-bundle\") pod \"5726fc7e-7c45-4714-a0d5-65b124f7d692\" (UID: \"5726fc7e-7c45-4714-a0d5-65b124f7d692\") "
Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.614342 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5726fc7e-7c45-4714-a0d5-65b124f7d692-sg-core-conf-yaml\") pod \"5726fc7e-7c45-4714-a0d5-65b124f7d692\" (UID: \"5726fc7e-7c45-4714-a0d5-65b124f7d692\") "
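Both pod_startup_latency_tracker records measure startup as watchObservedRunningTime minus podCreationTimestamp, and the zero-value firstStartedPulling/lastFinishedPulling timestamps ("0001-01-01 ...") indicate no image pull was needed. A small Go sketch reproducing the first record's 2.36035154s figure from the logged timestamps (the layout string is an assumption matching Go's time.Time.String() output; the trailing "m=+..." monotonic-clock reading must be dropped before parsing):

package main

import (
	"fmt"
	"time"
)

func main() {
	// ".999999999" makes fractional seconds optional, so one layout
	// parses both the whole-second and nanosecond timestamps above.
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

	created, _ := time.Parse(layout, "2025-12-05 12:12:49 +0000 UTC")
	observed, _ := time.Parse(layout, "2025-12-05 12:12:51.36035154 +0000 UTC")

	fmt.Println(observed.Sub(created)) // 2.36035154s, the logged podStartSLOduration
}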
\"scripts\" (UniqueName: \"kubernetes.io/secret/5726fc7e-7c45-4714-a0d5-65b124f7d692-scripts\") pod \"5726fc7e-7c45-4714-a0d5-65b124f7d692\" (UID: \"5726fc7e-7c45-4714-a0d5-65b124f7d692\") " Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.614483 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5726fc7e-7c45-4714-a0d5-65b124f7d692-log-httpd\") pod \"5726fc7e-7c45-4714-a0d5-65b124f7d692\" (UID: \"5726fc7e-7c45-4714-a0d5-65b124f7d692\") " Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.614961 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5726fc7e-7c45-4714-a0d5-65b124f7d692-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5726fc7e-7c45-4714-a0d5-65b124f7d692" (UID: "5726fc7e-7c45-4714-a0d5-65b124f7d692"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.619966 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5726fc7e-7c45-4714-a0d5-65b124f7d692-kube-api-access-57p69" (OuterVolumeSpecName: "kube-api-access-57p69") pod "5726fc7e-7c45-4714-a0d5-65b124f7d692" (UID: "5726fc7e-7c45-4714-a0d5-65b124f7d692"). InnerVolumeSpecName "kube-api-access-57p69". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.621462 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5726fc7e-7c45-4714-a0d5-65b124f7d692-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5726fc7e-7c45-4714-a0d5-65b124f7d692" (UID: "5726fc7e-7c45-4714-a0d5-65b124f7d692"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.635059 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5726fc7e-7c45-4714-a0d5-65b124f7d692-scripts" (OuterVolumeSpecName: "scripts") pod "5726fc7e-7c45-4714-a0d5-65b124f7d692" (UID: "5726fc7e-7c45-4714-a0d5-65b124f7d692"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.669467 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5726fc7e-7c45-4714-a0d5-65b124f7d692-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5726fc7e-7c45-4714-a0d5-65b124f7d692" (UID: "5726fc7e-7c45-4714-a0d5-65b124f7d692"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.711156 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5726fc7e-7c45-4714-a0d5-65b124f7d692-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "5726fc7e-7c45-4714-a0d5-65b124f7d692" (UID: "5726fc7e-7c45-4714-a0d5-65b124f7d692"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.716538 4763 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5726fc7e-7c45-4714-a0d5-65b124f7d692-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.716565 4763 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5726fc7e-7c45-4714-a0d5-65b124f7d692-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.716574 4763 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5726fc7e-7c45-4714-a0d5-65b124f7d692-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.716582 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5726fc7e-7c45-4714-a0d5-65b124f7d692-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.716590 4763 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5726fc7e-7c45-4714-a0d5-65b124f7d692-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.716598 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57p69\" (UniqueName: \"kubernetes.io/projected/5726fc7e-7c45-4714-a0d5-65b124f7d692-kube-api-access-57p69\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.764349 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5726fc7e-7c45-4714-a0d5-65b124f7d692-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5726fc7e-7c45-4714-a0d5-65b124f7d692" (UID: "5726fc7e-7c45-4714-a0d5-65b124f7d692"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.784066 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5726fc7e-7c45-4714-a0d5-65b124f7d692-config-data" (OuterVolumeSpecName: "config-data") pod "5726fc7e-7c45-4714-a0d5-65b124f7d692" (UID: "5726fc7e-7c45-4714-a0d5-65b124f7d692"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.818146 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5726fc7e-7c45-4714-a0d5-65b124f7d692-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:51 crc kubenswrapper[4763]: I1205 12:12:51.818183 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5726fc7e-7c45-4714-a0d5-65b124f7d692-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.357698 4763 generic.go:334] "Generic (PLEG): container finished" podID="a204f6d6-ff29-4689-bbb1-110757d005b6" containerID="662dab9c6847115eab966d3cfd75d28eec15ab0c0e5622aa0383703f5badbd13" exitCode=143 Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.357772 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a204f6d6-ff29-4689-bbb1-110757d005b6","Type":"ContainerDied","Data":"662dab9c6847115eab966d3cfd75d28eec15ab0c0e5622aa0383703f5badbd13"} Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.361623 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5726fc7e-7c45-4714-a0d5-65b124f7d692","Type":"ContainerDied","Data":"36a1ae5bbfa9f29d69947287bdcbcc63d2abe6af558d934a41b738ac91d731f5"} Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.361662 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.361686 4763 scope.go:117] "RemoveContainer" containerID="09ac658728be057cb3fc01ec6c02d96e933089a6ea5ae8ef63465bf8f93e8c67" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.435640 4763 scope.go:117] "RemoveContainer" containerID="bf79196b78560ffce260527941cf671183756e3cf6b0d1745159886679290871" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.447931 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.455300 4763 scope.go:117] "RemoveContainer" containerID="87afc456c686737b618f84d1e2b3a90a9a7089d716f466d836746470dfb5b56a" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.455382 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.480545 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 12:12:52 crc kubenswrapper[4763]: E1205 12:12:52.481244 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5726fc7e-7c45-4714-a0d5-65b124f7d692" containerName="proxy-httpd" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.481266 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5726fc7e-7c45-4714-a0d5-65b124f7d692" containerName="proxy-httpd" Dec 05 12:12:52 crc kubenswrapper[4763]: E1205 12:12:52.481289 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5726fc7e-7c45-4714-a0d5-65b124f7d692" containerName="ceilometer-central-agent" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.481295 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5726fc7e-7c45-4714-a0d5-65b124f7d692" containerName="ceilometer-central-agent" Dec 05 12:12:52 crc kubenswrapper[4763]: E1205 12:12:52.481311 4763 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5726fc7e-7c45-4714-a0d5-65b124f7d692" containerName="sg-core" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.481317 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5726fc7e-7c45-4714-a0d5-65b124f7d692" containerName="sg-core" Dec 05 12:12:52 crc kubenswrapper[4763]: E1205 12:12:52.481347 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5726fc7e-7c45-4714-a0d5-65b124f7d692" containerName="ceilometer-notification-agent" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.481353 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5726fc7e-7c45-4714-a0d5-65b124f7d692" containerName="ceilometer-notification-agent" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.481520 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="5726fc7e-7c45-4714-a0d5-65b124f7d692" containerName="ceilometer-notification-agent" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.481539 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="5726fc7e-7c45-4714-a0d5-65b124f7d692" containerName="proxy-httpd" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.481551 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="5726fc7e-7c45-4714-a0d5-65b124f7d692" containerName="ceilometer-central-agent" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.481563 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="5726fc7e-7c45-4714-a0d5-65b124f7d692" containerName="sg-core" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.483418 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.487459 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.487693 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.489545 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.490371 4763 scope.go:117] "RemoveContainer" containerID="070b741db7d8658d18c20fa993edf281929be0b3578aecbd744c33e26b835c89" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.493575 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.532052 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0d18f4e2-29a0-4968-ba78-3318c073e41e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0d18f4e2-29a0-4968-ba78-3318c073e41e\") " pod="openstack/ceilometer-0" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.532130 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm6ks\" (UniqueName: \"kubernetes.io/projected/0d18f4e2-29a0-4968-ba78-3318c073e41e-kube-api-access-pm6ks\") pod \"ceilometer-0\" (UID: \"0d18f4e2-29a0-4968-ba78-3318c073e41e\") " pod="openstack/ceilometer-0" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.532243 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d18f4e2-29a0-4968-ba78-3318c073e41e-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"0d18f4e2-29a0-4968-ba78-3318c073e41e\") " pod="openstack/ceilometer-0" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.532259 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d18f4e2-29a0-4968-ba78-3318c073e41e-config-data\") pod \"ceilometer-0\" (UID: \"0d18f4e2-29a0-4968-ba78-3318c073e41e\") " pod="openstack/ceilometer-0" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.532279 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d18f4e2-29a0-4968-ba78-3318c073e41e-log-httpd\") pod \"ceilometer-0\" (UID: \"0d18f4e2-29a0-4968-ba78-3318c073e41e\") " pod="openstack/ceilometer-0" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.532426 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d18f4e2-29a0-4968-ba78-3318c073e41e-scripts\") pod \"ceilometer-0\" (UID: \"0d18f4e2-29a0-4968-ba78-3318c073e41e\") " pod="openstack/ceilometer-0" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.532525 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d18f4e2-29a0-4968-ba78-3318c073e41e-run-httpd\") pod \"ceilometer-0\" (UID: \"0d18f4e2-29a0-4968-ba78-3318c073e41e\") " pod="openstack/ceilometer-0" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.532549 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d18f4e2-29a0-4968-ba78-3318c073e41e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0d18f4e2-29a0-4968-ba78-3318c073e41e\") " pod="openstack/ceilometer-0" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.633850 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d18f4e2-29a0-4968-ba78-3318c073e41e-scripts\") pod \"ceilometer-0\" (UID: \"0d18f4e2-29a0-4968-ba78-3318c073e41e\") " pod="openstack/ceilometer-0" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.633928 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d18f4e2-29a0-4968-ba78-3318c073e41e-run-httpd\") pod \"ceilometer-0\" (UID: \"0d18f4e2-29a0-4968-ba78-3318c073e41e\") " pod="openstack/ceilometer-0" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.633961 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d18f4e2-29a0-4968-ba78-3318c073e41e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0d18f4e2-29a0-4968-ba78-3318c073e41e\") " pod="openstack/ceilometer-0" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.634054 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0d18f4e2-29a0-4968-ba78-3318c073e41e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0d18f4e2-29a0-4968-ba78-3318c073e41e\") " pod="openstack/ceilometer-0" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.634108 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm6ks\" (UniqueName: 
\"kubernetes.io/projected/0d18f4e2-29a0-4968-ba78-3318c073e41e-kube-api-access-pm6ks\") pod \"ceilometer-0\" (UID: \"0d18f4e2-29a0-4968-ba78-3318c073e41e\") " pod="openstack/ceilometer-0" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.634174 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d18f4e2-29a0-4968-ba78-3318c073e41e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0d18f4e2-29a0-4968-ba78-3318c073e41e\") " pod="openstack/ceilometer-0" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.634198 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d18f4e2-29a0-4968-ba78-3318c073e41e-config-data\") pod \"ceilometer-0\" (UID: \"0d18f4e2-29a0-4968-ba78-3318c073e41e\") " pod="openstack/ceilometer-0" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.634226 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d18f4e2-29a0-4968-ba78-3318c073e41e-log-httpd\") pod \"ceilometer-0\" (UID: \"0d18f4e2-29a0-4968-ba78-3318c073e41e\") " pod="openstack/ceilometer-0" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.634606 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d18f4e2-29a0-4968-ba78-3318c073e41e-run-httpd\") pod \"ceilometer-0\" (UID: \"0d18f4e2-29a0-4968-ba78-3318c073e41e\") " pod="openstack/ceilometer-0" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.634905 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d18f4e2-29a0-4968-ba78-3318c073e41e-log-httpd\") pod \"ceilometer-0\" (UID: \"0d18f4e2-29a0-4968-ba78-3318c073e41e\") " pod="openstack/ceilometer-0" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.640442 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0d18f4e2-29a0-4968-ba78-3318c073e41e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0d18f4e2-29a0-4968-ba78-3318c073e41e\") " pod="openstack/ceilometer-0" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.640667 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d18f4e2-29a0-4968-ba78-3318c073e41e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0d18f4e2-29a0-4968-ba78-3318c073e41e\") " pod="openstack/ceilometer-0" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.642361 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d18f4e2-29a0-4968-ba78-3318c073e41e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0d18f4e2-29a0-4968-ba78-3318c073e41e\") " pod="openstack/ceilometer-0" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.642621 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d18f4e2-29a0-4968-ba78-3318c073e41e-scripts\") pod \"ceilometer-0\" (UID: \"0d18f4e2-29a0-4968-ba78-3318c073e41e\") " pod="openstack/ceilometer-0" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.650735 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm6ks\" (UniqueName: 
\"kubernetes.io/projected/0d18f4e2-29a0-4968-ba78-3318c073e41e-kube-api-access-pm6ks\") pod \"ceilometer-0\" (UID: \"0d18f4e2-29a0-4968-ba78-3318c073e41e\") " pod="openstack/ceilometer-0" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.658651 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d18f4e2-29a0-4968-ba78-3318c073e41e-config-data\") pod \"ceilometer-0\" (UID: \"0d18f4e2-29a0-4968-ba78-3318c073e41e\") " pod="openstack/ceilometer-0" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.802437 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 12:12:52 crc kubenswrapper[4763]: I1205 12:12:52.991208 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 12:12:53 crc kubenswrapper[4763]: W1205 12:12:53.301144 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d18f4e2_29a0_4968_ba78_3318c073e41e.slice/crio-932e79e09130813417286147d1760a8414f214939328d66bbb321f471564ca25 WatchSource:0}: Error finding container 932e79e09130813417286147d1760a8414f214939328d66bbb321f471564ca25: Status 404 returned error can't find the container with id 932e79e09130813417286147d1760a8414f214939328d66bbb321f471564ca25 Dec 05 12:12:53 crc kubenswrapper[4763]: I1205 12:12:53.304109 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 12:12:53 crc kubenswrapper[4763]: I1205 12:12:53.372683 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d18f4e2-29a0-4968-ba78-3318c073e41e","Type":"ContainerStarted","Data":"932e79e09130813417286147d1760a8414f214939328d66bbb321f471564ca25"} Dec 05 12:12:53 crc kubenswrapper[4763]: I1205 12:12:53.797206 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5726fc7e-7c45-4714-a0d5-65b124f7d692" path="/var/lib/kubelet/pods/5726fc7e-7c45-4714-a0d5-65b124f7d692/volumes" Dec 05 12:12:54 crc kubenswrapper[4763]: I1205 12:12:54.432017 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d18f4e2-29a0-4968-ba78-3318c073e41e","Type":"ContainerStarted","Data":"cbbee0def389bd93b54846f23ce24a3267c56579048f419a29387a13ffe0629d"} Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.427300 4763 util.go:48] "No ready sandbox for pod can be found. 
Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.427300 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.460283 4763 generic.go:334] "Generic (PLEG): container finished" podID="a204f6d6-ff29-4689-bbb1-110757d005b6" containerID="96977af08a5295dd413de22cdb7acea9ed93d9267ad7a73ffc0280733aa91eb0" exitCode=0
Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.460390 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a204f6d6-ff29-4689-bbb1-110757d005b6","Type":"ContainerDied","Data":"96977af08a5295dd413de22cdb7acea9ed93d9267ad7a73ffc0280733aa91eb0"}
Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.460417 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a204f6d6-ff29-4689-bbb1-110757d005b6","Type":"ContainerDied","Data":"8182f388d803ac0ff164e81262c83b8ca3b4211198c7673d49bbd4067783d81f"}
Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.460434 4763 scope.go:117] "RemoveContainer" containerID="96977af08a5295dd413de22cdb7acea9ed93d9267ad7a73ffc0280733aa91eb0"
Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.460575 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.469351 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d18f4e2-29a0-4968-ba78-3318c073e41e","Type":"ContainerStarted","Data":"f3b119e58c64cd8abc482d236d46a6f2cb3c583ad3ff170aae14e02602fadcf3"}
Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.509586 4763 scope.go:117] "RemoveContainer" containerID="662dab9c6847115eab966d3cfd75d28eec15ab0c0e5622aa0383703f5badbd13"
Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.533215 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a204f6d6-ff29-4689-bbb1-110757d005b6-combined-ca-bundle\") pod \"a204f6d6-ff29-4689-bbb1-110757d005b6\" (UID: \"a204f6d6-ff29-4689-bbb1-110757d005b6\") "
Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.533567 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np8jq\" (UniqueName: \"kubernetes.io/projected/a204f6d6-ff29-4689-bbb1-110757d005b6-kube-api-access-np8jq\") pod \"a204f6d6-ff29-4689-bbb1-110757d005b6\" (UID: \"a204f6d6-ff29-4689-bbb1-110757d005b6\") "
Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.533860 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a204f6d6-ff29-4689-bbb1-110757d005b6-config-data\") pod \"a204f6d6-ff29-4689-bbb1-110757d005b6\" (UID: \"a204f6d6-ff29-4689-bbb1-110757d005b6\") "
Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.534002 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a204f6d6-ff29-4689-bbb1-110757d005b6-logs\") pod \"a204f6d6-ff29-4689-bbb1-110757d005b6\" (UID: \"a204f6d6-ff29-4689-bbb1-110757d005b6\") "
Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.536283 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a204f6d6-ff29-4689-bbb1-110757d005b6-logs" (OuterVolumeSpecName: "logs") pod "a204f6d6-ff29-4689-bbb1-110757d005b6" (UID: "a204f6d6-ff29-4689-bbb1-110757d005b6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.545025 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a204f6d6-ff29-4689-bbb1-110757d005b6-kube-api-access-np8jq" (OuterVolumeSpecName: "kube-api-access-np8jq") pod "a204f6d6-ff29-4689-bbb1-110757d005b6" (UID: "a204f6d6-ff29-4689-bbb1-110757d005b6"). InnerVolumeSpecName "kube-api-access-np8jq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.571175 4763 scope.go:117] "RemoveContainer" containerID="96977af08a5295dd413de22cdb7acea9ed93d9267ad7a73ffc0280733aa91eb0"
Dec 05 12:12:55 crc kubenswrapper[4763]: E1205 12:12:55.579040 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96977af08a5295dd413de22cdb7acea9ed93d9267ad7a73ffc0280733aa91eb0\": container with ID starting with 96977af08a5295dd413de22cdb7acea9ed93d9267ad7a73ffc0280733aa91eb0 not found: ID does not exist" containerID="96977af08a5295dd413de22cdb7acea9ed93d9267ad7a73ffc0280733aa91eb0"
Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.579083 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96977af08a5295dd413de22cdb7acea9ed93d9267ad7a73ffc0280733aa91eb0"} err="failed to get container status \"96977af08a5295dd413de22cdb7acea9ed93d9267ad7a73ffc0280733aa91eb0\": rpc error: code = NotFound desc = could not find container \"96977af08a5295dd413de22cdb7acea9ed93d9267ad7a73ffc0280733aa91eb0\": container with ID starting with 96977af08a5295dd413de22cdb7acea9ed93d9267ad7a73ffc0280733aa91eb0 not found: ID does not exist"
Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.579088 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a204f6d6-ff29-4689-bbb1-110757d005b6-config-data" (OuterVolumeSpecName: "config-data") pod "a204f6d6-ff29-4689-bbb1-110757d005b6" (UID: "a204f6d6-ff29-4689-bbb1-110757d005b6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.579110 4763 scope.go:117] "RemoveContainer" containerID="662dab9c6847115eab966d3cfd75d28eec15ab0c0e5622aa0383703f5badbd13"
Dec 05 12:12:55 crc kubenswrapper[4763]: E1205 12:12:55.579574 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"662dab9c6847115eab966d3cfd75d28eec15ab0c0e5622aa0383703f5badbd13\": container with ID starting with 662dab9c6847115eab966d3cfd75d28eec15ab0c0e5622aa0383703f5badbd13 not found: ID does not exist" containerID="662dab9c6847115eab966d3cfd75d28eec15ab0c0e5622aa0383703f5badbd13"
Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.579635 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"662dab9c6847115eab966d3cfd75d28eec15ab0c0e5622aa0383703f5badbd13"} err="failed to get container status \"662dab9c6847115eab966d3cfd75d28eec15ab0c0e5622aa0383703f5badbd13\": rpc error: code = NotFound desc = could not find container \"662dab9c6847115eab966d3cfd75d28eec15ab0c0e5622aa0383703f5badbd13\": container with ID starting with 662dab9c6847115eab966d3cfd75d28eec15ab0c0e5622aa0383703f5badbd13 not found: ID does not exist"
Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.589983 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a204f6d6-ff29-4689-bbb1-110757d005b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a204f6d6-ff29-4689-bbb1-110757d005b6" (UID: "a204f6d6-ff29-4689-bbb1-110757d005b6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.637456 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-np8jq\" (UniqueName: \"kubernetes.io/projected/a204f6d6-ff29-4689-bbb1-110757d005b6-kube-api-access-np8jq\") on node \"crc\" DevicePath \"\""
Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.637503 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a204f6d6-ff29-4689-bbb1-110757d005b6-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.637518 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a204f6d6-ff29-4689-bbb1-110757d005b6-logs\") on node \"crc\" DevicePath \"\""
Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.637530 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a204f6d6-ff29-4689-bbb1-110757d005b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.802869 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.820564 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.844139 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Dec 05 12:12:55 crc kubenswrapper[4763]: E1205 12:12:55.844576 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a204f6d6-ff29-4689-bbb1-110757d005b6" containerName="nova-api-api"
podUID="a204f6d6-ff29-4689-bbb1-110757d005b6" containerName="nova-api-api" Dec 05 12:12:55 crc kubenswrapper[4763]: E1205 12:12:55.844605 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a204f6d6-ff29-4689-bbb1-110757d005b6" containerName="nova-api-log" Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.844611 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a204f6d6-ff29-4689-bbb1-110757d005b6" containerName="nova-api-log" Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.844839 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a204f6d6-ff29-4689-bbb1-110757d005b6" containerName="nova-api-log" Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.844857 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a204f6d6-ff29-4689-bbb1-110757d005b6" containerName="nova-api-api" Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.845928 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.850190 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.850421 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.850539 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.873832 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.941713 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1054e00-16a0-473b-b7a4-a3d00fc57c33-logs\") pod \"nova-api-0\" (UID: \"f1054e00-16a0-473b-b7a4-a3d00fc57c33\") " pod="openstack/nova-api-0" Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.941804 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swfj2\" (UniqueName: \"kubernetes.io/projected/f1054e00-16a0-473b-b7a4-a3d00fc57c33-kube-api-access-swfj2\") pod \"nova-api-0\" (UID: \"f1054e00-16a0-473b-b7a4-a3d00fc57c33\") " pod="openstack/nova-api-0" Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.941878 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1054e00-16a0-473b-b7a4-a3d00fc57c33-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f1054e00-16a0-473b-b7a4-a3d00fc57c33\") " pod="openstack/nova-api-0" Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.941910 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1054e00-16a0-473b-b7a4-a3d00fc57c33-config-data\") pod \"nova-api-0\" (UID: \"f1054e00-16a0-473b-b7a4-a3d00fc57c33\") " pod="openstack/nova-api-0" Dec 05 12:12:55 crc kubenswrapper[4763]: I1205 12:12:55.941943 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1054e00-16a0-473b-b7a4-a3d00fc57c33-public-tls-certs\") pod \"nova-api-0\" (UID: \"f1054e00-16a0-473b-b7a4-a3d00fc57c33\") " pod="openstack/nova-api-0" Dec 05 12:12:55 crc 
kubenswrapper[4763]: I1205 12:12:55.941999 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1054e00-16a0-473b-b7a4-a3d00fc57c33-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f1054e00-16a0-473b-b7a4-a3d00fc57c33\") " pod="openstack/nova-api-0" Dec 05 12:12:56 crc kubenswrapper[4763]: I1205 12:12:56.044138 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1054e00-16a0-473b-b7a4-a3d00fc57c33-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f1054e00-16a0-473b-b7a4-a3d00fc57c33\") " pod="openstack/nova-api-0" Dec 05 12:12:56 crc kubenswrapper[4763]: I1205 12:12:56.044508 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1054e00-16a0-473b-b7a4-a3d00fc57c33-logs\") pod \"nova-api-0\" (UID: \"f1054e00-16a0-473b-b7a4-a3d00fc57c33\") " pod="openstack/nova-api-0" Dec 05 12:12:56 crc kubenswrapper[4763]: I1205 12:12:56.044558 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swfj2\" (UniqueName: \"kubernetes.io/projected/f1054e00-16a0-473b-b7a4-a3d00fc57c33-kube-api-access-swfj2\") pod \"nova-api-0\" (UID: \"f1054e00-16a0-473b-b7a4-a3d00fc57c33\") " pod="openstack/nova-api-0" Dec 05 12:12:56 crc kubenswrapper[4763]: I1205 12:12:56.044622 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1054e00-16a0-473b-b7a4-a3d00fc57c33-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f1054e00-16a0-473b-b7a4-a3d00fc57c33\") " pod="openstack/nova-api-0" Dec 05 12:12:56 crc kubenswrapper[4763]: I1205 12:12:56.044652 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1054e00-16a0-473b-b7a4-a3d00fc57c33-config-data\") pod \"nova-api-0\" (UID: \"f1054e00-16a0-473b-b7a4-a3d00fc57c33\") " pod="openstack/nova-api-0" Dec 05 12:12:56 crc kubenswrapper[4763]: I1205 12:12:56.044707 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1054e00-16a0-473b-b7a4-a3d00fc57c33-public-tls-certs\") pod \"nova-api-0\" (UID: \"f1054e00-16a0-473b-b7a4-a3d00fc57c33\") " pod="openstack/nova-api-0" Dec 05 12:12:56 crc kubenswrapper[4763]: I1205 12:12:56.045158 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1054e00-16a0-473b-b7a4-a3d00fc57c33-logs\") pod \"nova-api-0\" (UID: \"f1054e00-16a0-473b-b7a4-a3d00fc57c33\") " pod="openstack/nova-api-0" Dec 05 12:12:56 crc kubenswrapper[4763]: I1205 12:12:56.048575 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1054e00-16a0-473b-b7a4-a3d00fc57c33-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f1054e00-16a0-473b-b7a4-a3d00fc57c33\") " pod="openstack/nova-api-0" Dec 05 12:12:56 crc kubenswrapper[4763]: I1205 12:12:56.050840 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1054e00-16a0-473b-b7a4-a3d00fc57c33-public-tls-certs\") pod \"nova-api-0\" (UID: \"f1054e00-16a0-473b-b7a4-a3d00fc57c33\") " pod="openstack/nova-api-0" Dec 05 12:12:56 crc kubenswrapper[4763]: I1205 12:12:56.052267 
4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1054e00-16a0-473b-b7a4-a3d00fc57c33-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f1054e00-16a0-473b-b7a4-a3d00fc57c33\") " pod="openstack/nova-api-0" Dec 05 12:12:56 crc kubenswrapper[4763]: I1205 12:12:56.052441 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1054e00-16a0-473b-b7a4-a3d00fc57c33-config-data\") pod \"nova-api-0\" (UID: \"f1054e00-16a0-473b-b7a4-a3d00fc57c33\") " pod="openstack/nova-api-0" Dec 05 12:12:56 crc kubenswrapper[4763]: I1205 12:12:56.062545 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swfj2\" (UniqueName: \"kubernetes.io/projected/f1054e00-16a0-473b-b7a4-a3d00fc57c33-kube-api-access-swfj2\") pod \"nova-api-0\" (UID: \"f1054e00-16a0-473b-b7a4-a3d00fc57c33\") " pod="openstack/nova-api-0" Dec 05 12:12:56 crc kubenswrapper[4763]: I1205 12:12:56.173271 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 12:12:56 crc kubenswrapper[4763]: I1205 12:12:56.487002 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d18f4e2-29a0-4968-ba78-3318c073e41e","Type":"ContainerStarted","Data":"214c8003bb3fb8a852bae4c412cd03e0ee53039b9e650095254d90b5a282a9c9"} Dec 05 12:12:56 crc kubenswrapper[4763]: I1205 12:12:56.718224 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 12:12:57 crc kubenswrapper[4763]: I1205 12:12:57.512484 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f1054e00-16a0-473b-b7a4-a3d00fc57c33","Type":"ContainerStarted","Data":"5fccb5a488d4a270c02f08311bfd118045ab96db6343f0b482b94df31e5ea206"} Dec 05 12:12:57 crc kubenswrapper[4763]: I1205 12:12:57.512856 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f1054e00-16a0-473b-b7a4-a3d00fc57c33","Type":"ContainerStarted","Data":"46e90415549d130406bcb217d7c0ad867f6998faf7e293185102ce0fee4c18bf"} Dec 05 12:12:57 crc kubenswrapper[4763]: I1205 12:12:57.512910 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f1054e00-16a0-473b-b7a4-a3d00fc57c33","Type":"ContainerStarted","Data":"756e91fa056785cbd4b03419778946859b34b87166ed4cc6234e2cb16ba4634b"} Dec 05 12:12:57 crc kubenswrapper[4763]: I1205 12:12:57.523256 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0d18f4e2-29a0-4968-ba78-3318c073e41e" containerName="ceilometer-central-agent" containerID="cri-o://cbbee0def389bd93b54846f23ce24a3267c56579048f419a29387a13ffe0629d" gracePeriod=30 Dec 05 12:12:57 crc kubenswrapper[4763]: I1205 12:12:57.523446 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d18f4e2-29a0-4968-ba78-3318c073e41e","Type":"ContainerStarted","Data":"3decd5ac3438c56203c3b178252d1774b1b62226e1ba208827a8d58d6f9e24cd"} Dec 05 12:12:57 crc kubenswrapper[4763]: I1205 12:12:57.523525 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 12:12:57 crc kubenswrapper[4763]: I1205 12:12:57.523541 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0d18f4e2-29a0-4968-ba78-3318c073e41e" containerName="proxy-httpd" 
containerID="cri-o://3decd5ac3438c56203c3b178252d1774b1b62226e1ba208827a8d58d6f9e24cd" gracePeriod=30 Dec 05 12:12:57 crc kubenswrapper[4763]: I1205 12:12:57.523576 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0d18f4e2-29a0-4968-ba78-3318c073e41e" containerName="ceilometer-notification-agent" containerID="cri-o://f3b119e58c64cd8abc482d236d46a6f2cb3c583ad3ff170aae14e02602fadcf3" gracePeriod=30 Dec 05 12:12:57 crc kubenswrapper[4763]: I1205 12:12:57.523626 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0d18f4e2-29a0-4968-ba78-3318c073e41e" containerName="sg-core" containerID="cri-o://214c8003bb3fb8a852bae4c412cd03e0ee53039b9e650095254d90b5a282a9c9" gracePeriod=30 Dec 05 12:12:57 crc kubenswrapper[4763]: I1205 12:12:57.526839 4763 generic.go:334] "Generic (PLEG): container finished" podID="31abbaf4-1fc8-4f73-b549-ec6e262a08d0" containerID="f9da4deeab88d6c2e18623166d865cac62c2ed2c875a0585837ca15e1a656541" exitCode=0 Dec 05 12:12:57 crc kubenswrapper[4763]: I1205 12:12:57.526879 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-qjz64" event={"ID":"31abbaf4-1fc8-4f73-b549-ec6e262a08d0","Type":"ContainerDied","Data":"f9da4deeab88d6c2e18623166d865cac62c2ed2c875a0585837ca15e1a656541"} Dec 05 12:12:57 crc kubenswrapper[4763]: I1205 12:12:57.555672 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.555648722 podStartE2EDuration="2.555648722s" podCreationTimestamp="2025-12-05 12:12:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:12:57.543199122 +0000 UTC m=+1462.035913865" watchObservedRunningTime="2025-12-05 12:12:57.555648722 +0000 UTC m=+1462.048363455" Dec 05 12:12:57 crc kubenswrapper[4763]: I1205 12:12:57.599054 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.125993101 podStartE2EDuration="5.599032655s" podCreationTimestamp="2025-12-05 12:12:52 +0000 UTC" firstStartedPulling="2025-12-05 12:12:53.303599803 +0000 UTC m=+1457.796314526" lastFinishedPulling="2025-12-05 12:12:56.776639357 +0000 UTC m=+1461.269354080" observedRunningTime="2025-12-05 12:12:57.59544043 +0000 UTC m=+1462.088155173" watchObservedRunningTime="2025-12-05 12:12:57.599032655 +0000 UTC m=+1462.091747378" Dec 05 12:12:57 crc kubenswrapper[4763]: I1205 12:12:57.795947 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a204f6d6-ff29-4689-bbb1-110757d005b6" path="/var/lib/kubelet/pods/a204f6d6-ff29-4689-bbb1-110757d005b6/volumes" Dec 05 12:12:58 crc kubenswrapper[4763]: I1205 12:12:58.441902 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cd5cbd7b9-mg9lr" Dec 05 12:12:58 crc kubenswrapper[4763]: I1205 12:12:58.522336 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-xwlmv"] Dec 05 12:12:58 crc kubenswrapper[4763]: I1205 12:12:58.522568 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bccf8f775-xwlmv" podUID="ce666e93-3a67-456a-ad40-fd30c7ed0f7f" containerName="dnsmasq-dns" containerID="cri-o://f465b792ba39e968bdb469de9f41655cbe572fec4021744abf1d55597bf6e932" gracePeriod=10 Dec 05 12:12:58 crc kubenswrapper[4763]: I1205 12:12:58.552935 4763 
generic.go:334] "Generic (PLEG): container finished" podID="0d18f4e2-29a0-4968-ba78-3318c073e41e" containerID="3decd5ac3438c56203c3b178252d1774b1b62226e1ba208827a8d58d6f9e24cd" exitCode=0 Dec 05 12:12:58 crc kubenswrapper[4763]: I1205 12:12:58.552985 4763 generic.go:334] "Generic (PLEG): container finished" podID="0d18f4e2-29a0-4968-ba78-3318c073e41e" containerID="214c8003bb3fb8a852bae4c412cd03e0ee53039b9e650095254d90b5a282a9c9" exitCode=2 Dec 05 12:12:58 crc kubenswrapper[4763]: I1205 12:12:58.552998 4763 generic.go:334] "Generic (PLEG): container finished" podID="0d18f4e2-29a0-4968-ba78-3318c073e41e" containerID="f3b119e58c64cd8abc482d236d46a6f2cb3c583ad3ff170aae14e02602fadcf3" exitCode=0 Dec 05 12:12:58 crc kubenswrapper[4763]: I1205 12:12:58.553219 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d18f4e2-29a0-4968-ba78-3318c073e41e","Type":"ContainerDied","Data":"3decd5ac3438c56203c3b178252d1774b1b62226e1ba208827a8d58d6f9e24cd"} Dec 05 12:12:58 crc kubenswrapper[4763]: I1205 12:12:58.553253 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d18f4e2-29a0-4968-ba78-3318c073e41e","Type":"ContainerDied","Data":"214c8003bb3fb8a852bae4c412cd03e0ee53039b9e650095254d90b5a282a9c9"} Dec 05 12:12:58 crc kubenswrapper[4763]: I1205 12:12:58.553266 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d18f4e2-29a0-4968-ba78-3318c073e41e","Type":"ContainerDied","Data":"f3b119e58c64cd8abc482d236d46a6f2cb3c583ad3ff170aae14e02602fadcf3"} Dec 05 12:12:58 crc kubenswrapper[4763]: I1205 12:12:58.965433 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-qjz64" Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.025451 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31abbaf4-1fc8-4f73-b549-ec6e262a08d0-scripts\") pod \"31abbaf4-1fc8-4f73-b549-ec6e262a08d0\" (UID: \"31abbaf4-1fc8-4f73-b549-ec6e262a08d0\") " Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.025516 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31abbaf4-1fc8-4f73-b549-ec6e262a08d0-config-data\") pod \"31abbaf4-1fc8-4f73-b549-ec6e262a08d0\" (UID: \"31abbaf4-1fc8-4f73-b549-ec6e262a08d0\") " Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.025928 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31abbaf4-1fc8-4f73-b549-ec6e262a08d0-combined-ca-bundle\") pod \"31abbaf4-1fc8-4f73-b549-ec6e262a08d0\" (UID: \"31abbaf4-1fc8-4f73-b549-ec6e262a08d0\") " Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.025966 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhtwq\" (UniqueName: \"kubernetes.io/projected/31abbaf4-1fc8-4f73-b549-ec6e262a08d0-kube-api-access-hhtwq\") pod \"31abbaf4-1fc8-4f73-b549-ec6e262a08d0\" (UID: \"31abbaf4-1fc8-4f73-b549-ec6e262a08d0\") " Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.032251 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31abbaf4-1fc8-4f73-b549-ec6e262a08d0-kube-api-access-hhtwq" (OuterVolumeSpecName: "kube-api-access-hhtwq") pod "31abbaf4-1fc8-4f73-b549-ec6e262a08d0" (UID: "31abbaf4-1fc8-4f73-b549-ec6e262a08d0"). 
InnerVolumeSpecName "kube-api-access-hhtwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.036092 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31abbaf4-1fc8-4f73-b549-ec6e262a08d0-scripts" (OuterVolumeSpecName: "scripts") pod "31abbaf4-1fc8-4f73-b549-ec6e262a08d0" (UID: "31abbaf4-1fc8-4f73-b549-ec6e262a08d0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.056918 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31abbaf4-1fc8-4f73-b549-ec6e262a08d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31abbaf4-1fc8-4f73-b549-ec6e262a08d0" (UID: "31abbaf4-1fc8-4f73-b549-ec6e262a08d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.076924 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31abbaf4-1fc8-4f73-b549-ec6e262a08d0-config-data" (OuterVolumeSpecName: "config-data") pod "31abbaf4-1fc8-4f73-b549-ec6e262a08d0" (UID: "31abbaf4-1fc8-4f73-b549-ec6e262a08d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.128557 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31abbaf4-1fc8-4f73-b549-ec6e262a08d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.128859 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhtwq\" (UniqueName: \"kubernetes.io/projected/31abbaf4-1fc8-4f73-b549-ec6e262a08d0-kube-api-access-hhtwq\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.128979 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31abbaf4-1fc8-4f73-b549-ec6e262a08d0-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.129064 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31abbaf4-1fc8-4f73-b549-ec6e262a08d0-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.179369 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-xwlmv" Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.229933 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce666e93-3a67-456a-ad40-fd30c7ed0f7f-config\") pod \"ce666e93-3a67-456a-ad40-fd30c7ed0f7f\" (UID: \"ce666e93-3a67-456a-ad40-fd30c7ed0f7f\") " Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.230864 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce666e93-3a67-456a-ad40-fd30c7ed0f7f-ovsdbserver-sb\") pod \"ce666e93-3a67-456a-ad40-fd30c7ed0f7f\" (UID: \"ce666e93-3a67-456a-ad40-fd30c7ed0f7f\") " Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.231080 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2vwj\" (UniqueName: \"kubernetes.io/projected/ce666e93-3a67-456a-ad40-fd30c7ed0f7f-kube-api-access-k2vwj\") pod \"ce666e93-3a67-456a-ad40-fd30c7ed0f7f\" (UID: \"ce666e93-3a67-456a-ad40-fd30c7ed0f7f\") " Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.231173 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce666e93-3a67-456a-ad40-fd30c7ed0f7f-dns-swift-storage-0\") pod \"ce666e93-3a67-456a-ad40-fd30c7ed0f7f\" (UID: \"ce666e93-3a67-456a-ad40-fd30c7ed0f7f\") " Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.231451 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce666e93-3a67-456a-ad40-fd30c7ed0f7f-ovsdbserver-nb\") pod \"ce666e93-3a67-456a-ad40-fd30c7ed0f7f\" (UID: \"ce666e93-3a67-456a-ad40-fd30c7ed0f7f\") " Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.231548 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce666e93-3a67-456a-ad40-fd30c7ed0f7f-dns-svc\") pod \"ce666e93-3a67-456a-ad40-fd30c7ed0f7f\" (UID: \"ce666e93-3a67-456a-ad40-fd30c7ed0f7f\") " Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.237406 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce666e93-3a67-456a-ad40-fd30c7ed0f7f-kube-api-access-k2vwj" (OuterVolumeSpecName: "kube-api-access-k2vwj") pod "ce666e93-3a67-456a-ad40-fd30c7ed0f7f" (UID: "ce666e93-3a67-456a-ad40-fd30c7ed0f7f"). InnerVolumeSpecName "kube-api-access-k2vwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.279647 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce666e93-3a67-456a-ad40-fd30c7ed0f7f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ce666e93-3a67-456a-ad40-fd30c7ed0f7f" (UID: "ce666e93-3a67-456a-ad40-fd30c7ed0f7f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.284474 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce666e93-3a67-456a-ad40-fd30c7ed0f7f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ce666e93-3a67-456a-ad40-fd30c7ed0f7f" (UID: "ce666e93-3a67-456a-ad40-fd30c7ed0f7f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.285100 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce666e93-3a67-456a-ad40-fd30c7ed0f7f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ce666e93-3a67-456a-ad40-fd30c7ed0f7f" (UID: "ce666e93-3a67-456a-ad40-fd30c7ed0f7f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.286502 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce666e93-3a67-456a-ad40-fd30c7ed0f7f-config" (OuterVolumeSpecName: "config") pod "ce666e93-3a67-456a-ad40-fd30c7ed0f7f" (UID: "ce666e93-3a67-456a-ad40-fd30c7ed0f7f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.290981 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce666e93-3a67-456a-ad40-fd30c7ed0f7f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ce666e93-3a67-456a-ad40-fd30c7ed0f7f" (UID: "ce666e93-3a67-456a-ad40-fd30c7ed0f7f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.334519 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2vwj\" (UniqueName: \"kubernetes.io/projected/ce666e93-3a67-456a-ad40-fd30c7ed0f7f-kube-api-access-k2vwj\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.334558 4763 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce666e93-3a67-456a-ad40-fd30c7ed0f7f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.334567 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce666e93-3a67-456a-ad40-fd30c7ed0f7f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.334575 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce666e93-3a67-456a-ad40-fd30c7ed0f7f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.334585 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce666e93-3a67-456a-ad40-fd30c7ed0f7f-config\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.334593 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce666e93-3a67-456a-ad40-fd30c7ed0f7f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.565908 4763 generic.go:334] "Generic (PLEG): container finished" podID="ce666e93-3a67-456a-ad40-fd30c7ed0f7f" containerID="f465b792ba39e968bdb469de9f41655cbe572fec4021744abf1d55597bf6e932" exitCode=0 Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.565998 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-xwlmv" Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.565998 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-xwlmv" event={"ID":"ce666e93-3a67-456a-ad40-fd30c7ed0f7f","Type":"ContainerDied","Data":"f465b792ba39e968bdb469de9f41655cbe572fec4021744abf1d55597bf6e932"} Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.566158 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-xwlmv" event={"ID":"ce666e93-3a67-456a-ad40-fd30c7ed0f7f","Type":"ContainerDied","Data":"5806ee48915675b20562c8f2ff880944acfb3f636608a253dcddb140083ef134"} Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.566189 4763 scope.go:117] "RemoveContainer" containerID="f465b792ba39e968bdb469de9f41655cbe572fec4021744abf1d55597bf6e932" Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.571082 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-qjz64" event={"ID":"31abbaf4-1fc8-4f73-b549-ec6e262a08d0","Type":"ContainerDied","Data":"6854bae82c306d4b4007ad06d3a2af37d0290341c94cb04d78a60e80a797af5b"} Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.571119 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6854bae82c306d4b4007ad06d3a2af37d0290341c94cb04d78a60e80a797af5b" Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.571122 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-qjz64" Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.598327 4763 scope.go:117] "RemoveContainer" containerID="c38fc411805a6cbf12a368fe8de438591becf9a4f78ff70b0f6443f3673d0de9" Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.610124 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-xwlmv"] Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.623369 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-xwlmv"] Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.626216 4763 scope.go:117] "RemoveContainer" containerID="f465b792ba39e968bdb469de9f41655cbe572fec4021744abf1d55597bf6e932" Dec 05 12:12:59 crc kubenswrapper[4763]: E1205 12:12:59.626660 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f465b792ba39e968bdb469de9f41655cbe572fec4021744abf1d55597bf6e932\": container with ID starting with f465b792ba39e968bdb469de9f41655cbe572fec4021744abf1d55597bf6e932 not found: ID does not exist" containerID="f465b792ba39e968bdb469de9f41655cbe572fec4021744abf1d55597bf6e932" Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.626850 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f465b792ba39e968bdb469de9f41655cbe572fec4021744abf1d55597bf6e932"} err="failed to get container status \"f465b792ba39e968bdb469de9f41655cbe572fec4021744abf1d55597bf6e932\": rpc error: code = NotFound desc = could not find container \"f465b792ba39e968bdb469de9f41655cbe572fec4021744abf1d55597bf6e932\": container with ID starting with f465b792ba39e968bdb469de9f41655cbe572fec4021744abf1d55597bf6e932 not found: ID does not exist" Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.626951 4763 scope.go:117] "RemoveContainer" containerID="c38fc411805a6cbf12a368fe8de438591becf9a4f78ff70b0f6443f3673d0de9" Dec 05 12:12:59 crc kubenswrapper[4763]: 
E1205 12:12:59.627307 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c38fc411805a6cbf12a368fe8de438591becf9a4f78ff70b0f6443f3673d0de9\": container with ID starting with c38fc411805a6cbf12a368fe8de438591becf9a4f78ff70b0f6443f3673d0de9 not found: ID does not exist" containerID="c38fc411805a6cbf12a368fe8de438591becf9a4f78ff70b0f6443f3673d0de9" Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.627351 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c38fc411805a6cbf12a368fe8de438591becf9a4f78ff70b0f6443f3673d0de9"} err="failed to get container status \"c38fc411805a6cbf12a368fe8de438591becf9a4f78ff70b0f6443f3673d0de9\": rpc error: code = NotFound desc = could not find container \"c38fc411805a6cbf12a368fe8de438591becf9a4f78ff70b0f6443f3673d0de9\": container with ID starting with c38fc411805a6cbf12a368fe8de438591becf9a4f78ff70b0f6443f3673d0de9 not found: ID does not exist" Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.718995 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.719478 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="a08ac027-eaef-4ad0-8af6-45110d1c49e2" containerName="nova-scheduler-scheduler" containerID="cri-o://a6e87089becd202adebcc2419b7b154dc094b7c9d8cea46d6522ca5bc12f2f6d" gracePeriod=30 Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.730576 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.730798 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f1054e00-16a0-473b-b7a4-a3d00fc57c33" containerName="nova-api-log" containerID="cri-o://46e90415549d130406bcb217d7c0ad867f6998faf7e293185102ce0fee4c18bf" gracePeriod=30 Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.730922 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f1054e00-16a0-473b-b7a4-a3d00fc57c33" containerName="nova-api-api" containerID="cri-o://5fccb5a488d4a270c02f08311bfd118045ab96db6343f0b482b94df31e5ea206" gracePeriod=30 Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.756749 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.757190 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e519d490-70c9-43d1-87a3-8b559eb60f16" containerName="nova-metadata-metadata" containerID="cri-o://f544fce27c107d4c023d6841bea473b4de5912c8841f9b0822b2134fb0c4271d" gracePeriod=30 Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.758660 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e519d490-70c9-43d1-87a3-8b559eb60f16" containerName="nova-metadata-log" containerID="cri-o://155194e26749779913f9654665839d02c729c47910f33980acacb654c4b898f8" gracePeriod=30 Dec 05 12:12:59 crc kubenswrapper[4763]: I1205 12:12:59.806577 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce666e93-3a67-456a-ad40-fd30c7ed0f7f" path="/var/lib/kubelet/pods/ce666e93-3a67-456a-ad40-fd30c7ed0f7f/volumes" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.305159 4763 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.351754 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1054e00-16a0-473b-b7a4-a3d00fc57c33-logs\") pod \"f1054e00-16a0-473b-b7a4-a3d00fc57c33\" (UID: \"f1054e00-16a0-473b-b7a4-a3d00fc57c33\") " Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.351861 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1054e00-16a0-473b-b7a4-a3d00fc57c33-combined-ca-bundle\") pod \"f1054e00-16a0-473b-b7a4-a3d00fc57c33\" (UID: \"f1054e00-16a0-473b-b7a4-a3d00fc57c33\") " Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.351967 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1054e00-16a0-473b-b7a4-a3d00fc57c33-public-tls-certs\") pod \"f1054e00-16a0-473b-b7a4-a3d00fc57c33\" (UID: \"f1054e00-16a0-473b-b7a4-a3d00fc57c33\") " Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.352089 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swfj2\" (UniqueName: \"kubernetes.io/projected/f1054e00-16a0-473b-b7a4-a3d00fc57c33-kube-api-access-swfj2\") pod \"f1054e00-16a0-473b-b7a4-a3d00fc57c33\" (UID: \"f1054e00-16a0-473b-b7a4-a3d00fc57c33\") " Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.352118 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1054e00-16a0-473b-b7a4-a3d00fc57c33-internal-tls-certs\") pod \"f1054e00-16a0-473b-b7a4-a3d00fc57c33\" (UID: \"f1054e00-16a0-473b-b7a4-a3d00fc57c33\") " Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.352144 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1054e00-16a0-473b-b7a4-a3d00fc57c33-config-data\") pod \"f1054e00-16a0-473b-b7a4-a3d00fc57c33\" (UID: \"f1054e00-16a0-473b-b7a4-a3d00fc57c33\") " Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.353009 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1054e00-16a0-473b-b7a4-a3d00fc57c33-logs" (OuterVolumeSpecName: "logs") pod "f1054e00-16a0-473b-b7a4-a3d00fc57c33" (UID: "f1054e00-16a0-473b-b7a4-a3d00fc57c33"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.358860 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1054e00-16a0-473b-b7a4-a3d00fc57c33-kube-api-access-swfj2" (OuterVolumeSpecName: "kube-api-access-swfj2") pod "f1054e00-16a0-473b-b7a4-a3d00fc57c33" (UID: "f1054e00-16a0-473b-b7a4-a3d00fc57c33"). InnerVolumeSpecName "kube-api-access-swfj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.384347 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1054e00-16a0-473b-b7a4-a3d00fc57c33-config-data" (OuterVolumeSpecName: "config-data") pod "f1054e00-16a0-473b-b7a4-a3d00fc57c33" (UID: "f1054e00-16a0-473b-b7a4-a3d00fc57c33"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.400892 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1054e00-16a0-473b-b7a4-a3d00fc57c33-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f1054e00-16a0-473b-b7a4-a3d00fc57c33" (UID: "f1054e00-16a0-473b-b7a4-a3d00fc57c33"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.432568 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1054e00-16a0-473b-b7a4-a3d00fc57c33-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f1054e00-16a0-473b-b7a4-a3d00fc57c33" (UID: "f1054e00-16a0-473b-b7a4-a3d00fc57c33"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.432906 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1054e00-16a0-473b-b7a4-a3d00fc57c33-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f1054e00-16a0-473b-b7a4-a3d00fc57c33" (UID: "f1054e00-16a0-473b-b7a4-a3d00fc57c33"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.455057 4763 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1054e00-16a0-473b-b7a4-a3d00fc57c33-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.455122 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swfj2\" (UniqueName: \"kubernetes.io/projected/f1054e00-16a0-473b-b7a4-a3d00fc57c33-kube-api-access-swfj2\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.455134 4763 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1054e00-16a0-473b-b7a4-a3d00fc57c33-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.455143 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1054e00-16a0-473b-b7a4-a3d00fc57c33-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.455153 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1054e00-16a0-473b-b7a4-a3d00fc57c33-logs\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.455177 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1054e00-16a0-473b-b7a4-a3d00fc57c33-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.586176 4763 generic.go:334] "Generic (PLEG): container finished" podID="e519d490-70c9-43d1-87a3-8b559eb60f16" containerID="155194e26749779913f9654665839d02c729c47910f33980acacb654c4b898f8" exitCode=143 Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.586247 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e519d490-70c9-43d1-87a3-8b559eb60f16","Type":"ContainerDied","Data":"155194e26749779913f9654665839d02c729c47910f33980acacb654c4b898f8"} 
Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.588502 4763 generic.go:334] "Generic (PLEG): container finished" podID="f1054e00-16a0-473b-b7a4-a3d00fc57c33" containerID="5fccb5a488d4a270c02f08311bfd118045ab96db6343f0b482b94df31e5ea206" exitCode=0 Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.588527 4763 generic.go:334] "Generic (PLEG): container finished" podID="f1054e00-16a0-473b-b7a4-a3d00fc57c33" containerID="46e90415549d130406bcb217d7c0ad867f6998faf7e293185102ce0fee4c18bf" exitCode=143 Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.588566 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f1054e00-16a0-473b-b7a4-a3d00fc57c33","Type":"ContainerDied","Data":"5fccb5a488d4a270c02f08311bfd118045ab96db6343f0b482b94df31e5ea206"} Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.588587 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f1054e00-16a0-473b-b7a4-a3d00fc57c33","Type":"ContainerDied","Data":"46e90415549d130406bcb217d7c0ad867f6998faf7e293185102ce0fee4c18bf"} Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.588601 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f1054e00-16a0-473b-b7a4-a3d00fc57c33","Type":"ContainerDied","Data":"756e91fa056785cbd4b03419778946859b34b87166ed4cc6234e2cb16ba4634b"} Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.588628 4763 scope.go:117] "RemoveContainer" containerID="5fccb5a488d4a270c02f08311bfd118045ab96db6343f0b482b94df31e5ea206" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.588784 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.629429 4763 scope.go:117] "RemoveContainer" containerID="46e90415549d130406bcb217d7c0ad867f6998faf7e293185102ce0fee4c18bf" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.663628 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.686813 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.709828 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 12:13:00 crc kubenswrapper[4763]: E1205 12:13:00.710255 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce666e93-3a67-456a-ad40-fd30c7ed0f7f" containerName="init" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.710273 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce666e93-3a67-456a-ad40-fd30c7ed0f7f" containerName="init" Dec 05 12:13:00 crc kubenswrapper[4763]: E1205 12:13:00.710282 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31abbaf4-1fc8-4f73-b549-ec6e262a08d0" containerName="nova-manage" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.710289 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="31abbaf4-1fc8-4f73-b549-ec6e262a08d0" containerName="nova-manage" Dec 05 12:13:00 crc kubenswrapper[4763]: E1205 12:13:00.710302 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce666e93-3a67-456a-ad40-fd30c7ed0f7f" containerName="dnsmasq-dns" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.710308 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce666e93-3a67-456a-ad40-fd30c7ed0f7f" containerName="dnsmasq-dns" Dec 05 12:13:00 crc kubenswrapper[4763]: 
E1205 12:13:00.710333 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1054e00-16a0-473b-b7a4-a3d00fc57c33" containerName="nova-api-log" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.710339 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1054e00-16a0-473b-b7a4-a3d00fc57c33" containerName="nova-api-log" Dec 05 12:13:00 crc kubenswrapper[4763]: E1205 12:13:00.710351 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1054e00-16a0-473b-b7a4-a3d00fc57c33" containerName="nova-api-api" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.710357 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1054e00-16a0-473b-b7a4-a3d00fc57c33" containerName="nova-api-api" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.710561 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="31abbaf4-1fc8-4f73-b549-ec6e262a08d0" containerName="nova-manage" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.710571 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1054e00-16a0-473b-b7a4-a3d00fc57c33" containerName="nova-api-log" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.710581 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1054e00-16a0-473b-b7a4-a3d00fc57c33" containerName="nova-api-api" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.710599 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce666e93-3a67-456a-ad40-fd30c7ed0f7f" containerName="dnsmasq-dns" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.711620 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.715934 4763 scope.go:117] "RemoveContainer" containerID="5fccb5a488d4a270c02f08311bfd118045ab96db6343f0b482b94df31e5ea206" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.716130 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 05 12:13:00 crc kubenswrapper[4763]: E1205 12:13:00.716490 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fccb5a488d4a270c02f08311bfd118045ab96db6343f0b482b94df31e5ea206\": container with ID starting with 5fccb5a488d4a270c02f08311bfd118045ab96db6343f0b482b94df31e5ea206 not found: ID does not exist" containerID="5fccb5a488d4a270c02f08311bfd118045ab96db6343f0b482b94df31e5ea206" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.716532 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fccb5a488d4a270c02f08311bfd118045ab96db6343f0b482b94df31e5ea206"} err="failed to get container status \"5fccb5a488d4a270c02f08311bfd118045ab96db6343f0b482b94df31e5ea206\": rpc error: code = NotFound desc = could not find container \"5fccb5a488d4a270c02f08311bfd118045ab96db6343f0b482b94df31e5ea206\": container with ID starting with 5fccb5a488d4a270c02f08311bfd118045ab96db6343f0b482b94df31e5ea206 not found: ID does not exist" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.716554 4763 scope.go:117] "RemoveContainer" containerID="46e90415549d130406bcb217d7c0ad867f6998faf7e293185102ce0fee4c18bf" Dec 05 12:13:00 crc kubenswrapper[4763]: E1205 12:13:00.717314 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46e90415549d130406bcb217d7c0ad867f6998faf7e293185102ce0fee4c18bf\": 
container with ID starting with 46e90415549d130406bcb217d7c0ad867f6998faf7e293185102ce0fee4c18bf not found: ID does not exist" containerID="46e90415549d130406bcb217d7c0ad867f6998faf7e293185102ce0fee4c18bf" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.717336 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46e90415549d130406bcb217d7c0ad867f6998faf7e293185102ce0fee4c18bf"} err="failed to get container status \"46e90415549d130406bcb217d7c0ad867f6998faf7e293185102ce0fee4c18bf\": rpc error: code = NotFound desc = could not find container \"46e90415549d130406bcb217d7c0ad867f6998faf7e293185102ce0fee4c18bf\": container with ID starting with 46e90415549d130406bcb217d7c0ad867f6998faf7e293185102ce0fee4c18bf not found: ID does not exist" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.717350 4763 scope.go:117] "RemoveContainer" containerID="5fccb5a488d4a270c02f08311bfd118045ab96db6343f0b482b94df31e5ea206" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.717519 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fccb5a488d4a270c02f08311bfd118045ab96db6343f0b482b94df31e5ea206"} err="failed to get container status \"5fccb5a488d4a270c02f08311bfd118045ab96db6343f0b482b94df31e5ea206\": rpc error: code = NotFound desc = could not find container \"5fccb5a488d4a270c02f08311bfd118045ab96db6343f0b482b94df31e5ea206\": container with ID starting with 5fccb5a488d4a270c02f08311bfd118045ab96db6343f0b482b94df31e5ea206 not found: ID does not exist" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.717538 4763 scope.go:117] "RemoveContainer" containerID="46e90415549d130406bcb217d7c0ad867f6998faf7e293185102ce0fee4c18bf" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.717770 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46e90415549d130406bcb217d7c0ad867f6998faf7e293185102ce0fee4c18bf"} err="failed to get container status \"46e90415549d130406bcb217d7c0ad867f6998faf7e293185102ce0fee4c18bf\": rpc error: code = NotFound desc = could not find container \"46e90415549d130406bcb217d7c0ad867f6998faf7e293185102ce0fee4c18bf\": container with ID starting with 46e90415549d130406bcb217d7c0ad867f6998faf7e293185102ce0fee4c18bf not found: ID does not exist" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.724355 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.724967 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.736739 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.766906 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ad0e748-bb1a-4b4f-bc70-f059e4fc3614-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2ad0e748-bb1a-4b4f-bc70-f059e4fc3614\") " pod="openstack/nova-api-0" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.766963 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ad0e748-bb1a-4b4f-bc70-f059e4fc3614-logs\") pod \"nova-api-0\" (UID: \"2ad0e748-bb1a-4b4f-bc70-f059e4fc3614\") " pod="openstack/nova-api-0" Dec 05 
12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.767027 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ad0e748-bb1a-4b4f-bc70-f059e4fc3614-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2ad0e748-bb1a-4b4f-bc70-f059e4fc3614\") " pod="openstack/nova-api-0" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.767102 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld9qq\" (UniqueName: \"kubernetes.io/projected/2ad0e748-bb1a-4b4f-bc70-f059e4fc3614-kube-api-access-ld9qq\") pod \"nova-api-0\" (UID: \"2ad0e748-bb1a-4b4f-bc70-f059e4fc3614\") " pod="openstack/nova-api-0" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.767121 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ad0e748-bb1a-4b4f-bc70-f059e4fc3614-public-tls-certs\") pod \"nova-api-0\" (UID: \"2ad0e748-bb1a-4b4f-bc70-f059e4fc3614\") " pod="openstack/nova-api-0" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.767144 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ad0e748-bb1a-4b4f-bc70-f059e4fc3614-config-data\") pod \"nova-api-0\" (UID: \"2ad0e748-bb1a-4b4f-bc70-f059e4fc3614\") " pod="openstack/nova-api-0" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.868725 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ad0e748-bb1a-4b4f-bc70-f059e4fc3614-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2ad0e748-bb1a-4b4f-bc70-f059e4fc3614\") " pod="openstack/nova-api-0" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.868927 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld9qq\" (UniqueName: \"kubernetes.io/projected/2ad0e748-bb1a-4b4f-bc70-f059e4fc3614-kube-api-access-ld9qq\") pod \"nova-api-0\" (UID: \"2ad0e748-bb1a-4b4f-bc70-f059e4fc3614\") " pod="openstack/nova-api-0" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.868961 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ad0e748-bb1a-4b4f-bc70-f059e4fc3614-public-tls-certs\") pod \"nova-api-0\" (UID: \"2ad0e748-bb1a-4b4f-bc70-f059e4fc3614\") " pod="openstack/nova-api-0" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.869000 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ad0e748-bb1a-4b4f-bc70-f059e4fc3614-config-data\") pod \"nova-api-0\" (UID: \"2ad0e748-bb1a-4b4f-bc70-f059e4fc3614\") " pod="openstack/nova-api-0" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.869094 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ad0e748-bb1a-4b4f-bc70-f059e4fc3614-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2ad0e748-bb1a-4b4f-bc70-f059e4fc3614\") " pod="openstack/nova-api-0" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.869141 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ad0e748-bb1a-4b4f-bc70-f059e4fc3614-logs\") pod \"nova-api-0\" (UID: 
\"2ad0e748-bb1a-4b4f-bc70-f059e4fc3614\") " pod="openstack/nova-api-0" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.870188 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ad0e748-bb1a-4b4f-bc70-f059e4fc3614-logs\") pod \"nova-api-0\" (UID: \"2ad0e748-bb1a-4b4f-bc70-f059e4fc3614\") " pod="openstack/nova-api-0" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.874202 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ad0e748-bb1a-4b4f-bc70-f059e4fc3614-public-tls-certs\") pod \"nova-api-0\" (UID: \"2ad0e748-bb1a-4b4f-bc70-f059e4fc3614\") " pod="openstack/nova-api-0" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.874700 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ad0e748-bb1a-4b4f-bc70-f059e4fc3614-config-data\") pod \"nova-api-0\" (UID: \"2ad0e748-bb1a-4b4f-bc70-f059e4fc3614\") " pod="openstack/nova-api-0" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.874750 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ad0e748-bb1a-4b4f-bc70-f059e4fc3614-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2ad0e748-bb1a-4b4f-bc70-f059e4fc3614\") " pod="openstack/nova-api-0" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.876131 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ad0e748-bb1a-4b4f-bc70-f059e4fc3614-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2ad0e748-bb1a-4b4f-bc70-f059e4fc3614\") " pod="openstack/nova-api-0" Dec 05 12:13:00 crc kubenswrapper[4763]: I1205 12:13:00.885740 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld9qq\" (UniqueName: \"kubernetes.io/projected/2ad0e748-bb1a-4b4f-bc70-f059e4fc3614-kube-api-access-ld9qq\") pod \"nova-api-0\" (UID: \"2ad0e748-bb1a-4b4f-bc70-f059e4fc3614\") " pod="openstack/nova-api-0" Dec 05 12:13:01 crc kubenswrapper[4763]: I1205 12:13:01.041844 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 12:13:01 crc kubenswrapper[4763]: I1205 12:13:01.448418 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 12:13:01 crc kubenswrapper[4763]: I1205 12:13:01.586836 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a08ac027-eaef-4ad0-8af6-45110d1c49e2-combined-ca-bundle\") pod \"a08ac027-eaef-4ad0-8af6-45110d1c49e2\" (UID: \"a08ac027-eaef-4ad0-8af6-45110d1c49e2\") " Dec 05 12:13:01 crc kubenswrapper[4763]: I1205 12:13:01.586982 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a08ac027-eaef-4ad0-8af6-45110d1c49e2-config-data\") pod \"a08ac027-eaef-4ad0-8af6-45110d1c49e2\" (UID: \"a08ac027-eaef-4ad0-8af6-45110d1c49e2\") " Dec 05 12:13:01 crc kubenswrapper[4763]: I1205 12:13:01.587106 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fx66f\" (UniqueName: \"kubernetes.io/projected/a08ac027-eaef-4ad0-8af6-45110d1c49e2-kube-api-access-fx66f\") pod \"a08ac027-eaef-4ad0-8af6-45110d1c49e2\" (UID: \"a08ac027-eaef-4ad0-8af6-45110d1c49e2\") " Dec 05 12:13:01 crc kubenswrapper[4763]: I1205 12:13:01.592056 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a08ac027-eaef-4ad0-8af6-45110d1c49e2-kube-api-access-fx66f" (OuterVolumeSpecName: "kube-api-access-fx66f") pod "a08ac027-eaef-4ad0-8af6-45110d1c49e2" (UID: "a08ac027-eaef-4ad0-8af6-45110d1c49e2"). InnerVolumeSpecName "kube-api-access-fx66f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:13:01 crc kubenswrapper[4763]: I1205 12:13:01.611012 4763 generic.go:334] "Generic (PLEG): container finished" podID="a08ac027-eaef-4ad0-8af6-45110d1c49e2" containerID="a6e87089becd202adebcc2419b7b154dc094b7c9d8cea46d6522ca5bc12f2f6d" exitCode=0 Dec 05 12:13:01 crc kubenswrapper[4763]: I1205 12:13:01.611086 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a08ac027-eaef-4ad0-8af6-45110d1c49e2","Type":"ContainerDied","Data":"a6e87089becd202adebcc2419b7b154dc094b7c9d8cea46d6522ca5bc12f2f6d"} Dec 05 12:13:01 crc kubenswrapper[4763]: I1205 12:13:01.611118 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a08ac027-eaef-4ad0-8af6-45110d1c49e2","Type":"ContainerDied","Data":"847ccdcd56c03e9421d015395acf76fce4e94def7de8ac837ad59e0ac93601fa"} Dec 05 12:13:01 crc kubenswrapper[4763]: I1205 12:13:01.611138 4763 scope.go:117] "RemoveContainer" containerID="a6e87089becd202adebcc2419b7b154dc094b7c9d8cea46d6522ca5bc12f2f6d" Dec 05 12:13:01 crc kubenswrapper[4763]: I1205 12:13:01.611948 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 12:13:01 crc kubenswrapper[4763]: I1205 12:13:01.629927 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a08ac027-eaef-4ad0-8af6-45110d1c49e2-config-data" (OuterVolumeSpecName: "config-data") pod "a08ac027-eaef-4ad0-8af6-45110d1c49e2" (UID: "a08ac027-eaef-4ad0-8af6-45110d1c49e2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:13:01 crc kubenswrapper[4763]: I1205 12:13:01.635198 4763 scope.go:117] "RemoveContainer" containerID="a6e87089becd202adebcc2419b7b154dc094b7c9d8cea46d6522ca5bc12f2f6d" Dec 05 12:13:01 crc kubenswrapper[4763]: E1205 12:13:01.635744 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6e87089becd202adebcc2419b7b154dc094b7c9d8cea46d6522ca5bc12f2f6d\": container with ID starting with a6e87089becd202adebcc2419b7b154dc094b7c9d8cea46d6522ca5bc12f2f6d not found: ID does not exist" containerID="a6e87089becd202adebcc2419b7b154dc094b7c9d8cea46d6522ca5bc12f2f6d" Dec 05 12:13:01 crc kubenswrapper[4763]: I1205 12:13:01.635801 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6e87089becd202adebcc2419b7b154dc094b7c9d8cea46d6522ca5bc12f2f6d"} err="failed to get container status \"a6e87089becd202adebcc2419b7b154dc094b7c9d8cea46d6522ca5bc12f2f6d\": rpc error: code = NotFound desc = could not find container \"a6e87089becd202adebcc2419b7b154dc094b7c9d8cea46d6522ca5bc12f2f6d\": container with ID starting with a6e87089becd202adebcc2419b7b154dc094b7c9d8cea46d6522ca5bc12f2f6d not found: ID does not exist" Dec 05 12:13:01 crc kubenswrapper[4763]: I1205 12:13:01.639605 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 12:13:01 crc kubenswrapper[4763]: W1205 12:13:01.641364 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ad0e748_bb1a_4b4f_bc70_f059e4fc3614.slice/crio-00bdc51bf4635816f531679e6fd3441d3c258518cbf551da8a9eaa8575261eae WatchSource:0}: Error finding container 00bdc51bf4635816f531679e6fd3441d3c258518cbf551da8a9eaa8575261eae: Status 404 returned error can't find the container with id 00bdc51bf4635816f531679e6fd3441d3c258518cbf551da8a9eaa8575261eae Dec 05 12:13:01 crc kubenswrapper[4763]: I1205 12:13:01.642153 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a08ac027-eaef-4ad0-8af6-45110d1c49e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a08ac027-eaef-4ad0-8af6-45110d1c49e2" (UID: "a08ac027-eaef-4ad0-8af6-45110d1c49e2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:13:01 crc kubenswrapper[4763]: I1205 12:13:01.689670 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fx66f\" (UniqueName: \"kubernetes.io/projected/a08ac027-eaef-4ad0-8af6-45110d1c49e2-kube-api-access-fx66f\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:01 crc kubenswrapper[4763]: I1205 12:13:01.689961 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a08ac027-eaef-4ad0-8af6-45110d1c49e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:01 crc kubenswrapper[4763]: I1205 12:13:01.689971 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a08ac027-eaef-4ad0-8af6-45110d1c49e2-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:01 crc kubenswrapper[4763]: I1205 12:13:01.799527 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1054e00-16a0-473b-b7a4-a3d00fc57c33" path="/var/lib/kubelet/pods/f1054e00-16a0-473b-b7a4-a3d00fc57c33/volumes" Dec 05 12:13:01 crc kubenswrapper[4763]: I1205 12:13:01.993774 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 12:13:02 crc kubenswrapper[4763]: I1205 12:13:02.009094 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 12:13:02 crc kubenswrapper[4763]: I1205 12:13:02.019091 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 12:13:02 crc kubenswrapper[4763]: E1205 12:13:02.019567 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a08ac027-eaef-4ad0-8af6-45110d1c49e2" containerName="nova-scheduler-scheduler" Dec 05 12:13:02 crc kubenswrapper[4763]: I1205 12:13:02.019588 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a08ac027-eaef-4ad0-8af6-45110d1c49e2" containerName="nova-scheduler-scheduler" Dec 05 12:13:02 crc kubenswrapper[4763]: I1205 12:13:02.019800 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a08ac027-eaef-4ad0-8af6-45110d1c49e2" containerName="nova-scheduler-scheduler" Dec 05 12:13:02 crc kubenswrapper[4763]: I1205 12:13:02.031959 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 12:13:02 crc kubenswrapper[4763]: I1205 12:13:02.035822 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 05 12:13:02 crc kubenswrapper[4763]: I1205 12:13:02.036328 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 12:13:02 crc kubenswrapper[4763]: I1205 12:13:02.096219 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e011a0f4-fec9-4c12-a229-2e63ef03037d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e011a0f4-fec9-4c12-a229-2e63ef03037d\") " pod="openstack/nova-scheduler-0" Dec 05 12:13:02 crc kubenswrapper[4763]: I1205 12:13:02.096475 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e011a0f4-fec9-4c12-a229-2e63ef03037d-config-data\") pod \"nova-scheduler-0\" (UID: \"e011a0f4-fec9-4c12-a229-2e63ef03037d\") " pod="openstack/nova-scheduler-0" Dec 05 12:13:02 crc kubenswrapper[4763]: I1205 12:13:02.096497 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vvmt\" (UniqueName: \"kubernetes.io/projected/e011a0f4-fec9-4c12-a229-2e63ef03037d-kube-api-access-6vvmt\") pod \"nova-scheduler-0\" (UID: \"e011a0f4-fec9-4c12-a229-2e63ef03037d\") " pod="openstack/nova-scheduler-0" Dec 05 12:13:02 crc kubenswrapper[4763]: I1205 12:13:02.198709 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e011a0f4-fec9-4c12-a229-2e63ef03037d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e011a0f4-fec9-4c12-a229-2e63ef03037d\") " pod="openstack/nova-scheduler-0" Dec 05 12:13:02 crc kubenswrapper[4763]: I1205 12:13:02.198806 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e011a0f4-fec9-4c12-a229-2e63ef03037d-config-data\") pod \"nova-scheduler-0\" (UID: \"e011a0f4-fec9-4c12-a229-2e63ef03037d\") " pod="openstack/nova-scheduler-0" Dec 05 12:13:02 crc kubenswrapper[4763]: I1205 12:13:02.198838 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vvmt\" (UniqueName: \"kubernetes.io/projected/e011a0f4-fec9-4c12-a229-2e63ef03037d-kube-api-access-6vvmt\") pod \"nova-scheduler-0\" (UID: \"e011a0f4-fec9-4c12-a229-2e63ef03037d\") " pod="openstack/nova-scheduler-0" Dec 05 12:13:02 crc kubenswrapper[4763]: I1205 12:13:02.204434 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e011a0f4-fec9-4c12-a229-2e63ef03037d-config-data\") pod \"nova-scheduler-0\" (UID: \"e011a0f4-fec9-4c12-a229-2e63ef03037d\") " pod="openstack/nova-scheduler-0" Dec 05 12:13:02 crc kubenswrapper[4763]: I1205 12:13:02.204483 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e011a0f4-fec9-4c12-a229-2e63ef03037d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e011a0f4-fec9-4c12-a229-2e63ef03037d\") " pod="openstack/nova-scheduler-0" Dec 05 12:13:02 crc kubenswrapper[4763]: I1205 12:13:02.221343 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vvmt\" (UniqueName: 
\"kubernetes.io/projected/e011a0f4-fec9-4c12-a229-2e63ef03037d-kube-api-access-6vvmt\") pod \"nova-scheduler-0\" (UID: \"e011a0f4-fec9-4c12-a229-2e63ef03037d\") " pod="openstack/nova-scheduler-0" Dec 05 12:13:02 crc kubenswrapper[4763]: I1205 12:13:02.352723 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 12:13:02 crc kubenswrapper[4763]: I1205 12:13:02.624049 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2ad0e748-bb1a-4b4f-bc70-f059e4fc3614","Type":"ContainerStarted","Data":"aee5c7dbdb58ed104663d4ab3c30b3b5dd252b049c9c8b9775c62726c99eb33d"} Dec 05 12:13:02 crc kubenswrapper[4763]: I1205 12:13:02.624398 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2ad0e748-bb1a-4b4f-bc70-f059e4fc3614","Type":"ContainerStarted","Data":"cb0261e7776a709ab17d8075cceb5cd52c69f2d63432c57655dcbf073d77d728"} Dec 05 12:13:02 crc kubenswrapper[4763]: I1205 12:13:02.624415 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2ad0e748-bb1a-4b4f-bc70-f059e4fc3614","Type":"ContainerStarted","Data":"00bdc51bf4635816f531679e6fd3441d3c258518cbf551da8a9eaa8575261eae"} Dec 05 12:13:02 crc kubenswrapper[4763]: I1205 12:13:02.647357 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.647338461 podStartE2EDuration="2.647338461s" podCreationTimestamp="2025-12-05 12:13:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:13:02.643119546 +0000 UTC m=+1467.135834289" watchObservedRunningTime="2025-12-05 12:13:02.647338461 +0000 UTC m=+1467.140053184" Dec 05 12:13:02 crc kubenswrapper[4763]: I1205 12:13:02.806381 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 12:13:02 crc kubenswrapper[4763]: I1205 12:13:02.890340 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="e519d490-70c9-43d1-87a3-8b559eb60f16" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.212:8775/\": read tcp 10.217.0.2:57032->10.217.0.212:8775: read: connection reset by peer" Dec 05 12:13:02 crc kubenswrapper[4763]: I1205 12:13:02.890411 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="e519d490-70c9-43d1-87a3-8b559eb60f16" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.212:8775/\": read tcp 10.217.0.2:57034->10.217.0.212:8775: read: connection reset by peer" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.321744 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.421980 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e519d490-70c9-43d1-87a3-8b559eb60f16-nova-metadata-tls-certs\") pod \"e519d490-70c9-43d1-87a3-8b559eb60f16\" (UID: \"e519d490-70c9-43d1-87a3-8b559eb60f16\") " Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.422045 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e519d490-70c9-43d1-87a3-8b559eb60f16-combined-ca-bundle\") pod \"e519d490-70c9-43d1-87a3-8b559eb60f16\" (UID: \"e519d490-70c9-43d1-87a3-8b559eb60f16\") " Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.422170 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e519d490-70c9-43d1-87a3-8b559eb60f16-logs\") pod \"e519d490-70c9-43d1-87a3-8b559eb60f16\" (UID: \"e519d490-70c9-43d1-87a3-8b559eb60f16\") " Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.422246 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e519d490-70c9-43d1-87a3-8b559eb60f16-config-data\") pod \"e519d490-70c9-43d1-87a3-8b559eb60f16\" (UID: \"e519d490-70c9-43d1-87a3-8b559eb60f16\") " Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.422327 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sp5cn\" (UniqueName: \"kubernetes.io/projected/e519d490-70c9-43d1-87a3-8b559eb60f16-kube-api-access-sp5cn\") pod \"e519d490-70c9-43d1-87a3-8b559eb60f16\" (UID: \"e519d490-70c9-43d1-87a3-8b559eb60f16\") " Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.424091 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e519d490-70c9-43d1-87a3-8b559eb60f16-logs" (OuterVolumeSpecName: "logs") pod "e519d490-70c9-43d1-87a3-8b559eb60f16" (UID: "e519d490-70c9-43d1-87a3-8b559eb60f16"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.427804 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e519d490-70c9-43d1-87a3-8b559eb60f16-kube-api-access-sp5cn" (OuterVolumeSpecName: "kube-api-access-sp5cn") pod "e519d490-70c9-43d1-87a3-8b559eb60f16" (UID: "e519d490-70c9-43d1-87a3-8b559eb60f16"). InnerVolumeSpecName "kube-api-access-sp5cn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.453398 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e519d490-70c9-43d1-87a3-8b559eb60f16-config-data" (OuterVolumeSpecName: "config-data") pod "e519d490-70c9-43d1-87a3-8b559eb60f16" (UID: "e519d490-70c9-43d1-87a3-8b559eb60f16"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.453966 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e519d490-70c9-43d1-87a3-8b559eb60f16-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e519d490-70c9-43d1-87a3-8b559eb60f16" (UID: "e519d490-70c9-43d1-87a3-8b559eb60f16"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.479121 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e519d490-70c9-43d1-87a3-8b559eb60f16-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "e519d490-70c9-43d1-87a3-8b559eb60f16" (UID: "e519d490-70c9-43d1-87a3-8b559eb60f16"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.525540 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e519d490-70c9-43d1-87a3-8b559eb60f16-logs\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.525571 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e519d490-70c9-43d1-87a3-8b559eb60f16-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.525585 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sp5cn\" (UniqueName: \"kubernetes.io/projected/e519d490-70c9-43d1-87a3-8b559eb60f16-kube-api-access-sp5cn\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.525597 4763 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e519d490-70c9-43d1-87a3-8b559eb60f16-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.525610 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e519d490-70c9-43d1-87a3-8b559eb60f16-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.641481 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e011a0f4-fec9-4c12-a229-2e63ef03037d","Type":"ContainerStarted","Data":"48e1363ab5e403ca463d2bf7b05fdc274975813d00ca2df49c52d0ae478fdb73"} Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.641541 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e011a0f4-fec9-4c12-a229-2e63ef03037d","Type":"ContainerStarted","Data":"4f20e106cd6a9bf96c4a10edad7c515ed8a160c2933a2ab601e79a5a2be1b8ab"} Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.645485 4763 generic.go:334] "Generic (PLEG): container finished" podID="e519d490-70c9-43d1-87a3-8b559eb60f16" containerID="f544fce27c107d4c023d6841bea473b4de5912c8841f9b0822b2134fb0c4271d" exitCode=0 Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.646498 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.652618 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e519d490-70c9-43d1-87a3-8b559eb60f16","Type":"ContainerDied","Data":"f544fce27c107d4c023d6841bea473b4de5912c8841f9b0822b2134fb0c4271d"} Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.652674 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e519d490-70c9-43d1-87a3-8b559eb60f16","Type":"ContainerDied","Data":"78d9938f6a8c8ad846b748b7757776dcb16a8828cbad43b0ae4fb26d1bf6fe5c"} Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.652691 4763 scope.go:117] "RemoveContainer" containerID="f544fce27c107d4c023d6841bea473b4de5912c8841f9b0822b2134fb0c4271d" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.677500 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.677466513 podStartE2EDuration="2.677466513s" podCreationTimestamp="2025-12-05 12:13:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:13:03.671413008 +0000 UTC m=+1468.164127811" watchObservedRunningTime="2025-12-05 12:13:03.677466513 +0000 UTC m=+1468.170181316" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.699361 4763 scope.go:117] "RemoveContainer" containerID="155194e26749779913f9654665839d02c729c47910f33980acacb654c4b898f8" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.719687 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.733835 4763 scope.go:117] "RemoveContainer" containerID="f544fce27c107d4c023d6841bea473b4de5912c8841f9b0822b2134fb0c4271d" Dec 05 12:13:03 crc kubenswrapper[4763]: E1205 12:13:03.734210 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f544fce27c107d4c023d6841bea473b4de5912c8841f9b0822b2134fb0c4271d\": container with ID starting with f544fce27c107d4c023d6841bea473b4de5912c8841f9b0822b2134fb0c4271d not found: ID does not exist" containerID="f544fce27c107d4c023d6841bea473b4de5912c8841f9b0822b2134fb0c4271d" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.734243 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f544fce27c107d4c023d6841bea473b4de5912c8841f9b0822b2134fb0c4271d"} err="failed to get container status \"f544fce27c107d4c023d6841bea473b4de5912c8841f9b0822b2134fb0c4271d\": rpc error: code = NotFound desc = could not find container \"f544fce27c107d4c023d6841bea473b4de5912c8841f9b0822b2134fb0c4271d\": container with ID starting with f544fce27c107d4c023d6841bea473b4de5912c8841f9b0822b2134fb0c4271d not found: ID does not exist" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.734267 4763 scope.go:117] "RemoveContainer" containerID="155194e26749779913f9654665839d02c729c47910f33980acacb654c4b898f8" Dec 05 12:13:03 crc kubenswrapper[4763]: E1205 12:13:03.734478 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"155194e26749779913f9654665839d02c729c47910f33980acacb654c4b898f8\": container with ID starting with 155194e26749779913f9654665839d02c729c47910f33980acacb654c4b898f8 not found: ID does not exist" 
containerID="155194e26749779913f9654665839d02c729c47910f33980acacb654c4b898f8" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.734504 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"155194e26749779913f9654665839d02c729c47910f33980acacb654c4b898f8"} err="failed to get container status \"155194e26749779913f9654665839d02c729c47910f33980acacb654c4b898f8\": rpc error: code = NotFound desc = could not find container \"155194e26749779913f9654665839d02c729c47910f33980acacb654c4b898f8\": container with ID starting with 155194e26749779913f9654665839d02c729c47910f33980acacb654c4b898f8 not found: ID does not exist" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.740394 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.745909 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 05 12:13:03 crc kubenswrapper[4763]: E1205 12:13:03.748622 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e519d490-70c9-43d1-87a3-8b559eb60f16" containerName="nova-metadata-metadata" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.748652 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e519d490-70c9-43d1-87a3-8b559eb60f16" containerName="nova-metadata-metadata" Dec 05 12:13:03 crc kubenswrapper[4763]: E1205 12:13:03.748675 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e519d490-70c9-43d1-87a3-8b559eb60f16" containerName="nova-metadata-log" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.748684 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e519d490-70c9-43d1-87a3-8b559eb60f16" containerName="nova-metadata-log" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.748890 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e519d490-70c9-43d1-87a3-8b559eb60f16" containerName="nova-metadata-log" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.748913 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e519d490-70c9-43d1-87a3-8b559eb60f16" containerName="nova-metadata-metadata" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.750182 4763 util.go:30] "No sandbox for pod can be found. 
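
The error pairs above ("ContainerStatus from runtime service failed" followed by "DeleteContainer returned error") are benign: the containers were already gone when the deferred RemoveContainer ran, so the runtime answers with gRPC NotFound and the kubelet logs it and continues. A sketch of that idempotent-cleanup pattern, using the standard grpc-go status package rather than kubelet's actual code:

    package main

    import (
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // ignoreNotFound treats "already gone" as success: deleting a container
    // that no longer exists is the desired end state, not a failure.
    func ignoreNotFound(err error) error {
        if status.Code(err) == codes.NotFound {
            return nil
        }
        return err
    }

    func main() {
        err := status.Error(codes.NotFound,
            `could not find container "f544fce2...": ID does not exist`)
        fmt.Println(ignoreNotFound(err)) // <nil>
    }
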
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.752116 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.752485 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.757700 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.801975 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a08ac027-eaef-4ad0-8af6-45110d1c49e2" path="/var/lib/kubelet/pods/a08ac027-eaef-4ad0-8af6-45110d1c49e2/volumes" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.803130 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e519d490-70c9-43d1-87a3-8b559eb60f16" path="/var/lib/kubelet/pods/e519d490-70c9-43d1-87a3-8b559eb60f16/volumes" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.837736 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb8fec18-c1b6-47de-91cc-7ef68caceb0e-logs\") pod \"nova-metadata-0\" (UID: \"cb8fec18-c1b6-47de-91cc-7ef68caceb0e\") " pod="openstack/nova-metadata-0" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.837901 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb8fec18-c1b6-47de-91cc-7ef68caceb0e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cb8fec18-c1b6-47de-91cc-7ef68caceb0e\") " pod="openstack/nova-metadata-0" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.837933 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9hkc\" (UniqueName: \"kubernetes.io/projected/cb8fec18-c1b6-47de-91cc-7ef68caceb0e-kube-api-access-q9hkc\") pod \"nova-metadata-0\" (UID: \"cb8fec18-c1b6-47de-91cc-7ef68caceb0e\") " pod="openstack/nova-metadata-0" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.837952 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb8fec18-c1b6-47de-91cc-7ef68caceb0e-config-data\") pod \"nova-metadata-0\" (UID: \"cb8fec18-c1b6-47de-91cc-7ef68caceb0e\") " pod="openstack/nova-metadata-0" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.838024 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb8fec18-c1b6-47de-91cc-7ef68caceb0e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cb8fec18-c1b6-47de-91cc-7ef68caceb0e\") " pod="openstack/nova-metadata-0" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.939329 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb8fec18-c1b6-47de-91cc-7ef68caceb0e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cb8fec18-c1b6-47de-91cc-7ef68caceb0e\") " pod="openstack/nova-metadata-0" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.939386 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9hkc\" (UniqueName: 
\"kubernetes.io/projected/cb8fec18-c1b6-47de-91cc-7ef68caceb0e-kube-api-access-q9hkc\") pod \"nova-metadata-0\" (UID: \"cb8fec18-c1b6-47de-91cc-7ef68caceb0e\") " pod="openstack/nova-metadata-0" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.939516 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb8fec18-c1b6-47de-91cc-7ef68caceb0e-config-data\") pod \"nova-metadata-0\" (UID: \"cb8fec18-c1b6-47de-91cc-7ef68caceb0e\") " pod="openstack/nova-metadata-0" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.939577 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb8fec18-c1b6-47de-91cc-7ef68caceb0e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cb8fec18-c1b6-47de-91cc-7ef68caceb0e\") " pod="openstack/nova-metadata-0" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.939641 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb8fec18-c1b6-47de-91cc-7ef68caceb0e-logs\") pod \"nova-metadata-0\" (UID: \"cb8fec18-c1b6-47de-91cc-7ef68caceb0e\") " pod="openstack/nova-metadata-0" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.940075 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb8fec18-c1b6-47de-91cc-7ef68caceb0e-logs\") pod \"nova-metadata-0\" (UID: \"cb8fec18-c1b6-47de-91cc-7ef68caceb0e\") " pod="openstack/nova-metadata-0" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.944801 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb8fec18-c1b6-47de-91cc-7ef68caceb0e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cb8fec18-c1b6-47de-91cc-7ef68caceb0e\") " pod="openstack/nova-metadata-0" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.945160 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb8fec18-c1b6-47de-91cc-7ef68caceb0e-config-data\") pod \"nova-metadata-0\" (UID: \"cb8fec18-c1b6-47de-91cc-7ef68caceb0e\") " pod="openstack/nova-metadata-0" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.950466 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb8fec18-c1b6-47de-91cc-7ef68caceb0e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cb8fec18-c1b6-47de-91cc-7ef68caceb0e\") " pod="openstack/nova-metadata-0" Dec 05 12:13:03 crc kubenswrapper[4763]: I1205 12:13:03.956909 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9hkc\" (UniqueName: \"kubernetes.io/projected/cb8fec18-c1b6-47de-91cc-7ef68caceb0e-kube-api-access-q9hkc\") pod \"nova-metadata-0\" (UID: \"cb8fec18-c1b6-47de-91cc-7ef68caceb0e\") " pod="openstack/nova-metadata-0" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.102409 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.140050 4763 util.go:48] "No ready sandbox for pod can be found. 
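
Worth noting in the remount sequence above: the replacement nova-metadata-0 carries a new pod UID (cb8fec18-... where the deleted instance was e519d490-...), so every volume is re-verified and re-mounted from scratch, and the old UID's directory is separately removed as orphaned. On disk, mounts are keyed by that UID under /var/lib/kubelet/pods, as the kubelet_volumes.go cleanup entries show. A small sketch of the path layout, with a hypothetical helper name:

    package main

    import (
        "fmt"
        "path/filepath"
    )

    // podVolumeDir builds the per-pod mount path visible throughout this
    // log: /var/lib/kubelet/pods/<pod-UID>/volumes/<plugin>/<volume-name>.
    // The helper name is illustrative, not a kubelet API.
    func podVolumeDir(podUID, plugin, volume string) string {
        return filepath.Join("/var/lib/kubelet/pods", podUID, "volumes", plugin, volume)
    }

    func main() {
        // Prints the full mount path for the recreated pod's TLS secret.
        fmt.Println(podVolumeDir("cb8fec18-c1b6-47de-91cc-7ef68caceb0e",
            "kubernetes.io~secret", "nova-metadata-tls-certs"))
    }
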
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.244477 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d18f4e2-29a0-4968-ba78-3318c073e41e-config-data\") pod \"0d18f4e2-29a0-4968-ba78-3318c073e41e\" (UID: \"0d18f4e2-29a0-4968-ba78-3318c073e41e\") " Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.244877 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d18f4e2-29a0-4968-ba78-3318c073e41e-run-httpd\") pod \"0d18f4e2-29a0-4968-ba78-3318c073e41e\" (UID: \"0d18f4e2-29a0-4968-ba78-3318c073e41e\") " Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.244909 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d18f4e2-29a0-4968-ba78-3318c073e41e-log-httpd\") pod \"0d18f4e2-29a0-4968-ba78-3318c073e41e\" (UID: \"0d18f4e2-29a0-4968-ba78-3318c073e41e\") " Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.244949 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pm6ks\" (UniqueName: \"kubernetes.io/projected/0d18f4e2-29a0-4968-ba78-3318c073e41e-kube-api-access-pm6ks\") pod \"0d18f4e2-29a0-4968-ba78-3318c073e41e\" (UID: \"0d18f4e2-29a0-4968-ba78-3318c073e41e\") " Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.244966 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0d18f4e2-29a0-4968-ba78-3318c073e41e-sg-core-conf-yaml\") pod \"0d18f4e2-29a0-4968-ba78-3318c073e41e\" (UID: \"0d18f4e2-29a0-4968-ba78-3318c073e41e\") " Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.245042 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d18f4e2-29a0-4968-ba78-3318c073e41e-combined-ca-bundle\") pod \"0d18f4e2-29a0-4968-ba78-3318c073e41e\" (UID: \"0d18f4e2-29a0-4968-ba78-3318c073e41e\") " Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.245075 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d18f4e2-29a0-4968-ba78-3318c073e41e-scripts\") pod \"0d18f4e2-29a0-4968-ba78-3318c073e41e\" (UID: \"0d18f4e2-29a0-4968-ba78-3318c073e41e\") " Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.245100 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d18f4e2-29a0-4968-ba78-3318c073e41e-ceilometer-tls-certs\") pod \"0d18f4e2-29a0-4968-ba78-3318c073e41e\" (UID: \"0d18f4e2-29a0-4968-ba78-3318c073e41e\") " Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.245475 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d18f4e2-29a0-4968-ba78-3318c073e41e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0d18f4e2-29a0-4968-ba78-3318c073e41e" (UID: "0d18f4e2-29a0-4968-ba78-3318c073e41e"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.245652 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d18f4e2-29a0-4968-ba78-3318c073e41e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0d18f4e2-29a0-4968-ba78-3318c073e41e" (UID: "0d18f4e2-29a0-4968-ba78-3318c073e41e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.245738 4763 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d18f4e2-29a0-4968-ba78-3318c073e41e-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.251992 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d18f4e2-29a0-4968-ba78-3318c073e41e-kube-api-access-pm6ks" (OuterVolumeSpecName: "kube-api-access-pm6ks") pod "0d18f4e2-29a0-4968-ba78-3318c073e41e" (UID: "0d18f4e2-29a0-4968-ba78-3318c073e41e"). InnerVolumeSpecName "kube-api-access-pm6ks". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.253916 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d18f4e2-29a0-4968-ba78-3318c073e41e-scripts" (OuterVolumeSpecName: "scripts") pod "0d18f4e2-29a0-4968-ba78-3318c073e41e" (UID: "0d18f4e2-29a0-4968-ba78-3318c073e41e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.286974 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d18f4e2-29a0-4968-ba78-3318c073e41e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0d18f4e2-29a0-4968-ba78-3318c073e41e" (UID: "0d18f4e2-29a0-4968-ba78-3318c073e41e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.316949 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d18f4e2-29a0-4968-ba78-3318c073e41e-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "0d18f4e2-29a0-4968-ba78-3318c073e41e" (UID: "0d18f4e2-29a0-4968-ba78-3318c073e41e"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.333485 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d18f4e2-29a0-4968-ba78-3318c073e41e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0d18f4e2-29a0-4968-ba78-3318c073e41e" (UID: "0d18f4e2-29a0-4968-ba78-3318c073e41e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.347403 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pm6ks\" (UniqueName: \"kubernetes.io/projected/0d18f4e2-29a0-4968-ba78-3318c073e41e-kube-api-access-pm6ks\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.347433 4763 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0d18f4e2-29a0-4968-ba78-3318c073e41e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.347442 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d18f4e2-29a0-4968-ba78-3318c073e41e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.347453 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d18f4e2-29a0-4968-ba78-3318c073e41e-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.347462 4763 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d18f4e2-29a0-4968-ba78-3318c073e41e-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.347469 4763 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d18f4e2-29a0-4968-ba78-3318c073e41e-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.360704 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d18f4e2-29a0-4968-ba78-3318c073e41e-config-data" (OuterVolumeSpecName: "config-data") pod "0d18f4e2-29a0-4968-ba78-3318c073e41e" (UID: "0d18f4e2-29a0-4968-ba78-3318c073e41e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.449128 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d18f4e2-29a0-4968-ba78-3318c073e41e-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:04 crc kubenswrapper[4763]: W1205 12:13:04.630678 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb8fec18_c1b6_47de_91cc_7ef68caceb0e.slice/crio-2626bd3016694cb8f7aa7dd734a508437c93c578fba115cbe1fdb926f16ade1e WatchSource:0}: Error finding container 2626bd3016694cb8f7aa7dd734a508437c93c578fba115cbe1fdb926f16ade1e: Status 404 returned error can't find the container with id 2626bd3016694cb8f7aa7dd734a508437c93c578fba115cbe1fdb926f16ade1e Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.636391 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.661973 4763 generic.go:334] "Generic (PLEG): container finished" podID="0d18f4e2-29a0-4968-ba78-3318c073e41e" containerID="cbbee0def389bd93b54846f23ce24a3267c56579048f419a29387a13ffe0629d" exitCode=0 Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.662007 4763 util.go:48] "No ready sandbox for pod can be found. 
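
The manager.go warning above ("Failed to process watch event ... Status 404") reflects a benign ordering race: the cgroup for the new crio-2626bd30... sandbox appears before the runtime has registered the container, so the first lookup comes back not-found, and the container is picked up moments later through the normal PLEG ContainerStarted path (visible a few entries below). A toy sketch of tolerating that transient miss; types and names are illustrative, not cadvisor's API:

    package main

    import (
        "errors"
        "fmt"
    )

    var errNotFound = errors.New("Status 404: can't find the container")

    // processWatchEvent drops events for containers the runtime has not
    // registered yet; a later lifecycle event supersedes the missed one.
    func processWatchEvent(lookup func(id string) error, id string) error {
        if err := lookup(id); errors.Is(err, errNotFound) {
            fmt.Println("transient: container not registered yet, skipping", id[:12])
            return nil
        } else if err != nil {
            return err
        }
        fmt.Println("container registered:", id[:12])
        return nil
    }

    func main() {
        _ = processWatchEvent(func(string) error { return errNotFound },
            "2626bd3016694cb8f7aa7dd734a508437c93c578fba115cbe1fdb926f16ade1e")
    }
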
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.662021 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d18f4e2-29a0-4968-ba78-3318c073e41e","Type":"ContainerDied","Data":"cbbee0def389bd93b54846f23ce24a3267c56579048f419a29387a13ffe0629d"} Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.663290 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d18f4e2-29a0-4968-ba78-3318c073e41e","Type":"ContainerDied","Data":"932e79e09130813417286147d1760a8414f214939328d66bbb321f471564ca25"} Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.663310 4763 scope.go:117] "RemoveContainer" containerID="3decd5ac3438c56203c3b178252d1774b1b62226e1ba208827a8d58d6f9e24cd" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.667070 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cb8fec18-c1b6-47de-91cc-7ef68caceb0e","Type":"ContainerStarted","Data":"2626bd3016694cb8f7aa7dd734a508437c93c578fba115cbe1fdb926f16ade1e"} Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.702649 4763 scope.go:117] "RemoveContainer" containerID="214c8003bb3fb8a852bae4c412cd03e0ee53039b9e650095254d90b5a282a9c9" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.721824 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.738570 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.765845 4763 scope.go:117] "RemoveContainer" containerID="f3b119e58c64cd8abc482d236d46a6f2cb3c583ad3ff170aae14e02602fadcf3" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.784063 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 12:13:04 crc kubenswrapper[4763]: E1205 12:13:04.784941 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d18f4e2-29a0-4968-ba78-3318c073e41e" containerName="proxy-httpd" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.784962 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d18f4e2-29a0-4968-ba78-3318c073e41e" containerName="proxy-httpd" Dec 05 12:13:04 crc kubenswrapper[4763]: E1205 12:13:04.784985 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d18f4e2-29a0-4968-ba78-3318c073e41e" containerName="ceilometer-notification-agent" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.784993 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d18f4e2-29a0-4968-ba78-3318c073e41e" containerName="ceilometer-notification-agent" Dec 05 12:13:04 crc kubenswrapper[4763]: E1205 12:13:04.785030 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d18f4e2-29a0-4968-ba78-3318c073e41e" containerName="ceilometer-central-agent" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.785040 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d18f4e2-29a0-4968-ba78-3318c073e41e" containerName="ceilometer-central-agent" Dec 05 12:13:04 crc kubenswrapper[4763]: E1205 12:13:04.785057 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d18f4e2-29a0-4968-ba78-3318c073e41e" containerName="sg-core" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.785065 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d18f4e2-29a0-4968-ba78-3318c073e41e" containerName="sg-core" Dec 05 12:13:04 
crc kubenswrapper[4763]: I1205 12:13:04.785483 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d18f4e2-29a0-4968-ba78-3318c073e41e" containerName="ceilometer-central-agent" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.785510 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d18f4e2-29a0-4968-ba78-3318c073e41e" containerName="sg-core" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.785539 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d18f4e2-29a0-4968-ba78-3318c073e41e" containerName="proxy-httpd" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.785569 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d18f4e2-29a0-4968-ba78-3318c073e41e" containerName="ceilometer-notification-agent" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.791588 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.797535 4763 scope.go:117] "RemoveContainer" containerID="cbbee0def389bd93b54846f23ce24a3267c56579048f419a29387a13ffe0629d" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.799016 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.799133 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.799140 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.815796 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.831262 4763 scope.go:117] "RemoveContainer" containerID="3decd5ac3438c56203c3b178252d1774b1b62226e1ba208827a8d58d6f9e24cd" Dec 05 12:13:04 crc kubenswrapper[4763]: E1205 12:13:04.832513 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3decd5ac3438c56203c3b178252d1774b1b62226e1ba208827a8d58d6f9e24cd\": container with ID starting with 3decd5ac3438c56203c3b178252d1774b1b62226e1ba208827a8d58d6f9e24cd not found: ID does not exist" containerID="3decd5ac3438c56203c3b178252d1774b1b62226e1ba208827a8d58d6f9e24cd" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.832564 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3decd5ac3438c56203c3b178252d1774b1b62226e1ba208827a8d58d6f9e24cd"} err="failed to get container status \"3decd5ac3438c56203c3b178252d1774b1b62226e1ba208827a8d58d6f9e24cd\": rpc error: code = NotFound desc = could not find container \"3decd5ac3438c56203c3b178252d1774b1b62226e1ba208827a8d58d6f9e24cd\": container with ID starting with 3decd5ac3438c56203c3b178252d1774b1b62226e1ba208827a8d58d6f9e24cd not found: ID does not exist" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.832591 4763 scope.go:117] "RemoveContainer" containerID="214c8003bb3fb8a852bae4c412cd03e0ee53039b9e650095254d90b5a282a9c9" Dec 05 12:13:04 crc kubenswrapper[4763]: E1205 12:13:04.834453 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"214c8003bb3fb8a852bae4c412cd03e0ee53039b9e650095254d90b5a282a9c9\": container with ID starting with 
214c8003bb3fb8a852bae4c412cd03e0ee53039b9e650095254d90b5a282a9c9 not found: ID does not exist" containerID="214c8003bb3fb8a852bae4c412cd03e0ee53039b9e650095254d90b5a282a9c9" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.834477 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"214c8003bb3fb8a852bae4c412cd03e0ee53039b9e650095254d90b5a282a9c9"} err="failed to get container status \"214c8003bb3fb8a852bae4c412cd03e0ee53039b9e650095254d90b5a282a9c9\": rpc error: code = NotFound desc = could not find container \"214c8003bb3fb8a852bae4c412cd03e0ee53039b9e650095254d90b5a282a9c9\": container with ID starting with 214c8003bb3fb8a852bae4c412cd03e0ee53039b9e650095254d90b5a282a9c9 not found: ID does not exist" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.834492 4763 scope.go:117] "RemoveContainer" containerID="f3b119e58c64cd8abc482d236d46a6f2cb3c583ad3ff170aae14e02602fadcf3" Dec 05 12:13:04 crc kubenswrapper[4763]: E1205 12:13:04.834713 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3b119e58c64cd8abc482d236d46a6f2cb3c583ad3ff170aae14e02602fadcf3\": container with ID starting with f3b119e58c64cd8abc482d236d46a6f2cb3c583ad3ff170aae14e02602fadcf3 not found: ID does not exist" containerID="f3b119e58c64cd8abc482d236d46a6f2cb3c583ad3ff170aae14e02602fadcf3" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.834728 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3b119e58c64cd8abc482d236d46a6f2cb3c583ad3ff170aae14e02602fadcf3"} err="failed to get container status \"f3b119e58c64cd8abc482d236d46a6f2cb3c583ad3ff170aae14e02602fadcf3\": rpc error: code = NotFound desc = could not find container \"f3b119e58c64cd8abc482d236d46a6f2cb3c583ad3ff170aae14e02602fadcf3\": container with ID starting with f3b119e58c64cd8abc482d236d46a6f2cb3c583ad3ff170aae14e02602fadcf3 not found: ID does not exist" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.834739 4763 scope.go:117] "RemoveContainer" containerID="cbbee0def389bd93b54846f23ce24a3267c56579048f419a29387a13ffe0629d" Dec 05 12:13:04 crc kubenswrapper[4763]: E1205 12:13:04.835635 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbbee0def389bd93b54846f23ce24a3267c56579048f419a29387a13ffe0629d\": container with ID starting with cbbee0def389bd93b54846f23ce24a3267c56579048f419a29387a13ffe0629d not found: ID does not exist" containerID="cbbee0def389bd93b54846f23ce24a3267c56579048f419a29387a13ffe0629d" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.835678 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbbee0def389bd93b54846f23ce24a3267c56579048f419a29387a13ffe0629d"} err="failed to get container status \"cbbee0def389bd93b54846f23ce24a3267c56579048f419a29387a13ffe0629d\": rpc error: code = NotFound desc = could not find container \"cbbee0def389bd93b54846f23ce24a3267c56579048f419a29387a13ffe0629d\": container with ID starting with cbbee0def389bd93b54846f23ce24a3267c56579048f419a29387a13ffe0629d not found: ID does not exist" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.873642 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/948c2855-16a9-47e2-96a4-70fe90181d9e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"948c2855-16a9-47e2-96a4-70fe90181d9e\") " pod="openstack/ceilometer-0" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.873746 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/948c2855-16a9-47e2-96a4-70fe90181d9e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"948c2855-16a9-47e2-96a4-70fe90181d9e\") " pod="openstack/ceilometer-0" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.873939 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/948c2855-16a9-47e2-96a4-70fe90181d9e-scripts\") pod \"ceilometer-0\" (UID: \"948c2855-16a9-47e2-96a4-70fe90181d9e\") " pod="openstack/ceilometer-0" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.874061 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/948c2855-16a9-47e2-96a4-70fe90181d9e-config-data\") pod \"ceilometer-0\" (UID: \"948c2855-16a9-47e2-96a4-70fe90181d9e\") " pod="openstack/ceilometer-0" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.874237 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/948c2855-16a9-47e2-96a4-70fe90181d9e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"948c2855-16a9-47e2-96a4-70fe90181d9e\") " pod="openstack/ceilometer-0" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.874420 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lkhb\" (UniqueName: \"kubernetes.io/projected/948c2855-16a9-47e2-96a4-70fe90181d9e-kube-api-access-9lkhb\") pod \"ceilometer-0\" (UID: \"948c2855-16a9-47e2-96a4-70fe90181d9e\") " pod="openstack/ceilometer-0" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.874445 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/948c2855-16a9-47e2-96a4-70fe90181d9e-log-httpd\") pod \"ceilometer-0\" (UID: \"948c2855-16a9-47e2-96a4-70fe90181d9e\") " pod="openstack/ceilometer-0" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.874561 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/948c2855-16a9-47e2-96a4-70fe90181d9e-run-httpd\") pod \"ceilometer-0\" (UID: \"948c2855-16a9-47e2-96a4-70fe90181d9e\") " pod="openstack/ceilometer-0" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.975975 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/948c2855-16a9-47e2-96a4-70fe90181d9e-config-data\") pod \"ceilometer-0\" (UID: \"948c2855-16a9-47e2-96a4-70fe90181d9e\") " pod="openstack/ceilometer-0" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.976050 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/948c2855-16a9-47e2-96a4-70fe90181d9e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"948c2855-16a9-47e2-96a4-70fe90181d9e\") " pod="openstack/ceilometer-0" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.976128 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9lkhb\" (UniqueName: \"kubernetes.io/projected/948c2855-16a9-47e2-96a4-70fe90181d9e-kube-api-access-9lkhb\") pod \"ceilometer-0\" (UID: \"948c2855-16a9-47e2-96a4-70fe90181d9e\") " pod="openstack/ceilometer-0" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.976151 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/948c2855-16a9-47e2-96a4-70fe90181d9e-log-httpd\") pod \"ceilometer-0\" (UID: \"948c2855-16a9-47e2-96a4-70fe90181d9e\") " pod="openstack/ceilometer-0" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.976193 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/948c2855-16a9-47e2-96a4-70fe90181d9e-run-httpd\") pod \"ceilometer-0\" (UID: \"948c2855-16a9-47e2-96a4-70fe90181d9e\") " pod="openstack/ceilometer-0" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.976229 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/948c2855-16a9-47e2-96a4-70fe90181d9e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"948c2855-16a9-47e2-96a4-70fe90181d9e\") " pod="openstack/ceilometer-0" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.976251 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/948c2855-16a9-47e2-96a4-70fe90181d9e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"948c2855-16a9-47e2-96a4-70fe90181d9e\") " pod="openstack/ceilometer-0" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.976276 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/948c2855-16a9-47e2-96a4-70fe90181d9e-scripts\") pod \"ceilometer-0\" (UID: \"948c2855-16a9-47e2-96a4-70fe90181d9e\") " pod="openstack/ceilometer-0" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.977631 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/948c2855-16a9-47e2-96a4-70fe90181d9e-log-httpd\") pod \"ceilometer-0\" (UID: \"948c2855-16a9-47e2-96a4-70fe90181d9e\") " pod="openstack/ceilometer-0" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.978093 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/948c2855-16a9-47e2-96a4-70fe90181d9e-run-httpd\") pod \"ceilometer-0\" (UID: \"948c2855-16a9-47e2-96a4-70fe90181d9e\") " pod="openstack/ceilometer-0" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.980514 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/948c2855-16a9-47e2-96a4-70fe90181d9e-scripts\") pod \"ceilometer-0\" (UID: \"948c2855-16a9-47e2-96a4-70fe90181d9e\") " pod="openstack/ceilometer-0" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.981739 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/948c2855-16a9-47e2-96a4-70fe90181d9e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"948c2855-16a9-47e2-96a4-70fe90181d9e\") " pod="openstack/ceilometer-0" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.981982 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/948c2855-16a9-47e2-96a4-70fe90181d9e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"948c2855-16a9-47e2-96a4-70fe90181d9e\") " pod="openstack/ceilometer-0" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.987458 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/948c2855-16a9-47e2-96a4-70fe90181d9e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"948c2855-16a9-47e2-96a4-70fe90181d9e\") " pod="openstack/ceilometer-0" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.993704 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/948c2855-16a9-47e2-96a4-70fe90181d9e-config-data\") pod \"ceilometer-0\" (UID: \"948c2855-16a9-47e2-96a4-70fe90181d9e\") " pod="openstack/ceilometer-0" Dec 05 12:13:04 crc kubenswrapper[4763]: I1205 12:13:04.993873 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lkhb\" (UniqueName: \"kubernetes.io/projected/948c2855-16a9-47e2-96a4-70fe90181d9e-kube-api-access-9lkhb\") pod \"ceilometer-0\" (UID: \"948c2855-16a9-47e2-96a4-70fe90181d9e\") " pod="openstack/ceilometer-0" Dec 05 12:13:05 crc kubenswrapper[4763]: I1205 12:13:05.129479 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 12:13:05 crc kubenswrapper[4763]: W1205 12:13:05.605276 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod948c2855_16a9_47e2_96a4_70fe90181d9e.slice/crio-bc18704e9c488c20b9dc6ad726a7fefe26121c0451c331308659678f74b28a86 WatchSource:0}: Error finding container bc18704e9c488c20b9dc6ad726a7fefe26121c0451c331308659678f74b28a86: Status 404 returned error can't find the container with id bc18704e9c488c20b9dc6ad726a7fefe26121c0451c331308659678f74b28a86 Dec 05 12:13:05 crc kubenswrapper[4763]: I1205 12:13:05.606132 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 12:13:05 crc kubenswrapper[4763]: I1205 12:13:05.680248 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cb8fec18-c1b6-47de-91cc-7ef68caceb0e","Type":"ContainerStarted","Data":"230d6e9c661103ba86cd60c7999b2c1b7cddf232c97d29b541b42e953fe9193f"} Dec 05 12:13:05 crc kubenswrapper[4763]: I1205 12:13:05.680311 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cb8fec18-c1b6-47de-91cc-7ef68caceb0e","Type":"ContainerStarted","Data":"a1a7d589bb4dfc10016b12d296f31d650bcd06e667fe3bc7c6c5e03cb1a03ec9"} Dec 05 12:13:05 crc kubenswrapper[4763]: I1205 12:13:05.682483 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"948c2855-16a9-47e2-96a4-70fe90181d9e","Type":"ContainerStarted","Data":"bc18704e9c488c20b9dc6ad726a7fefe26121c0451c331308659678f74b28a86"} Dec 05 12:13:05 crc kubenswrapper[4763]: I1205 12:13:05.716555 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.7165254130000003 podStartE2EDuration="2.716525413s" podCreationTimestamp="2025-12-05 12:13:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:13:05.701908843 +0000 UTC m=+1470.194623566" watchObservedRunningTime="2025-12-05 12:13:05.716525413 +0000 UTC 
m=+1470.209240146" Dec 05 12:13:05 crc kubenswrapper[4763]: I1205 12:13:05.801400 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d18f4e2-29a0-4968-ba78-3318c073e41e" path="/var/lib/kubelet/pods/0d18f4e2-29a0-4968-ba78-3318c073e41e/volumes" Dec 05 12:13:06 crc kubenswrapper[4763]: I1205 12:13:06.692730 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"948c2855-16a9-47e2-96a4-70fe90181d9e","Type":"ContainerStarted","Data":"1a1cafb6a98652dc9978725d15d678f92846d4ba576a18c5bd5ad14381c326c0"} Dec 05 12:13:07 crc kubenswrapper[4763]: I1205 12:13:07.353076 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 05 12:13:07 crc kubenswrapper[4763]: I1205 12:13:07.714146 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"948c2855-16a9-47e2-96a4-70fe90181d9e","Type":"ContainerStarted","Data":"31698661b90c7400dc17e3369dabeb631ef1b7428f0bdf074939e71d51a9a7fb"} Dec 05 12:13:07 crc kubenswrapper[4763]: I1205 12:13:07.714206 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"948c2855-16a9-47e2-96a4-70fe90181d9e","Type":"ContainerStarted","Data":"ed092a50639cfdb80ac3794b2f1a5202019871422aa4b89dac4ff3f62fadee60"} Dec 05 12:13:08 crc kubenswrapper[4763]: I1205 12:13:08.727726 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"948c2855-16a9-47e2-96a4-70fe90181d9e","Type":"ContainerStarted","Data":"2fe8a7e03267d2ebbc16ef7d728d3ea6ef1924b94efff83e55415aca52596371"} Dec 05 12:13:08 crc kubenswrapper[4763]: I1205 12:13:08.728404 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 12:13:08 crc kubenswrapper[4763]: I1205 12:13:08.753513 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.886238123 podStartE2EDuration="4.753495999s" podCreationTimestamp="2025-12-05 12:13:04 +0000 UTC" firstStartedPulling="2025-12-05 12:13:05.607955095 +0000 UTC m=+1470.100669818" lastFinishedPulling="2025-12-05 12:13:08.475212971 +0000 UTC m=+1472.967927694" observedRunningTime="2025-12-05 12:13:08.745574005 +0000 UTC m=+1473.238288738" watchObservedRunningTime="2025-12-05 12:13:08.753495999 +0000 UTC m=+1473.246210722" Dec 05 12:13:09 crc kubenswrapper[4763]: I1205 12:13:09.103474 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 12:13:09 crc kubenswrapper[4763]: I1205 12:13:09.103554 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 12:13:11 crc kubenswrapper[4763]: I1205 12:13:11.042460 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 12:13:11 crc kubenswrapper[4763]: I1205 12:13:11.043005 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 12:13:12 crc kubenswrapper[4763]: I1205 12:13:12.059931 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2ad0e748-bb1a-4b4f-bc70-f059e4fc3614" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.220:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:13:12 crc kubenswrapper[4763]: I1205 12:13:12.059947 4763 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/nova-api-0" podUID="2ad0e748-bb1a-4b4f-bc70-f059e4fc3614" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.220:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 12:13:12 crc kubenswrapper[4763]: I1205 12:13:12.353301 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 05 12:13:12 crc kubenswrapper[4763]: I1205 12:13:12.385204 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 05 12:13:12 crc kubenswrapper[4763]: I1205 12:13:12.831213 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 05 12:13:14 crc kubenswrapper[4763]: I1205 12:13:14.103479 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 12:13:14 crc kubenswrapper[4763]: I1205 12:13:14.103916 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 12:13:15 crc kubenswrapper[4763]: I1205 12:13:15.115886 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="cb8fec18-c1b6-47de-91cc-7ef68caceb0e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.222:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:13:15 crc kubenswrapper[4763]: I1205 12:13:15.115935 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="cb8fec18-c1b6-47de-91cc-7ef68caceb0e" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.222:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:13:21 crc kubenswrapper[4763]: I1205 12:13:21.049030 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 12:13:21 crc kubenswrapper[4763]: I1205 12:13:21.049634 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 12:13:21 crc kubenswrapper[4763]: I1205 12:13:21.050123 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 12:13:21 crc kubenswrapper[4763]: I1205 12:13:21.050150 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 12:13:21 crc kubenswrapper[4763]: I1205 12:13:21.055613 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 12:13:21 crc kubenswrapper[4763]: I1205 12:13:21.055969 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 12:13:24 crc kubenswrapper[4763]: I1205 12:13:24.110390 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 05 12:13:24 crc kubenswrapper[4763]: I1205 12:13:24.111223 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 05 12:13:24 crc kubenswrapper[4763]: I1205 12:13:24.116333 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 05 12:13:24 crc kubenswrapper[4763]: I1205 12:13:24.117020 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 05 12:13:35 crc kubenswrapper[4763]: I1205 
12:13:35.137635 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 05 12:13:44 crc kubenswrapper[4763]: I1205 12:13:44.286855 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-khc7k"] Dec 05 12:13:44 crc kubenswrapper[4763]: I1205 12:13:44.289981 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-khc7k" Dec 05 12:13:44 crc kubenswrapper[4763]: I1205 12:13:44.299626 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-khc7k"] Dec 05 12:13:44 crc kubenswrapper[4763]: I1205 12:13:44.470071 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ab9075b-3dbf-451f-bffb-7f76f79bab6c-utilities\") pod \"community-operators-khc7k\" (UID: \"8ab9075b-3dbf-451f-bffb-7f76f79bab6c\") " pod="openshift-marketplace/community-operators-khc7k" Dec 05 12:13:44 crc kubenswrapper[4763]: I1205 12:13:44.470231 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpj87\" (UniqueName: \"kubernetes.io/projected/8ab9075b-3dbf-451f-bffb-7f76f79bab6c-kube-api-access-xpj87\") pod \"community-operators-khc7k\" (UID: \"8ab9075b-3dbf-451f-bffb-7f76f79bab6c\") " pod="openshift-marketplace/community-operators-khc7k" Dec 05 12:13:44 crc kubenswrapper[4763]: I1205 12:13:44.470552 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ab9075b-3dbf-451f-bffb-7f76f79bab6c-catalog-content\") pod \"community-operators-khc7k\" (UID: \"8ab9075b-3dbf-451f-bffb-7f76f79bab6c\") " pod="openshift-marketplace/community-operators-khc7k" Dec 05 12:13:44 crc kubenswrapper[4763]: I1205 12:13:44.573114 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpj87\" (UniqueName: \"kubernetes.io/projected/8ab9075b-3dbf-451f-bffb-7f76f79bab6c-kube-api-access-xpj87\") pod \"community-operators-khc7k\" (UID: \"8ab9075b-3dbf-451f-bffb-7f76f79bab6c\") " pod="openshift-marketplace/community-operators-khc7k" Dec 05 12:13:44 crc kubenswrapper[4763]: I1205 12:13:44.573291 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ab9075b-3dbf-451f-bffb-7f76f79bab6c-catalog-content\") pod \"community-operators-khc7k\" (UID: \"8ab9075b-3dbf-451f-bffb-7f76f79bab6c\") " pod="openshift-marketplace/community-operators-khc7k" Dec 05 12:13:44 crc kubenswrapper[4763]: I1205 12:13:44.573348 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ab9075b-3dbf-451f-bffb-7f76f79bab6c-utilities\") pod \"community-operators-khc7k\" (UID: \"8ab9075b-3dbf-451f-bffb-7f76f79bab6c\") " pod="openshift-marketplace/community-operators-khc7k" Dec 05 12:13:44 crc kubenswrapper[4763]: I1205 12:13:44.573824 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ab9075b-3dbf-451f-bffb-7f76f79bab6c-catalog-content\") pod \"community-operators-khc7k\" (UID: \"8ab9075b-3dbf-451f-bffb-7f76f79bab6c\") " pod="openshift-marketplace/community-operators-khc7k" Dec 05 12:13:44 crc kubenswrapper[4763]: I1205 12:13:44.573903 4763 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ab9075b-3dbf-451f-bffb-7f76f79bab6c-utilities\") pod \"community-operators-khc7k\" (UID: \"8ab9075b-3dbf-451f-bffb-7f76f79bab6c\") " pod="openshift-marketplace/community-operators-khc7k" Dec 05 12:13:44 crc kubenswrapper[4763]: I1205 12:13:44.604905 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpj87\" (UniqueName: \"kubernetes.io/projected/8ab9075b-3dbf-451f-bffb-7f76f79bab6c-kube-api-access-xpj87\") pod \"community-operators-khc7k\" (UID: \"8ab9075b-3dbf-451f-bffb-7f76f79bab6c\") " pod="openshift-marketplace/community-operators-khc7k" Dec 05 12:13:44 crc kubenswrapper[4763]: I1205 12:13:44.619680 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-khc7k" Dec 05 12:13:45 crc kubenswrapper[4763]: I1205 12:13:45.164355 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-khc7k"] Dec 05 12:13:45 crc kubenswrapper[4763]: I1205 12:13:45.586666 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 12:13:46 crc kubenswrapper[4763]: I1205 12:13:46.157492 4763 generic.go:334] "Generic (PLEG): container finished" podID="8ab9075b-3dbf-451f-bffb-7f76f79bab6c" containerID="e590e6a402372c5efbcaaeb6e0d7cafac1e13024d153df9b04e2a779b877bef4" exitCode=0 Dec 05 12:13:46 crc kubenswrapper[4763]: I1205 12:13:46.158132 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khc7k" event={"ID":"8ab9075b-3dbf-451f-bffb-7f76f79bab6c","Type":"ContainerDied","Data":"e590e6a402372c5efbcaaeb6e0d7cafac1e13024d153df9b04e2a779b877bef4"} Dec 05 12:13:46 crc kubenswrapper[4763]: I1205 12:13:46.158280 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khc7k" event={"ID":"8ab9075b-3dbf-451f-bffb-7f76f79bab6c","Type":"ContainerStarted","Data":"3bf662ab69e9f80e0aec09d2cac097b25fff6ea1167d9d8b9d597c22a42657be"} Dec 05 12:13:46 crc kubenswrapper[4763]: I1205 12:13:46.481044 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 12:13:47 crc kubenswrapper[4763]: I1205 12:13:47.172090 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khc7k" event={"ID":"8ab9075b-3dbf-451f-bffb-7f76f79bab6c","Type":"ContainerStarted","Data":"b60e39bcc6fdaef0e30d4ad40d35e3db3e379c6e6cb7cb1112516bb502544853"} Dec 05 12:13:48 crc kubenswrapper[4763]: I1205 12:13:48.182367 4763 generic.go:334] "Generic (PLEG): container finished" podID="8ab9075b-3dbf-451f-bffb-7f76f79bab6c" containerID="b60e39bcc6fdaef0e30d4ad40d35e3db3e379c6e6cb7cb1112516bb502544853" exitCode=0 Dec 05 12:13:48 crc kubenswrapper[4763]: I1205 12:13:48.182440 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khc7k" event={"ID":"8ab9075b-3dbf-451f-bffb-7f76f79bab6c","Type":"ContainerDied","Data":"b60e39bcc6fdaef0e30d4ad40d35e3db3e379c6e6cb7cb1112516bb502544853"} Dec 05 12:13:49 crc kubenswrapper[4763]: I1205 12:13:49.195586 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khc7k" event={"ID":"8ab9075b-3dbf-451f-bffb-7f76f79bab6c","Type":"ContainerStarted","Data":"7c59c1953749410133d07d8ea85c54cb8ba0101815152bdcba11a57ab52eb3cd"} Dec 05 12:13:49 crc kubenswrapper[4763]: 
I1205 12:13:49.233895 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-khc7k" podStartSLOduration=2.813260142 podStartE2EDuration="5.233871159s" podCreationTimestamp="2025-12-05 12:13:44 +0000 UTC" firstStartedPulling="2025-12-05 12:13:46.159578364 +0000 UTC m=+1510.652293087" lastFinishedPulling="2025-12-05 12:13:48.580189381 +0000 UTC m=+1513.072904104" observedRunningTime="2025-12-05 12:13:49.221379228 +0000 UTC m=+1513.714093961" watchObservedRunningTime="2025-12-05 12:13:49.233871159 +0000 UTC m=+1513.726585882" Dec 05 12:13:50 crc kubenswrapper[4763]: I1205 12:13:50.403345 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="342a4872-4478-4b3a-a984-7fd457348435" containerName="rabbitmq" containerID="cri-o://1a787be5d24ee0521b1c21b4227dad5d8d4cc5fa74f51d2fcc376b30b77fa118" gracePeriod=604796 Dec 05 12:13:51 crc kubenswrapper[4763]: I1205 12:13:51.036622 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="85c70640-8bf7-419d-a96f-69ac3278710c" containerName="rabbitmq" containerID="cri-o://a6d6aef1f2859e2fa062d3ea2ad9ed014fe71beeda34557c6f7694d8291c17f1" gracePeriod=604796 Dec 05 12:13:54 crc kubenswrapper[4763]: I1205 12:13:54.620914 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-khc7k" Dec 05 12:13:54 crc kubenswrapper[4763]: I1205 12:13:54.621571 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-khc7k" Dec 05 12:13:54 crc kubenswrapper[4763]: I1205 12:13:54.676081 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-khc7k" Dec 05 12:13:55 crc kubenswrapper[4763]: I1205 12:13:55.313297 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-khc7k" Dec 05 12:13:55 crc kubenswrapper[4763]: I1205 12:13:55.357099 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-khc7k"] Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.020964 4763 util.go:48] "No ready sandbox for pod can be found. 
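
The pod_startup_latency_tracker entries in this log decompose consistently: podStartE2EDuration equals watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration further subtracts the image-pull window (lastFinishedPulling minus firstStartedPulling); pods that never pulled (zero-value pull timestamps, as for nova-scheduler-0 and nova-metadata-0) report the two as equal. The community-operators-khc7k entry above reproduces exactly under that reading:

    package main

    import (
        "fmt"
        "time"
    )

    // Layout matching the timestamps as printed in the log
    // (Go's time.Time.String() format).
    const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

    func mustParse(s string) time.Time {
        t, err := time.Parse(layout, s)
        if err != nil {
            panic(err)
        }
        return t
    }

    func main() {
        created := mustParse("2025-12-05 12:13:44 +0000 UTC")
        firstPull := mustParse("2025-12-05 12:13:46.159578364 +0000 UTC")
        lastPull := mustParse("2025-12-05 12:13:48.580189381 +0000 UTC")
        watchObserved := mustParse("2025-12-05 12:13:49.233871159 +0000 UTC")

        e2e := watchObserved.Sub(created)    // podStartE2EDuration
        slo := e2e - lastPull.Sub(firstPull) // podStartSLOduration: pull time excluded
        fmt.Println(e2e, slo)                // 5.233871159s 2.813260142s
    }
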
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.140062 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/342a4872-4478-4b3a-a984-7fd457348435-plugins-conf\") pod \"342a4872-4478-4b3a-a984-7fd457348435\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.140140 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kqq9\" (UniqueName: \"kubernetes.io/projected/342a4872-4478-4b3a-a984-7fd457348435-kube-api-access-5kqq9\") pod \"342a4872-4478-4b3a-a984-7fd457348435\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.140226 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/342a4872-4478-4b3a-a984-7fd457348435-erlang-cookie-secret\") pod \"342a4872-4478-4b3a-a984-7fd457348435\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.140316 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/342a4872-4478-4b3a-a984-7fd457348435-config-data\") pod \"342a4872-4478-4b3a-a984-7fd457348435\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.140334 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/342a4872-4478-4b3a-a984-7fd457348435-rabbitmq-tls\") pod \"342a4872-4478-4b3a-a984-7fd457348435\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.140358 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/342a4872-4478-4b3a-a984-7fd457348435-server-conf\") pod \"342a4872-4478-4b3a-a984-7fd457348435\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.140394 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/342a4872-4478-4b3a-a984-7fd457348435-pod-info\") pod \"342a4872-4478-4b3a-a984-7fd457348435\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.140509 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"342a4872-4478-4b3a-a984-7fd457348435\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.140531 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/342a4872-4478-4b3a-a984-7fd457348435-rabbitmq-plugins\") pod \"342a4872-4478-4b3a-a984-7fd457348435\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.140569 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/342a4872-4478-4b3a-a984-7fd457348435-rabbitmq-erlang-cookie\") pod \"342a4872-4478-4b3a-a984-7fd457348435\" (UID: 
\"342a4872-4478-4b3a-a984-7fd457348435\") " Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.140588 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/342a4872-4478-4b3a-a984-7fd457348435-rabbitmq-confd\") pod \"342a4872-4478-4b3a-a984-7fd457348435\" (UID: \"342a4872-4478-4b3a-a984-7fd457348435\") " Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.141063 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/342a4872-4478-4b3a-a984-7fd457348435-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "342a4872-4478-4b3a-a984-7fd457348435" (UID: "342a4872-4478-4b3a-a984-7fd457348435"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.144183 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/342a4872-4478-4b3a-a984-7fd457348435-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "342a4872-4478-4b3a-a984-7fd457348435" (UID: "342a4872-4478-4b3a-a984-7fd457348435"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.144597 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/342a4872-4478-4b3a-a984-7fd457348435-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "342a4872-4478-4b3a-a984-7fd457348435" (UID: "342a4872-4478-4b3a-a984-7fd457348435"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.155327 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/342a4872-4478-4b3a-a984-7fd457348435-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "342a4872-4478-4b3a-a984-7fd457348435" (UID: "342a4872-4478-4b3a-a984-7fd457348435"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.165606 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "342a4872-4478-4b3a-a984-7fd457348435" (UID: "342a4872-4478-4b3a-a984-7fd457348435"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.167217 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/342a4872-4478-4b3a-a984-7fd457348435-pod-info" (OuterVolumeSpecName: "pod-info") pod "342a4872-4478-4b3a-a984-7fd457348435" (UID: "342a4872-4478-4b3a-a984-7fd457348435"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.172853 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/342a4872-4478-4b3a-a984-7fd457348435-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "342a4872-4478-4b3a-a984-7fd457348435" (UID: "342a4872-4478-4b3a-a984-7fd457348435"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.173978 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/342a4872-4478-4b3a-a984-7fd457348435-kube-api-access-5kqq9" (OuterVolumeSpecName: "kube-api-access-5kqq9") pod "342a4872-4478-4b3a-a984-7fd457348435" (UID: "342a4872-4478-4b3a-a984-7fd457348435"). InnerVolumeSpecName "kube-api-access-5kqq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.201108 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/342a4872-4478-4b3a-a984-7fd457348435-config-data" (OuterVolumeSpecName: "config-data") pod "342a4872-4478-4b3a-a984-7fd457348435" (UID: "342a4872-4478-4b3a-a984-7fd457348435"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.244608 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/342a4872-4478-4b3a-a984-7fd457348435-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.244662 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/342a4872-4478-4b3a-a984-7fd457348435-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.244676 4763 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/342a4872-4478-4b3a-a984-7fd457348435-pod-info\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.244707 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.244721 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/342a4872-4478-4b3a-a984-7fd457348435-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.244732 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/342a4872-4478-4b3a-a984-7fd457348435-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.244742 4763 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/342a4872-4478-4b3a-a984-7fd457348435-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.244752 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kqq9\" (UniqueName: \"kubernetes.io/projected/342a4872-4478-4b3a-a984-7fd457348435-kube-api-access-5kqq9\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.244778 4763 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/342a4872-4478-4b3a-a984-7fd457348435-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.267128 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/342a4872-4478-4b3a-a984-7fd457348435-server-conf" (OuterVolumeSpecName: "server-conf") pod "342a4872-4478-4b3a-a984-7fd457348435" (UID: "342a4872-4478-4b3a-a984-7fd457348435"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.273335 4763 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.306965 4763 generic.go:334] "Generic (PLEG): container finished" podID="85c70640-8bf7-419d-a96f-69ac3278710c" containerID="a6d6aef1f2859e2fa062d3ea2ad9ed014fe71beeda34557c6f7694d8291c17f1" exitCode=0 Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.307150 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"85c70640-8bf7-419d-a96f-69ac3278710c","Type":"ContainerDied","Data":"a6d6aef1f2859e2fa062d3ea2ad9ed014fe71beeda34557c6f7694d8291c17f1"} Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.313044 4763 generic.go:334] "Generic (PLEG): container finished" podID="342a4872-4478-4b3a-a984-7fd457348435" containerID="1a787be5d24ee0521b1c21b4227dad5d8d4cc5fa74f51d2fcc376b30b77fa118" exitCode=0 Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.313416 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-khc7k" podUID="8ab9075b-3dbf-451f-bffb-7f76f79bab6c" containerName="registry-server" containerID="cri-o://7c59c1953749410133d07d8ea85c54cb8ba0101815152bdcba11a57ab52eb3cd" gracePeriod=2 Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.313523 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"342a4872-4478-4b3a-a984-7fd457348435","Type":"ContainerDied","Data":"1a787be5d24ee0521b1c21b4227dad5d8d4cc5fa74f51d2fcc376b30b77fa118"} Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.313576 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"342a4872-4478-4b3a-a984-7fd457348435","Type":"ContainerDied","Data":"1b8bce4dccb5009d80d93bdc31b23ae5c5b29d1c10c18710b83fbe9d927a3cc0"} Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.313594 4763 scope.go:117] "RemoveContainer" containerID="1a787be5d24ee0521b1c21b4227dad5d8d4cc5fa74f51d2fcc376b30b77fa118" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.313731 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.347166 4763 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/342a4872-4478-4b3a-a984-7fd457348435-server-conf\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.347195 4763 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.353800 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/342a4872-4478-4b3a-a984-7fd457348435-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "342a4872-4478-4b3a-a984-7fd457348435" (UID: "342a4872-4478-4b3a-a984-7fd457348435"). 
InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.442089 4763 scope.go:117] "RemoveContainer" containerID="e5d4bd0ecdf9f2d64e7b76f1cc29940eb5ff28a3be38447d8b2ea6c36682c0b2" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.456569 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/342a4872-4478-4b3a-a984-7fd457348435-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.474371 4763 scope.go:117] "RemoveContainer" containerID="1a787be5d24ee0521b1c21b4227dad5d8d4cc5fa74f51d2fcc376b30b77fa118" Dec 05 12:13:57 crc kubenswrapper[4763]: E1205 12:13:57.475112 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a787be5d24ee0521b1c21b4227dad5d8d4cc5fa74f51d2fcc376b30b77fa118\": container with ID starting with 1a787be5d24ee0521b1c21b4227dad5d8d4cc5fa74f51d2fcc376b30b77fa118 not found: ID does not exist" containerID="1a787be5d24ee0521b1c21b4227dad5d8d4cc5fa74f51d2fcc376b30b77fa118" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.475150 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a787be5d24ee0521b1c21b4227dad5d8d4cc5fa74f51d2fcc376b30b77fa118"} err="failed to get container status \"1a787be5d24ee0521b1c21b4227dad5d8d4cc5fa74f51d2fcc376b30b77fa118\": rpc error: code = NotFound desc = could not find container \"1a787be5d24ee0521b1c21b4227dad5d8d4cc5fa74f51d2fcc376b30b77fa118\": container with ID starting with 1a787be5d24ee0521b1c21b4227dad5d8d4cc5fa74f51d2fcc376b30b77fa118 not found: ID does not exist" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.475176 4763 scope.go:117] "RemoveContainer" containerID="e5d4bd0ecdf9f2d64e7b76f1cc29940eb5ff28a3be38447d8b2ea6c36682c0b2" Dec 05 12:13:57 crc kubenswrapper[4763]: E1205 12:13:57.475666 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5d4bd0ecdf9f2d64e7b76f1cc29940eb5ff28a3be38447d8b2ea6c36682c0b2\": container with ID starting with e5d4bd0ecdf9f2d64e7b76f1cc29940eb5ff28a3be38447d8b2ea6c36682c0b2 not found: ID does not exist" containerID="e5d4bd0ecdf9f2d64e7b76f1cc29940eb5ff28a3be38447d8b2ea6c36682c0b2" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.475700 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5d4bd0ecdf9f2d64e7b76f1cc29940eb5ff28a3be38447d8b2ea6c36682c0b2"} err="failed to get container status \"e5d4bd0ecdf9f2d64e7b76f1cc29940eb5ff28a3be38447d8b2ea6c36682c0b2\": rpc error: code = NotFound desc = could not find container \"e5d4bd0ecdf9f2d64e7b76f1cc29940eb5ff28a3be38447d8b2ea6c36682c0b2\": container with ID starting with e5d4bd0ecdf9f2d64e7b76f1cc29940eb5ff28a3be38447d8b2ea6c36682c0b2 not found: ID does not exist" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.660644 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.695723 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.773535 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 12:13:57 crc kubenswrapper[4763]: E1205 12:13:57.780252 4763 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="342a4872-4478-4b3a-a984-7fd457348435" containerName="setup-container" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.780299 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="342a4872-4478-4b3a-a984-7fd457348435" containerName="setup-container" Dec 05 12:13:57 crc kubenswrapper[4763]: E1205 12:13:57.780380 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="342a4872-4478-4b3a-a984-7fd457348435" containerName="rabbitmq" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.780393 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="342a4872-4478-4b3a-a984-7fd457348435" containerName="rabbitmq" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.786034 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.792756 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="342a4872-4478-4b3a-a984-7fd457348435" containerName="rabbitmq" Dec 05 12:13:57 crc kubenswrapper[4763]: E1205 12:13:57.793427 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85c70640-8bf7-419d-a96f-69ac3278710c" containerName="rabbitmq" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.793446 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="85c70640-8bf7-419d-a96f-69ac3278710c" containerName="rabbitmq" Dec 05 12:13:57 crc kubenswrapper[4763]: E1205 12:13:57.793500 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85c70640-8bf7-419d-a96f-69ac3278710c" containerName="setup-container" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.793511 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="85c70640-8bf7-419d-a96f-69ac3278710c" containerName="setup-container" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.794062 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="85c70640-8bf7-419d-a96f-69ac3278710c" containerName="rabbitmq" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.802150 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.808972 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.809279 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.809525 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.809662 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.813635 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.814126 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-7gw5s" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.814206 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.852581 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="342a4872-4478-4b3a-a984-7fd457348435" path="/var/lib/kubelet/pods/342a4872-4478-4b3a-a984-7fd457348435/volumes" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.858350 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.888835 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/85c70640-8bf7-419d-a96f-69ac3278710c-server-conf\") pod \"85c70640-8bf7-419d-a96f-69ac3278710c\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.888914 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7s9dm\" (UniqueName: \"kubernetes.io/projected/85c70640-8bf7-419d-a96f-69ac3278710c-kube-api-access-7s9dm\") pod \"85c70640-8bf7-419d-a96f-69ac3278710c\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.888941 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/85c70640-8bf7-419d-a96f-69ac3278710c-pod-info\") pod \"85c70640-8bf7-419d-a96f-69ac3278710c\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.888964 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/85c70640-8bf7-419d-a96f-69ac3278710c-erlang-cookie-secret\") pod \"85c70640-8bf7-419d-a96f-69ac3278710c\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.888980 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/85c70640-8bf7-419d-a96f-69ac3278710c-rabbitmq-confd\") pod \"85c70640-8bf7-419d-a96f-69ac3278710c\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.889187 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/85c70640-8bf7-419d-a96f-69ac3278710c-rabbitmq-plugins\") pod \"85c70640-8bf7-419d-a96f-69ac3278710c\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.889216 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/85c70640-8bf7-419d-a96f-69ac3278710c-rabbitmq-tls\") pod \"85c70640-8bf7-419d-a96f-69ac3278710c\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.889288 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/85c70640-8bf7-419d-a96f-69ac3278710c-rabbitmq-erlang-cookie\") pod \"85c70640-8bf7-419d-a96f-69ac3278710c\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.889406 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"85c70640-8bf7-419d-a96f-69ac3278710c\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.889435 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85c70640-8bf7-419d-a96f-69ac3278710c-config-data\") pod \"85c70640-8bf7-419d-a96f-69ac3278710c\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.889603 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/85c70640-8bf7-419d-a96f-69ac3278710c-plugins-conf\") pod \"85c70640-8bf7-419d-a96f-69ac3278710c\" (UID: \"85c70640-8bf7-419d-a96f-69ac3278710c\") " Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.889870 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c2041e23-d29c-4a1a-9787-aa0e19c9f764-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c2041e23-d29c-4a1a-9787-aa0e19c9f764\") " pod="openstack/rabbitmq-server-0" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.889895 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c2041e23-d29c-4a1a-9787-aa0e19c9f764-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c2041e23-d29c-4a1a-9787-aa0e19c9f764\") " pod="openstack/rabbitmq-server-0" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.889922 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c2041e23-d29c-4a1a-9787-aa0e19c9f764-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c2041e23-d29c-4a1a-9787-aa0e19c9f764\") " pod="openstack/rabbitmq-server-0" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.889955 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c2041e23-d29c-4a1a-9787-aa0e19c9f764-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c2041e23-d29c-4a1a-9787-aa0e19c9f764\") " pod="openstack/rabbitmq-server-0" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 
12:13:57.889985 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c2041e23-d29c-4a1a-9787-aa0e19c9f764-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c2041e23-d29c-4a1a-9787-aa0e19c9f764\") " pod="openstack/rabbitmq-server-0" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.890017 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c2041e23-d29c-4a1a-9787-aa0e19c9f764-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c2041e23-d29c-4a1a-9787-aa0e19c9f764\") " pod="openstack/rabbitmq-server-0" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.890044 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8ljd\" (UniqueName: \"kubernetes.io/projected/c2041e23-d29c-4a1a-9787-aa0e19c9f764-kube-api-access-v8ljd\") pod \"rabbitmq-server-0\" (UID: \"c2041e23-d29c-4a1a-9787-aa0e19c9f764\") " pod="openstack/rabbitmq-server-0" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.890075 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c2041e23-d29c-4a1a-9787-aa0e19c9f764-config-data\") pod \"rabbitmq-server-0\" (UID: \"c2041e23-d29c-4a1a-9787-aa0e19c9f764\") " pod="openstack/rabbitmq-server-0" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.890090 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c2041e23-d29c-4a1a-9787-aa0e19c9f764-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c2041e23-d29c-4a1a-9787-aa0e19c9f764\") " pod="openstack/rabbitmq-server-0" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.890173 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"c2041e23-d29c-4a1a-9787-aa0e19c9f764\") " pod="openstack/rabbitmq-server-0" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.890204 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c2041e23-d29c-4a1a-9787-aa0e19c9f764-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c2041e23-d29c-4a1a-9787-aa0e19c9f764\") " pod="openstack/rabbitmq-server-0" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.890657 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85c70640-8bf7-419d-a96f-69ac3278710c-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "85c70640-8bf7-419d-a96f-69ac3278710c" (UID: "85c70640-8bf7-419d-a96f-69ac3278710c"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.894236 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85c70640-8bf7-419d-a96f-69ac3278710c-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "85c70640-8bf7-419d-a96f-69ac3278710c" (UID: "85c70640-8bf7-419d-a96f-69ac3278710c"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.894476 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85c70640-8bf7-419d-a96f-69ac3278710c-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "85c70640-8bf7-419d-a96f-69ac3278710c" (UID: "85c70640-8bf7-419d-a96f-69ac3278710c"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.897052 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85c70640-8bf7-419d-a96f-69ac3278710c-kube-api-access-7s9dm" (OuterVolumeSpecName: "kube-api-access-7s9dm") pod "85c70640-8bf7-419d-a96f-69ac3278710c" (UID: "85c70640-8bf7-419d-a96f-69ac3278710c"). InnerVolumeSpecName "kube-api-access-7s9dm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.897196 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85c70640-8bf7-419d-a96f-69ac3278710c-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "85c70640-8bf7-419d-a96f-69ac3278710c" (UID: "85c70640-8bf7-419d-a96f-69ac3278710c"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.898478 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85c70640-8bf7-419d-a96f-69ac3278710c-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "85c70640-8bf7-419d-a96f-69ac3278710c" (UID: "85c70640-8bf7-419d-a96f-69ac3278710c"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.898679 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/85c70640-8bf7-419d-a96f-69ac3278710c-pod-info" (OuterVolumeSpecName: "pod-info") pod "85c70640-8bf7-419d-a96f-69ac3278710c" (UID: "85c70640-8bf7-419d-a96f-69ac3278710c"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.908980 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "85c70640-8bf7-419d-a96f-69ac3278710c" (UID: "85c70640-8bf7-419d-a96f-69ac3278710c"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.943462 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85c70640-8bf7-419d-a96f-69ac3278710c-config-data" (OuterVolumeSpecName: "config-data") pod "85c70640-8bf7-419d-a96f-69ac3278710c" (UID: "85c70640-8bf7-419d-a96f-69ac3278710c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.992831 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c2041e23-d29c-4a1a-9787-aa0e19c9f764-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c2041e23-d29c-4a1a-9787-aa0e19c9f764\") " pod="openstack/rabbitmq-server-0" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.992910 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"c2041e23-d29c-4a1a-9787-aa0e19c9f764\") " pod="openstack/rabbitmq-server-0" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.992978 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c2041e23-d29c-4a1a-9787-aa0e19c9f764-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c2041e23-d29c-4a1a-9787-aa0e19c9f764\") " pod="openstack/rabbitmq-server-0" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.993030 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c2041e23-d29c-4a1a-9787-aa0e19c9f764-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c2041e23-d29c-4a1a-9787-aa0e19c9f764\") " pod="openstack/rabbitmq-server-0" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.993063 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c2041e23-d29c-4a1a-9787-aa0e19c9f764-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c2041e23-d29c-4a1a-9787-aa0e19c9f764\") " pod="openstack/rabbitmq-server-0" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.993984 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c2041e23-d29c-4a1a-9787-aa0e19c9f764-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c2041e23-d29c-4a1a-9787-aa0e19c9f764\") " pod="openstack/rabbitmq-server-0" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.994091 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c2041e23-d29c-4a1a-9787-aa0e19c9f764-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c2041e23-d29c-4a1a-9787-aa0e19c9f764\") " pod="openstack/rabbitmq-server-0" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.994170 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c2041e23-d29c-4a1a-9787-aa0e19c9f764-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c2041e23-d29c-4a1a-9787-aa0e19c9f764\") " pod="openstack/rabbitmq-server-0" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.994212 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8ljd\" (UniqueName: \"kubernetes.io/projected/c2041e23-d29c-4a1a-9787-aa0e19c9f764-kube-api-access-v8ljd\") pod \"rabbitmq-server-0\" (UID: \"c2041e23-d29c-4a1a-9787-aa0e19c9f764\") " pod="openstack/rabbitmq-server-0" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.994290 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/c2041e23-d29c-4a1a-9787-aa0e19c9f764-config-data\") pod \"rabbitmq-server-0\" (UID: \"c2041e23-d29c-4a1a-9787-aa0e19c9f764\") " pod="openstack/rabbitmq-server-0" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.994313 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c2041e23-d29c-4a1a-9787-aa0e19c9f764-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c2041e23-d29c-4a1a-9787-aa0e19c9f764\") " pod="openstack/rabbitmq-server-0" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.994556 4763 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/85c70640-8bf7-419d-a96f-69ac3278710c-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.994574 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7s9dm\" (UniqueName: \"kubernetes.io/projected/85c70640-8bf7-419d-a96f-69ac3278710c-kube-api-access-7s9dm\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.994586 4763 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/85c70640-8bf7-419d-a96f-69ac3278710c-pod-info\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.994597 4763 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/85c70640-8bf7-419d-a96f-69ac3278710c-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.994609 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/85c70640-8bf7-419d-a96f-69ac3278710c-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.994622 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/85c70640-8bf7-419d-a96f-69ac3278710c-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.994633 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/85c70640-8bf7-419d-a96f-69ac3278710c-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.994655 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.994666 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85c70640-8bf7-419d-a96f-69ac3278710c-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:57 crc kubenswrapper[4763]: I1205 12:13:57.994888 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"c2041e23-d29c-4a1a-9787-aa0e19c9f764\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:57.999897 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/c2041e23-d29c-4a1a-9787-aa0e19c9f764-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c2041e23-d29c-4a1a-9787-aa0e19c9f764\") " pod="openstack/rabbitmq-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.000081 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c2041e23-d29c-4a1a-9787-aa0e19c9f764-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c2041e23-d29c-4a1a-9787-aa0e19c9f764\") " pod="openstack/rabbitmq-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.001684 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c2041e23-d29c-4a1a-9787-aa0e19c9f764-config-data\") pod \"rabbitmq-server-0\" (UID: \"c2041e23-d29c-4a1a-9787-aa0e19c9f764\") " pod="openstack/rabbitmq-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.001753 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c2041e23-d29c-4a1a-9787-aa0e19c9f764-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c2041e23-d29c-4a1a-9787-aa0e19c9f764\") " pod="openstack/rabbitmq-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.002301 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c2041e23-d29c-4a1a-9787-aa0e19c9f764-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c2041e23-d29c-4a1a-9787-aa0e19c9f764\") " pod="openstack/rabbitmq-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.002414 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c2041e23-d29c-4a1a-9787-aa0e19c9f764-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c2041e23-d29c-4a1a-9787-aa0e19c9f764\") " pod="openstack/rabbitmq-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.003565 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c2041e23-d29c-4a1a-9787-aa0e19c9f764-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c2041e23-d29c-4a1a-9787-aa0e19c9f764\") " pod="openstack/rabbitmq-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.019931 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c2041e23-d29c-4a1a-9787-aa0e19c9f764-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c2041e23-d29c-4a1a-9787-aa0e19c9f764\") " pod="openstack/rabbitmq-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.025922 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8ljd\" (UniqueName: \"kubernetes.io/projected/c2041e23-d29c-4a1a-9787-aa0e19c9f764-kube-api-access-v8ljd\") pod \"rabbitmq-server-0\" (UID: \"c2041e23-d29c-4a1a-9787-aa0e19c9f764\") " pod="openstack/rabbitmq-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.038645 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c2041e23-d29c-4a1a-9787-aa0e19c9f764-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c2041e23-d29c-4a1a-9787-aa0e19c9f764\") " pod="openstack/rabbitmq-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.039238 4763 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/85c70640-8bf7-419d-a96f-69ac3278710c-server-conf" (OuterVolumeSpecName: "server-conf") pod "85c70640-8bf7-419d-a96f-69ac3278710c" (UID: "85c70640-8bf7-419d-a96f-69ac3278710c"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.048899 4763 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.080586 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"c2041e23-d29c-4a1a-9787-aa0e19c9f764\") " pod="openstack/rabbitmq-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.085735 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-khc7k" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.096272 4763 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/85c70640-8bf7-419d-a96f-69ac3278710c-server-conf\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.096300 4763 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.119416 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85c70640-8bf7-419d-a96f-69ac3278710c-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "85c70640-8bf7-419d-a96f-69ac3278710c" (UID: "85c70640-8bf7-419d-a96f-69ac3278710c"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.156957 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.197637 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ab9075b-3dbf-451f-bffb-7f76f79bab6c-utilities\") pod \"8ab9075b-3dbf-451f-bffb-7f76f79bab6c\" (UID: \"8ab9075b-3dbf-451f-bffb-7f76f79bab6c\") " Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.197886 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ab9075b-3dbf-451f-bffb-7f76f79bab6c-catalog-content\") pod \"8ab9075b-3dbf-451f-bffb-7f76f79bab6c\" (UID: \"8ab9075b-3dbf-451f-bffb-7f76f79bab6c\") " Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.198029 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpj87\" (UniqueName: \"kubernetes.io/projected/8ab9075b-3dbf-451f-bffb-7f76f79bab6c-kube-api-access-xpj87\") pod \"8ab9075b-3dbf-451f-bffb-7f76f79bab6c\" (UID: \"8ab9075b-3dbf-451f-bffb-7f76f79bab6c\") " Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.198611 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/85c70640-8bf7-419d-a96f-69ac3278710c-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.200209 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ab9075b-3dbf-451f-bffb-7f76f79bab6c-utilities" (OuterVolumeSpecName: "utilities") pod "8ab9075b-3dbf-451f-bffb-7f76f79bab6c" (UID: "8ab9075b-3dbf-451f-bffb-7f76f79bab6c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.204140 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ab9075b-3dbf-451f-bffb-7f76f79bab6c-kube-api-access-xpj87" (OuterVolumeSpecName: "kube-api-access-xpj87") pod "8ab9075b-3dbf-451f-bffb-7f76f79bab6c" (UID: "8ab9075b-3dbf-451f-bffb-7f76f79bab6c"). InnerVolumeSpecName "kube-api-access-xpj87". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.256137 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ab9075b-3dbf-451f-bffb-7f76f79bab6c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8ab9075b-3dbf-451f-bffb-7f76f79bab6c" (UID: "8ab9075b-3dbf-451f-bffb-7f76f79bab6c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.300466 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ab9075b-3dbf-451f-bffb-7f76f79bab6c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.300507 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpj87\" (UniqueName: \"kubernetes.io/projected/8ab9075b-3dbf-451f-bffb-7f76f79bab6c-kube-api-access-xpj87\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.300517 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ab9075b-3dbf-451f-bffb-7f76f79bab6c-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.332960 4763 generic.go:334] "Generic (PLEG): container finished" podID="8ab9075b-3dbf-451f-bffb-7f76f79bab6c" containerID="7c59c1953749410133d07d8ea85c54cb8ba0101815152bdcba11a57ab52eb3cd" exitCode=0 Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.333029 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khc7k" event={"ID":"8ab9075b-3dbf-451f-bffb-7f76f79bab6c","Type":"ContainerDied","Data":"7c59c1953749410133d07d8ea85c54cb8ba0101815152bdcba11a57ab52eb3cd"} Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.333062 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khc7k" event={"ID":"8ab9075b-3dbf-451f-bffb-7f76f79bab6c","Type":"ContainerDied","Data":"3bf662ab69e9f80e0aec09d2cac097b25fff6ea1167d9d8b9d597c22a42657be"} Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.333099 4763 scope.go:117] "RemoveContainer" containerID="7c59c1953749410133d07d8ea85c54cb8ba0101815152bdcba11a57ab52eb3cd" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.333527 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-khc7k" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.338751 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"85c70640-8bf7-419d-a96f-69ac3278710c","Type":"ContainerDied","Data":"237de87d14b63e636c69c36d1a31e91b68864802a658ea9059660bcbae9092d4"} Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.338908 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.370634 4763 scope.go:117] "RemoveContainer" containerID="b60e39bcc6fdaef0e30d4ad40d35e3db3e379c6e6cb7cb1112516bb502544853" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.415438 4763 scope.go:117] "RemoveContainer" containerID="e590e6a402372c5efbcaaeb6e0d7cafac1e13024d153df9b04e2a779b877bef4" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.418449 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-khc7k"] Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.441863 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-khc7k"] Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.473128 4763 scope.go:117] "RemoveContainer" containerID="7c59c1953749410133d07d8ea85c54cb8ba0101815152bdcba11a57ab52eb3cd" Dec 05 12:13:58 crc kubenswrapper[4763]: E1205 12:13:58.473643 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c59c1953749410133d07d8ea85c54cb8ba0101815152bdcba11a57ab52eb3cd\": container with ID starting with 7c59c1953749410133d07d8ea85c54cb8ba0101815152bdcba11a57ab52eb3cd not found: ID does not exist" containerID="7c59c1953749410133d07d8ea85c54cb8ba0101815152bdcba11a57ab52eb3cd" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.473967 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c59c1953749410133d07d8ea85c54cb8ba0101815152bdcba11a57ab52eb3cd"} err="failed to get container status \"7c59c1953749410133d07d8ea85c54cb8ba0101815152bdcba11a57ab52eb3cd\": rpc error: code = NotFound desc = could not find container \"7c59c1953749410133d07d8ea85c54cb8ba0101815152bdcba11a57ab52eb3cd\": container with ID starting with 7c59c1953749410133d07d8ea85c54cb8ba0101815152bdcba11a57ab52eb3cd not found: ID does not exist" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.473998 4763 scope.go:117] "RemoveContainer" containerID="b60e39bcc6fdaef0e30d4ad40d35e3db3e379c6e6cb7cb1112516bb502544853" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.474060 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 12:13:58 crc kubenswrapper[4763]: E1205 12:13:58.474291 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b60e39bcc6fdaef0e30d4ad40d35e3db3e379c6e6cb7cb1112516bb502544853\": container with ID starting with b60e39bcc6fdaef0e30d4ad40d35e3db3e379c6e6cb7cb1112516bb502544853 not found: ID does not exist" containerID="b60e39bcc6fdaef0e30d4ad40d35e3db3e379c6e6cb7cb1112516bb502544853" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.474334 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b60e39bcc6fdaef0e30d4ad40d35e3db3e379c6e6cb7cb1112516bb502544853"} err="failed to get container status \"b60e39bcc6fdaef0e30d4ad40d35e3db3e379c6e6cb7cb1112516bb502544853\": rpc error: code = NotFound desc = could not find container \"b60e39bcc6fdaef0e30d4ad40d35e3db3e379c6e6cb7cb1112516bb502544853\": container with ID starting with b60e39bcc6fdaef0e30d4ad40d35e3db3e379c6e6cb7cb1112516bb502544853 not found: ID does not exist" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.474385 4763 scope.go:117] "RemoveContainer" 
containerID="e590e6a402372c5efbcaaeb6e0d7cafac1e13024d153df9b04e2a779b877bef4" Dec 05 12:13:58 crc kubenswrapper[4763]: E1205 12:13:58.474922 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e590e6a402372c5efbcaaeb6e0d7cafac1e13024d153df9b04e2a779b877bef4\": container with ID starting with e590e6a402372c5efbcaaeb6e0d7cafac1e13024d153df9b04e2a779b877bef4 not found: ID does not exist" containerID="e590e6a402372c5efbcaaeb6e0d7cafac1e13024d153df9b04e2a779b877bef4" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.474955 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e590e6a402372c5efbcaaeb6e0d7cafac1e13024d153df9b04e2a779b877bef4"} err="failed to get container status \"e590e6a402372c5efbcaaeb6e0d7cafac1e13024d153df9b04e2a779b877bef4\": rpc error: code = NotFound desc = could not find container \"e590e6a402372c5efbcaaeb6e0d7cafac1e13024d153df9b04e2a779b877bef4\": container with ID starting with e590e6a402372c5efbcaaeb6e0d7cafac1e13024d153df9b04e2a779b877bef4 not found: ID does not exist" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.474975 4763 scope.go:117] "RemoveContainer" containerID="a6d6aef1f2859e2fa062d3ea2ad9ed014fe71beeda34557c6f7694d8291c17f1" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.486533 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.496800 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 12:13:58 crc kubenswrapper[4763]: E1205 12:13:58.497486 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ab9075b-3dbf-451f-bffb-7f76f79bab6c" containerName="registry-server" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.497504 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ab9075b-3dbf-451f-bffb-7f76f79bab6c" containerName="registry-server" Dec 05 12:13:58 crc kubenswrapper[4763]: E1205 12:13:58.497529 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ab9075b-3dbf-451f-bffb-7f76f79bab6c" containerName="extract-utilities" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.497556 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ab9075b-3dbf-451f-bffb-7f76f79bab6c" containerName="extract-utilities" Dec 05 12:13:58 crc kubenswrapper[4763]: E1205 12:13:58.497571 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ab9075b-3dbf-451f-bffb-7f76f79bab6c" containerName="extract-content" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.497577 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ab9075b-3dbf-451f-bffb-7f76f79bab6c" containerName="extract-content" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.497877 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ab9075b-3dbf-451f-bffb-7f76f79bab6c" containerName="registry-server" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.499322 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.502228 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.502346 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.502714 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.502878 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.503023 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.503123 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-db6cw" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.503564 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.514287 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.516453 4763 scope.go:117] "RemoveContainer" containerID="a066dfaf62c7d9eb5aa71115e560824faff700d177e6a7a635728162a869c5e8" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.627103 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0c59087a-448f-41c2-a85b-6ccd0ddbecc1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0c59087a-448f-41c2-a85b-6ccd0ddbecc1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.627199 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0c59087a-448f-41c2-a85b-6ccd0ddbecc1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.627228 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0c59087a-448f-41c2-a85b-6ccd0ddbecc1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0c59087a-448f-41c2-a85b-6ccd0ddbecc1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.627263 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0c59087a-448f-41c2-a85b-6ccd0ddbecc1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0c59087a-448f-41c2-a85b-6ccd0ddbecc1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.627311 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0c59087a-448f-41c2-a85b-6ccd0ddbecc1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0c59087a-448f-41c2-a85b-6ccd0ddbecc1\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.627342 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0c59087a-448f-41c2-a85b-6ccd0ddbecc1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0c59087a-448f-41c2-a85b-6ccd0ddbecc1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.627363 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0c59087a-448f-41c2-a85b-6ccd0ddbecc1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0c59087a-448f-41c2-a85b-6ccd0ddbecc1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.627387 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0c59087a-448f-41c2-a85b-6ccd0ddbecc1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0c59087a-448f-41c2-a85b-6ccd0ddbecc1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.627415 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0c59087a-448f-41c2-a85b-6ccd0ddbecc1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0c59087a-448f-41c2-a85b-6ccd0ddbecc1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.627453 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0c59087a-448f-41c2-a85b-6ccd0ddbecc1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0c59087a-448f-41c2-a85b-6ccd0ddbecc1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.627470 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p7lq\" (UniqueName: \"kubernetes.io/projected/0c59087a-448f-41c2-a85b-6ccd0ddbecc1-kube-api-access-7p7lq\") pod \"rabbitmq-cell1-server-0\" (UID: \"0c59087a-448f-41c2-a85b-6ccd0ddbecc1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.705477 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.729474 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0c59087a-448f-41c2-a85b-6ccd0ddbecc1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0c59087a-448f-41c2-a85b-6ccd0ddbecc1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.729511 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0c59087a-448f-41c2-a85b-6ccd0ddbecc1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0c59087a-448f-41c2-a85b-6ccd0ddbecc1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.729529 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/0c59087a-448f-41c2-a85b-6ccd0ddbecc1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0c59087a-448f-41c2-a85b-6ccd0ddbecc1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.729551 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0c59087a-448f-41c2-a85b-6ccd0ddbecc1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0c59087a-448f-41c2-a85b-6ccd0ddbecc1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.729586 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0c59087a-448f-41c2-a85b-6ccd0ddbecc1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0c59087a-448f-41c2-a85b-6ccd0ddbecc1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.729600 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p7lq\" (UniqueName: \"kubernetes.io/projected/0c59087a-448f-41c2-a85b-6ccd0ddbecc1-kube-api-access-7p7lq\") pod \"rabbitmq-cell1-server-0\" (UID: \"0c59087a-448f-41c2-a85b-6ccd0ddbecc1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.729658 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0c59087a-448f-41c2-a85b-6ccd0ddbecc1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0c59087a-448f-41c2-a85b-6ccd0ddbecc1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.729715 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0c59087a-448f-41c2-a85b-6ccd0ddbecc1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.729742 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0c59087a-448f-41c2-a85b-6ccd0ddbecc1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0c59087a-448f-41c2-a85b-6ccd0ddbecc1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.729805 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0c59087a-448f-41c2-a85b-6ccd0ddbecc1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0c59087a-448f-41c2-a85b-6ccd0ddbecc1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.729846 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0c59087a-448f-41c2-a85b-6ccd0ddbecc1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0c59087a-448f-41c2-a85b-6ccd0ddbecc1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.730430 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0c59087a-448f-41c2-a85b-6ccd0ddbecc1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"0c59087a-448f-41c2-a85b-6ccd0ddbecc1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.730730 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0c59087a-448f-41c2-a85b-6ccd0ddbecc1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0c59087a-448f-41c2-a85b-6ccd0ddbecc1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.730889 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0c59087a-448f-41c2-a85b-6ccd0ddbecc1\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.731248 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0c59087a-448f-41c2-a85b-6ccd0ddbecc1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0c59087a-448f-41c2-a85b-6ccd0ddbecc1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.731396 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0c59087a-448f-41c2-a85b-6ccd0ddbecc1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0c59087a-448f-41c2-a85b-6ccd0ddbecc1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.732978 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0c59087a-448f-41c2-a85b-6ccd0ddbecc1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0c59087a-448f-41c2-a85b-6ccd0ddbecc1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.735508 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0c59087a-448f-41c2-a85b-6ccd0ddbecc1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0c59087a-448f-41c2-a85b-6ccd0ddbecc1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.737486 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0c59087a-448f-41c2-a85b-6ccd0ddbecc1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0c59087a-448f-41c2-a85b-6ccd0ddbecc1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.738816 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0c59087a-448f-41c2-a85b-6ccd0ddbecc1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0c59087a-448f-41c2-a85b-6ccd0ddbecc1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.739244 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0c59087a-448f-41c2-a85b-6ccd0ddbecc1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0c59087a-448f-41c2-a85b-6ccd0ddbecc1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.752485 4763 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-7p7lq\" (UniqueName: \"kubernetes.io/projected/0c59087a-448f-41c2-a85b-6ccd0ddbecc1-kube-api-access-7p7lq\") pod \"rabbitmq-cell1-server-0\" (UID: \"0c59087a-448f-41c2-a85b-6ccd0ddbecc1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.777656 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0c59087a-448f-41c2-a85b-6ccd0ddbecc1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:13:58 crc kubenswrapper[4763]: I1205 12:13:58.837973 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:13:59 crc kubenswrapper[4763]: I1205 12:13:59.327016 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 12:13:59 crc kubenswrapper[4763]: I1205 12:13:59.354795 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c2041e23-d29c-4a1a-9787-aa0e19c9f764","Type":"ContainerStarted","Data":"cc7462753ed1ed57f6aa654446eb5b91c120d1d722991e114009878f4ec6e3b5"} Dec 05 12:13:59 crc kubenswrapper[4763]: I1205 12:13:59.356936 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0c59087a-448f-41c2-a85b-6ccd0ddbecc1","Type":"ContainerStarted","Data":"dbcda44849581fdde86c9e12df102322fc9de1ec308cbf37f423963ada4340c1"} Dec 05 12:13:59 crc kubenswrapper[4763]: I1205 12:13:59.803599 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85c70640-8bf7-419d-a96f-69ac3278710c" path="/var/lib/kubelet/pods/85c70640-8bf7-419d-a96f-69ac3278710c/volumes" Dec 05 12:13:59 crc kubenswrapper[4763]: I1205 12:13:59.804633 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ab9075b-3dbf-451f-bffb-7f76f79bab6c" path="/var/lib/kubelet/pods/8ab9075b-3dbf-451f-bffb-7f76f79bab6c/volumes" Dec 05 12:14:01 crc kubenswrapper[4763]: I1205 12:14:01.275789 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d558885bc-m2pwl"] Dec 05 12:14:01 crc kubenswrapper[4763]: I1205 12:14:01.280253 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-m2pwl" Dec 05 12:14:01 crc kubenswrapper[4763]: I1205 12:14:01.284442 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 05 12:14:01 crc kubenswrapper[4763]: I1205 12:14:01.291036 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-m2pwl"] Dec 05 12:14:01 crc kubenswrapper[4763]: I1205 12:14:01.374702 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0c59087a-448f-41c2-a85b-6ccd0ddbecc1","Type":"ContainerStarted","Data":"0b7a3c867b54e7a96fdf3df72dc3b131df4361b5dac542c569a1b2ef6c7a5be4"} Dec 05 12:14:01 crc kubenswrapper[4763]: I1205 12:14:01.376675 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c2041e23-d29c-4a1a-9787-aa0e19c9f764","Type":"ContainerStarted","Data":"b38f6af3c86ef860d12e1fc0f5ac44d693cd73614f4c6c926ccd03a799d9b90f"} Dec 05 12:14:01 crc kubenswrapper[4763]: I1205 12:14:01.398520 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8eb72005-8ed5-4b96-aee9-65f6991620ea-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-m2pwl\" (UID: \"8eb72005-8ed5-4b96-aee9-65f6991620ea\") " pod="openstack/dnsmasq-dns-d558885bc-m2pwl" Dec 05 12:14:01 crc kubenswrapper[4763]: I1205 12:14:01.398602 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmppz\" (UniqueName: \"kubernetes.io/projected/8eb72005-8ed5-4b96-aee9-65f6991620ea-kube-api-access-nmppz\") pod \"dnsmasq-dns-d558885bc-m2pwl\" (UID: \"8eb72005-8ed5-4b96-aee9-65f6991620ea\") " pod="openstack/dnsmasq-dns-d558885bc-m2pwl" Dec 05 12:14:01 crc kubenswrapper[4763]: I1205 12:14:01.398630 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8eb72005-8ed5-4b96-aee9-65f6991620ea-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-m2pwl\" (UID: \"8eb72005-8ed5-4b96-aee9-65f6991620ea\") " pod="openstack/dnsmasq-dns-d558885bc-m2pwl" Dec 05 12:14:01 crc kubenswrapper[4763]: I1205 12:14:01.398656 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8eb72005-8ed5-4b96-aee9-65f6991620ea-config\") pod \"dnsmasq-dns-d558885bc-m2pwl\" (UID: \"8eb72005-8ed5-4b96-aee9-65f6991620ea\") " pod="openstack/dnsmasq-dns-d558885bc-m2pwl" Dec 05 12:14:01 crc kubenswrapper[4763]: I1205 12:14:01.398681 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8eb72005-8ed5-4b96-aee9-65f6991620ea-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-m2pwl\" (UID: \"8eb72005-8ed5-4b96-aee9-65f6991620ea\") " pod="openstack/dnsmasq-dns-d558885bc-m2pwl" Dec 05 12:14:01 crc kubenswrapper[4763]: I1205 12:14:01.398734 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8eb72005-8ed5-4b96-aee9-65f6991620ea-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-m2pwl\" (UID: \"8eb72005-8ed5-4b96-aee9-65f6991620ea\") " pod="openstack/dnsmasq-dns-d558885bc-m2pwl" Dec 05 12:14:01 crc kubenswrapper[4763]: I1205 12:14:01.398798 4763 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8eb72005-8ed5-4b96-aee9-65f6991620ea-dns-svc\") pod \"dnsmasq-dns-d558885bc-m2pwl\" (UID: \"8eb72005-8ed5-4b96-aee9-65f6991620ea\") " pod="openstack/dnsmasq-dns-d558885bc-m2pwl" Dec 05 12:14:01 crc kubenswrapper[4763]: I1205 12:14:01.500808 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmppz\" (UniqueName: \"kubernetes.io/projected/8eb72005-8ed5-4b96-aee9-65f6991620ea-kube-api-access-nmppz\") pod \"dnsmasq-dns-d558885bc-m2pwl\" (UID: \"8eb72005-8ed5-4b96-aee9-65f6991620ea\") " pod="openstack/dnsmasq-dns-d558885bc-m2pwl" Dec 05 12:14:01 crc kubenswrapper[4763]: I1205 12:14:01.501068 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8eb72005-8ed5-4b96-aee9-65f6991620ea-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-m2pwl\" (UID: \"8eb72005-8ed5-4b96-aee9-65f6991620ea\") " pod="openstack/dnsmasq-dns-d558885bc-m2pwl" Dec 05 12:14:01 crc kubenswrapper[4763]: I1205 12:14:01.501167 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8eb72005-8ed5-4b96-aee9-65f6991620ea-config\") pod \"dnsmasq-dns-d558885bc-m2pwl\" (UID: \"8eb72005-8ed5-4b96-aee9-65f6991620ea\") " pod="openstack/dnsmasq-dns-d558885bc-m2pwl" Dec 05 12:14:01 crc kubenswrapper[4763]: I1205 12:14:01.501246 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8eb72005-8ed5-4b96-aee9-65f6991620ea-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-m2pwl\" (UID: \"8eb72005-8ed5-4b96-aee9-65f6991620ea\") " pod="openstack/dnsmasq-dns-d558885bc-m2pwl" Dec 05 12:14:01 crc kubenswrapper[4763]: I1205 12:14:01.501499 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8eb72005-8ed5-4b96-aee9-65f6991620ea-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-m2pwl\" (UID: \"8eb72005-8ed5-4b96-aee9-65f6991620ea\") " pod="openstack/dnsmasq-dns-d558885bc-m2pwl" Dec 05 12:14:01 crc kubenswrapper[4763]: I1205 12:14:01.501681 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8eb72005-8ed5-4b96-aee9-65f6991620ea-dns-svc\") pod \"dnsmasq-dns-d558885bc-m2pwl\" (UID: \"8eb72005-8ed5-4b96-aee9-65f6991620ea\") " pod="openstack/dnsmasq-dns-d558885bc-m2pwl" Dec 05 12:14:01 crc kubenswrapper[4763]: I1205 12:14:01.501962 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8eb72005-8ed5-4b96-aee9-65f6991620ea-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-m2pwl\" (UID: \"8eb72005-8ed5-4b96-aee9-65f6991620ea\") " pod="openstack/dnsmasq-dns-d558885bc-m2pwl" Dec 05 12:14:01 crc kubenswrapper[4763]: I1205 12:14:01.502622 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8eb72005-8ed5-4b96-aee9-65f6991620ea-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-m2pwl\" (UID: \"8eb72005-8ed5-4b96-aee9-65f6991620ea\") " pod="openstack/dnsmasq-dns-d558885bc-m2pwl" Dec 05 12:14:01 crc kubenswrapper[4763]: I1205 12:14:01.502658 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/8eb72005-8ed5-4b96-aee9-65f6991620ea-config\") pod \"dnsmasq-dns-d558885bc-m2pwl\" (UID: \"8eb72005-8ed5-4b96-aee9-65f6991620ea\") " pod="openstack/dnsmasq-dns-d558885bc-m2pwl" Dec 05 12:14:01 crc kubenswrapper[4763]: I1205 12:14:01.502686 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8eb72005-8ed5-4b96-aee9-65f6991620ea-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-m2pwl\" (UID: \"8eb72005-8ed5-4b96-aee9-65f6991620ea\") " pod="openstack/dnsmasq-dns-d558885bc-m2pwl" Dec 05 12:14:01 crc kubenswrapper[4763]: I1205 12:14:01.502796 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8eb72005-8ed5-4b96-aee9-65f6991620ea-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-m2pwl\" (UID: \"8eb72005-8ed5-4b96-aee9-65f6991620ea\") " pod="openstack/dnsmasq-dns-d558885bc-m2pwl" Dec 05 12:14:01 crc kubenswrapper[4763]: I1205 12:14:01.502933 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8eb72005-8ed5-4b96-aee9-65f6991620ea-dns-svc\") pod \"dnsmasq-dns-d558885bc-m2pwl\" (UID: \"8eb72005-8ed5-4b96-aee9-65f6991620ea\") " pod="openstack/dnsmasq-dns-d558885bc-m2pwl" Dec 05 12:14:01 crc kubenswrapper[4763]: I1205 12:14:01.502994 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8eb72005-8ed5-4b96-aee9-65f6991620ea-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-m2pwl\" (UID: \"8eb72005-8ed5-4b96-aee9-65f6991620ea\") " pod="openstack/dnsmasq-dns-d558885bc-m2pwl" Dec 05 12:14:01 crc kubenswrapper[4763]: I1205 12:14:01.520281 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmppz\" (UniqueName: \"kubernetes.io/projected/8eb72005-8ed5-4b96-aee9-65f6991620ea-kube-api-access-nmppz\") pod \"dnsmasq-dns-d558885bc-m2pwl\" (UID: \"8eb72005-8ed5-4b96-aee9-65f6991620ea\") " pod="openstack/dnsmasq-dns-d558885bc-m2pwl" Dec 05 12:14:01 crc kubenswrapper[4763]: I1205 12:14:01.606618 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-m2pwl" Dec 05 12:14:02 crc kubenswrapper[4763]: W1205 12:14:02.107306 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8eb72005_8ed5_4b96_aee9_65f6991620ea.slice/crio-1d7828f15a2d80a8aae0d6cacf0398f6cec65d19917be949f84f6fe9cf439d39 WatchSource:0}: Error finding container 1d7828f15a2d80a8aae0d6cacf0398f6cec65d19917be949f84f6fe9cf439d39: Status 404 returned error can't find the container with id 1d7828f15a2d80a8aae0d6cacf0398f6cec65d19917be949f84f6fe9cf439d39 Dec 05 12:14:02 crc kubenswrapper[4763]: I1205 12:14:02.121879 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-m2pwl"] Dec 05 12:14:02 crc kubenswrapper[4763]: I1205 12:14:02.398982 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-m2pwl" event={"ID":"8eb72005-8ed5-4b96-aee9-65f6991620ea","Type":"ContainerStarted","Data":"1d7828f15a2d80a8aae0d6cacf0398f6cec65d19917be949f84f6fe9cf439d39"} Dec 05 12:14:03 crc kubenswrapper[4763]: I1205 12:14:03.410069 4763 generic.go:334] "Generic (PLEG): container finished" podID="8eb72005-8ed5-4b96-aee9-65f6991620ea" containerID="0281a0db50e37b8fb0c94c9d0cdbb071965471e7e959125b74a9fa57f5f1616a" exitCode=0 Dec 05 12:14:03 crc kubenswrapper[4763]: I1205 12:14:03.410182 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-m2pwl" event={"ID":"8eb72005-8ed5-4b96-aee9-65f6991620ea","Type":"ContainerDied","Data":"0281a0db50e37b8fb0c94c9d0cdbb071965471e7e959125b74a9fa57f5f1616a"} Dec 05 12:14:04 crc kubenswrapper[4763]: I1205 12:14:04.421439 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-m2pwl" event={"ID":"8eb72005-8ed5-4b96-aee9-65f6991620ea","Type":"ContainerStarted","Data":"e0c66f866788c615f5fdc63c8dfe43e940715c25f149e12bf9a43d5d2b4d4711"} Dec 05 12:14:04 crc kubenswrapper[4763]: I1205 12:14:04.421722 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d558885bc-m2pwl" Dec 05 12:14:04 crc kubenswrapper[4763]: I1205 12:14:04.450540 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d558885bc-m2pwl" podStartSLOduration=3.450510343 podStartE2EDuration="3.450510343s" podCreationTimestamp="2025-12-05 12:14:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:14:04.439900142 +0000 UTC m=+1528.932614875" watchObservedRunningTime="2025-12-05 12:14:04.450510343 +0000 UTC m=+1528.943225076" Dec 05 12:14:11 crc kubenswrapper[4763]: I1205 12:14:11.608597 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d558885bc-m2pwl" Dec 05 12:14:11 crc kubenswrapper[4763]: I1205 12:14:11.760185 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-mg9lr"] Dec 05 12:14:11 crc kubenswrapper[4763]: I1205 12:14:11.760750 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cd5cbd7b9-mg9lr" podUID="c7f9c67e-8dbf-4604-9407-ba5199add7e2" containerName="dnsmasq-dns" containerID="cri-o://28aeb4201c45082678488c5377a01647a0e08013901f0158bd0fe9ca3e689a51" gracePeriod=10 Dec 05 12:14:11 crc kubenswrapper[4763]: I1205 12:14:11.924757 4763 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-6b6dc74c5-qtlzv"] Dec 05 12:14:11 crc kubenswrapper[4763]: I1205 12:14:11.937305 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b6dc74c5-qtlzv" Dec 05 12:14:11 crc kubenswrapper[4763]: I1205 12:14:11.979822 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b6dc74c5-qtlzv"] Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.110397 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dd604323-58e1-439a-b0a4-66ad626de5a6-dns-swift-storage-0\") pod \"dnsmasq-dns-6b6dc74c5-qtlzv\" (UID: \"dd604323-58e1-439a-b0a4-66ad626de5a6\") " pod="openstack/dnsmasq-dns-6b6dc74c5-qtlzv" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.110488 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd604323-58e1-439a-b0a4-66ad626de5a6-dns-svc\") pod \"dnsmasq-dns-6b6dc74c5-qtlzv\" (UID: \"dd604323-58e1-439a-b0a4-66ad626de5a6\") " pod="openstack/dnsmasq-dns-6b6dc74c5-qtlzv" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.110548 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd604323-58e1-439a-b0a4-66ad626de5a6-ovsdbserver-sb\") pod \"dnsmasq-dns-6b6dc74c5-qtlzv\" (UID: \"dd604323-58e1-439a-b0a4-66ad626de5a6\") " pod="openstack/dnsmasq-dns-6b6dc74c5-qtlzv" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.110733 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd604323-58e1-439a-b0a4-66ad626de5a6-config\") pod \"dnsmasq-dns-6b6dc74c5-qtlzv\" (UID: \"dd604323-58e1-439a-b0a4-66ad626de5a6\") " pod="openstack/dnsmasq-dns-6b6dc74c5-qtlzv" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.110888 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/dd604323-58e1-439a-b0a4-66ad626de5a6-openstack-edpm-ipam\") pod \"dnsmasq-dns-6b6dc74c5-qtlzv\" (UID: \"dd604323-58e1-439a-b0a4-66ad626de5a6\") " pod="openstack/dnsmasq-dns-6b6dc74c5-qtlzv" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.111020 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkjm4\" (UniqueName: \"kubernetes.io/projected/dd604323-58e1-439a-b0a4-66ad626de5a6-kube-api-access-jkjm4\") pod \"dnsmasq-dns-6b6dc74c5-qtlzv\" (UID: \"dd604323-58e1-439a-b0a4-66ad626de5a6\") " pod="openstack/dnsmasq-dns-6b6dc74c5-qtlzv" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.111091 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd604323-58e1-439a-b0a4-66ad626de5a6-ovsdbserver-nb\") pod \"dnsmasq-dns-6b6dc74c5-qtlzv\" (UID: \"dd604323-58e1-439a-b0a4-66ad626de5a6\") " pod="openstack/dnsmasq-dns-6b6dc74c5-qtlzv" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.212702 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd604323-58e1-439a-b0a4-66ad626de5a6-dns-svc\") pod \"dnsmasq-dns-6b6dc74c5-qtlzv\" (UID: 
\"dd604323-58e1-439a-b0a4-66ad626de5a6\") " pod="openstack/dnsmasq-dns-6b6dc74c5-qtlzv" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.213392 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd604323-58e1-439a-b0a4-66ad626de5a6-ovsdbserver-sb\") pod \"dnsmasq-dns-6b6dc74c5-qtlzv\" (UID: \"dd604323-58e1-439a-b0a4-66ad626de5a6\") " pod="openstack/dnsmasq-dns-6b6dc74c5-qtlzv" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.213621 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd604323-58e1-439a-b0a4-66ad626de5a6-config\") pod \"dnsmasq-dns-6b6dc74c5-qtlzv\" (UID: \"dd604323-58e1-439a-b0a4-66ad626de5a6\") " pod="openstack/dnsmasq-dns-6b6dc74c5-qtlzv" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.213680 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd604323-58e1-439a-b0a4-66ad626de5a6-dns-svc\") pod \"dnsmasq-dns-6b6dc74c5-qtlzv\" (UID: \"dd604323-58e1-439a-b0a4-66ad626de5a6\") " pod="openstack/dnsmasq-dns-6b6dc74c5-qtlzv" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.213860 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/dd604323-58e1-439a-b0a4-66ad626de5a6-openstack-edpm-ipam\") pod \"dnsmasq-dns-6b6dc74c5-qtlzv\" (UID: \"dd604323-58e1-439a-b0a4-66ad626de5a6\") " pod="openstack/dnsmasq-dns-6b6dc74c5-qtlzv" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.213922 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkjm4\" (UniqueName: \"kubernetes.io/projected/dd604323-58e1-439a-b0a4-66ad626de5a6-kube-api-access-jkjm4\") pod \"dnsmasq-dns-6b6dc74c5-qtlzv\" (UID: \"dd604323-58e1-439a-b0a4-66ad626de5a6\") " pod="openstack/dnsmasq-dns-6b6dc74c5-qtlzv" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.213964 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd604323-58e1-439a-b0a4-66ad626de5a6-ovsdbserver-nb\") pod \"dnsmasq-dns-6b6dc74c5-qtlzv\" (UID: \"dd604323-58e1-439a-b0a4-66ad626de5a6\") " pod="openstack/dnsmasq-dns-6b6dc74c5-qtlzv" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.214038 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dd604323-58e1-439a-b0a4-66ad626de5a6-dns-swift-storage-0\") pod \"dnsmasq-dns-6b6dc74c5-qtlzv\" (UID: \"dd604323-58e1-439a-b0a4-66ad626de5a6\") " pod="openstack/dnsmasq-dns-6b6dc74c5-qtlzv" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.214516 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd604323-58e1-439a-b0a4-66ad626de5a6-ovsdbserver-sb\") pod \"dnsmasq-dns-6b6dc74c5-qtlzv\" (UID: \"dd604323-58e1-439a-b0a4-66ad626de5a6\") " pod="openstack/dnsmasq-dns-6b6dc74c5-qtlzv" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.215043 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd604323-58e1-439a-b0a4-66ad626de5a6-config\") pod \"dnsmasq-dns-6b6dc74c5-qtlzv\" (UID: \"dd604323-58e1-439a-b0a4-66ad626de5a6\") " pod="openstack/dnsmasq-dns-6b6dc74c5-qtlzv" Dec 05 12:14:12 crc 
kubenswrapper[4763]: I1205 12:14:12.215234 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dd604323-58e1-439a-b0a4-66ad626de5a6-dns-swift-storage-0\") pod \"dnsmasq-dns-6b6dc74c5-qtlzv\" (UID: \"dd604323-58e1-439a-b0a4-66ad626de5a6\") " pod="openstack/dnsmasq-dns-6b6dc74c5-qtlzv" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.215472 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd604323-58e1-439a-b0a4-66ad626de5a6-ovsdbserver-nb\") pod \"dnsmasq-dns-6b6dc74c5-qtlzv\" (UID: \"dd604323-58e1-439a-b0a4-66ad626de5a6\") " pod="openstack/dnsmasq-dns-6b6dc74c5-qtlzv" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.216026 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/dd604323-58e1-439a-b0a4-66ad626de5a6-openstack-edpm-ipam\") pod \"dnsmasq-dns-6b6dc74c5-qtlzv\" (UID: \"dd604323-58e1-439a-b0a4-66ad626de5a6\") " pod="openstack/dnsmasq-dns-6b6dc74c5-qtlzv" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.264096 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkjm4\" (UniqueName: \"kubernetes.io/projected/dd604323-58e1-439a-b0a4-66ad626de5a6-kube-api-access-jkjm4\") pod \"dnsmasq-dns-6b6dc74c5-qtlzv\" (UID: \"dd604323-58e1-439a-b0a4-66ad626de5a6\") " pod="openstack/dnsmasq-dns-6b6dc74c5-qtlzv" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.315349 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b6dc74c5-qtlzv" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.458160 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-mg9lr" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.534216 4763 generic.go:334] "Generic (PLEG): container finished" podID="c7f9c67e-8dbf-4604-9407-ba5199add7e2" containerID="28aeb4201c45082678488c5377a01647a0e08013901f0158bd0fe9ca3e689a51" exitCode=0 Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.534417 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-mg9lr" event={"ID":"c7f9c67e-8dbf-4604-9407-ba5199add7e2","Type":"ContainerDied","Data":"28aeb4201c45082678488c5377a01647a0e08013901f0158bd0fe9ca3e689a51"} Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.534606 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-mg9lr" event={"ID":"c7f9c67e-8dbf-4604-9407-ba5199add7e2","Type":"ContainerDied","Data":"641d46d2499535761a09210beced70e169ba66f6214c6a4e4149fd56efe05984"} Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.534644 4763 scope.go:117] "RemoveContainer" containerID="28aeb4201c45082678488c5377a01647a0e08013901f0158bd0fe9ca3e689a51" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.534986 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-mg9lr" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.569992 4763 scope.go:117] "RemoveContainer" containerID="82897be6d18262c29fcdd8b02d568388c988635a35ba7ab874032fb7df8a9557" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.605946 4763 scope.go:117] "RemoveContainer" containerID="28aeb4201c45082678488c5377a01647a0e08013901f0158bd0fe9ca3e689a51" Dec 05 12:14:12 crc kubenswrapper[4763]: E1205 12:14:12.609264 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28aeb4201c45082678488c5377a01647a0e08013901f0158bd0fe9ca3e689a51\": container with ID starting with 28aeb4201c45082678488c5377a01647a0e08013901f0158bd0fe9ca3e689a51 not found: ID does not exist" containerID="28aeb4201c45082678488c5377a01647a0e08013901f0158bd0fe9ca3e689a51" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.609321 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28aeb4201c45082678488c5377a01647a0e08013901f0158bd0fe9ca3e689a51"} err="failed to get container status \"28aeb4201c45082678488c5377a01647a0e08013901f0158bd0fe9ca3e689a51\": rpc error: code = NotFound desc = could not find container \"28aeb4201c45082678488c5377a01647a0e08013901f0158bd0fe9ca3e689a51\": container with ID starting with 28aeb4201c45082678488c5377a01647a0e08013901f0158bd0fe9ca3e689a51 not found: ID does not exist" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.609347 4763 scope.go:117] "RemoveContainer" containerID="82897be6d18262c29fcdd8b02d568388c988635a35ba7ab874032fb7df8a9557" Dec 05 12:14:12 crc kubenswrapper[4763]: E1205 12:14:12.609874 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82897be6d18262c29fcdd8b02d568388c988635a35ba7ab874032fb7df8a9557\": container with ID starting with 82897be6d18262c29fcdd8b02d568388c988635a35ba7ab874032fb7df8a9557 not found: ID does not exist" containerID="82897be6d18262c29fcdd8b02d568388c988635a35ba7ab874032fb7df8a9557" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.609903 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82897be6d18262c29fcdd8b02d568388c988635a35ba7ab874032fb7df8a9557"} err="failed to get container status \"82897be6d18262c29fcdd8b02d568388c988635a35ba7ab874032fb7df8a9557\": rpc error: code = NotFound desc = could not find container \"82897be6d18262c29fcdd8b02d568388c988635a35ba7ab874032fb7df8a9557\": container with ID starting with 82897be6d18262c29fcdd8b02d568388c988635a35ba7ab874032fb7df8a9557 not found: ID does not exist" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.622671 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7f9c67e-8dbf-4604-9407-ba5199add7e2-dns-svc\") pod \"c7f9c67e-8dbf-4604-9407-ba5199add7e2\" (UID: \"c7f9c67e-8dbf-4604-9407-ba5199add7e2\") " Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.622870 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7f9c67e-8dbf-4604-9407-ba5199add7e2-ovsdbserver-nb\") pod \"c7f9c67e-8dbf-4604-9407-ba5199add7e2\" (UID: \"c7f9c67e-8dbf-4604-9407-ba5199add7e2\") " Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.622935 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7f9c67e-8dbf-4604-9407-ba5199add7e2-config\") pod \"c7f9c67e-8dbf-4604-9407-ba5199add7e2\" (UID: \"c7f9c67e-8dbf-4604-9407-ba5199add7e2\") " Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.623041 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c7f9c67e-8dbf-4604-9407-ba5199add7e2-dns-swift-storage-0\") pod \"c7f9c67e-8dbf-4604-9407-ba5199add7e2\" (UID: \"c7f9c67e-8dbf-4604-9407-ba5199add7e2\") " Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.623073 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9qzw\" (UniqueName: \"kubernetes.io/projected/c7f9c67e-8dbf-4604-9407-ba5199add7e2-kube-api-access-q9qzw\") pod \"c7f9c67e-8dbf-4604-9407-ba5199add7e2\" (UID: \"c7f9c67e-8dbf-4604-9407-ba5199add7e2\") " Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.623205 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7f9c67e-8dbf-4604-9407-ba5199add7e2-ovsdbserver-sb\") pod \"c7f9c67e-8dbf-4604-9407-ba5199add7e2\" (UID: \"c7f9c67e-8dbf-4604-9407-ba5199add7e2\") " Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.634113 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7f9c67e-8dbf-4604-9407-ba5199add7e2-kube-api-access-q9qzw" (OuterVolumeSpecName: "kube-api-access-q9qzw") pod "c7f9c67e-8dbf-4604-9407-ba5199add7e2" (UID: "c7f9c67e-8dbf-4604-9407-ba5199add7e2"). InnerVolumeSpecName "kube-api-access-q9qzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.696563 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7f9c67e-8dbf-4604-9407-ba5199add7e2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c7f9c67e-8dbf-4604-9407-ba5199add7e2" (UID: "c7f9c67e-8dbf-4604-9407-ba5199add7e2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.716545 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7f9c67e-8dbf-4604-9407-ba5199add7e2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c7f9c67e-8dbf-4604-9407-ba5199add7e2" (UID: "c7f9c67e-8dbf-4604-9407-ba5199add7e2"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.721785 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7f9c67e-8dbf-4604-9407-ba5199add7e2-config" (OuterVolumeSpecName: "config") pod "c7f9c67e-8dbf-4604-9407-ba5199add7e2" (UID: "c7f9c67e-8dbf-4604-9407-ba5199add7e2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.725683 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7f9c67e-8dbf-4604-9407-ba5199add7e2-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.725735 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7f9c67e-8dbf-4604-9407-ba5199add7e2-config\") on node \"crc\" DevicePath \"\"" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.725783 4763 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c7f9c67e-8dbf-4604-9407-ba5199add7e2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.725801 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9qzw\" (UniqueName: \"kubernetes.io/projected/c7f9c67e-8dbf-4604-9407-ba5199add7e2-kube-api-access-q9qzw\") on node \"crc\" DevicePath \"\"" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.733663 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7f9c67e-8dbf-4604-9407-ba5199add7e2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c7f9c67e-8dbf-4604-9407-ba5199add7e2" (UID: "c7f9c67e-8dbf-4604-9407-ba5199add7e2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.743064 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7f9c67e-8dbf-4604-9407-ba5199add7e2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c7f9c67e-8dbf-4604-9407-ba5199add7e2" (UID: "c7f9c67e-8dbf-4604-9407-ba5199add7e2"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.828396 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7f9c67e-8dbf-4604-9407-ba5199add7e2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.828444 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7f9c67e-8dbf-4604-9407-ba5199add7e2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 12:14:12 crc kubenswrapper[4763]: W1205 12:14:12.858406 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd604323_58e1_439a_b0a4_66ad626de5a6.slice/crio-7627031aae816c43d7f7e4a0424380f910a98ac3551f785c86396ca11fe3eaf0 WatchSource:0}: Error finding container 7627031aae816c43d7f7e4a0424380f910a98ac3551f785c86396ca11fe3eaf0: Status 404 returned error can't find the container with id 7627031aae816c43d7f7e4a0424380f910a98ac3551f785c86396ca11fe3eaf0 Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.862497 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b6dc74c5-qtlzv"] Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.885821 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-mg9lr"] Dec 05 12:14:12 crc kubenswrapper[4763]: I1205 12:14:12.897608 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-mg9lr"] Dec 05 12:14:13 crc kubenswrapper[4763]: I1205 12:14:13.546969 4763 generic.go:334] "Generic (PLEG): container finished" podID="dd604323-58e1-439a-b0a4-66ad626de5a6" containerID="fda874727f4feec6b5911a2a001de6f806020c88ede25486ad4983252c56f768" exitCode=0 Dec 05 12:14:13 crc kubenswrapper[4763]: I1205 12:14:13.547086 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b6dc74c5-qtlzv" event={"ID":"dd604323-58e1-439a-b0a4-66ad626de5a6","Type":"ContainerDied","Data":"fda874727f4feec6b5911a2a001de6f806020c88ede25486ad4983252c56f768"} Dec 05 12:14:13 crc kubenswrapper[4763]: I1205 12:14:13.547224 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b6dc74c5-qtlzv" event={"ID":"dd604323-58e1-439a-b0a4-66ad626de5a6","Type":"ContainerStarted","Data":"7627031aae816c43d7f7e4a0424380f910a98ac3551f785c86396ca11fe3eaf0"} Dec 05 12:14:13 crc kubenswrapper[4763]: I1205 12:14:13.798420 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7f9c67e-8dbf-4604-9407-ba5199add7e2" path="/var/lib/kubelet/pods/c7f9c67e-8dbf-4604-9407-ba5199add7e2/volumes" Dec 05 12:14:14 crc kubenswrapper[4763]: I1205 12:14:14.563403 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b6dc74c5-qtlzv" event={"ID":"dd604323-58e1-439a-b0a4-66ad626de5a6","Type":"ContainerStarted","Data":"23fa2497a4200bc3cee037db1c7167f8269a7b81cd8f2450205b6c679d354a3f"} Dec 05 12:14:14 crc kubenswrapper[4763]: I1205 12:14:14.563952 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b6dc74c5-qtlzv" Dec 05 12:14:14 crc kubenswrapper[4763]: I1205 12:14:14.596560 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b6dc74c5-qtlzv" podStartSLOduration=3.596532803 podStartE2EDuration="3.596532803s" podCreationTimestamp="2025-12-05 12:14:11 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:14:14.589310441 +0000 UTC m=+1539.082025164" watchObservedRunningTime="2025-12-05 12:14:14.596532803 +0000 UTC m=+1539.089247526" Dec 05 12:14:22 crc kubenswrapper[4763]: I1205 12:14:22.317836 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b6dc74c5-qtlzv" Dec 05 12:14:22 crc kubenswrapper[4763]: I1205 12:14:22.384639 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-m2pwl"] Dec 05 12:14:22 crc kubenswrapper[4763]: I1205 12:14:22.384938 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d558885bc-m2pwl" podUID="8eb72005-8ed5-4b96-aee9-65f6991620ea" containerName="dnsmasq-dns" containerID="cri-o://e0c66f866788c615f5fdc63c8dfe43e940715c25f149e12bf9a43d5d2b4d4711" gracePeriod=10 Dec 05 12:14:22 crc kubenswrapper[4763]: I1205 12:14:22.646408 4763 generic.go:334] "Generic (PLEG): container finished" podID="8eb72005-8ed5-4b96-aee9-65f6991620ea" containerID="e0c66f866788c615f5fdc63c8dfe43e940715c25f149e12bf9a43d5d2b4d4711" exitCode=0 Dec 05 12:14:22 crc kubenswrapper[4763]: I1205 12:14:22.646458 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-m2pwl" event={"ID":"8eb72005-8ed5-4b96-aee9-65f6991620ea","Type":"ContainerDied","Data":"e0c66f866788c615f5fdc63c8dfe43e940715c25f149e12bf9a43d5d2b4d4711"} Dec 05 12:14:23 crc kubenswrapper[4763]: I1205 12:14:23.335165 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-m2pwl" Dec 05 12:14:23 crc kubenswrapper[4763]: I1205 12:14:23.441089 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8eb72005-8ed5-4b96-aee9-65f6991620ea-dns-swift-storage-0\") pod \"8eb72005-8ed5-4b96-aee9-65f6991620ea\" (UID: \"8eb72005-8ed5-4b96-aee9-65f6991620ea\") " Dec 05 12:14:23 crc kubenswrapper[4763]: I1205 12:14:23.441233 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8eb72005-8ed5-4b96-aee9-65f6991620ea-openstack-edpm-ipam\") pod \"8eb72005-8ed5-4b96-aee9-65f6991620ea\" (UID: \"8eb72005-8ed5-4b96-aee9-65f6991620ea\") " Dec 05 12:14:23 crc kubenswrapper[4763]: I1205 12:14:23.441294 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8eb72005-8ed5-4b96-aee9-65f6991620ea-config\") pod \"8eb72005-8ed5-4b96-aee9-65f6991620ea\" (UID: \"8eb72005-8ed5-4b96-aee9-65f6991620ea\") " Dec 05 12:14:23 crc kubenswrapper[4763]: I1205 12:14:23.441431 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmppz\" (UniqueName: \"kubernetes.io/projected/8eb72005-8ed5-4b96-aee9-65f6991620ea-kube-api-access-nmppz\") pod \"8eb72005-8ed5-4b96-aee9-65f6991620ea\" (UID: \"8eb72005-8ed5-4b96-aee9-65f6991620ea\") " Dec 05 12:14:23 crc kubenswrapper[4763]: I1205 12:14:23.442055 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8eb72005-8ed5-4b96-aee9-65f6991620ea-ovsdbserver-sb\") pod \"8eb72005-8ed5-4b96-aee9-65f6991620ea\" (UID: \"8eb72005-8ed5-4b96-aee9-65f6991620ea\") " Dec 05 12:14:23 crc 
kubenswrapper[4763]: I1205 12:14:23.442089 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8eb72005-8ed5-4b96-aee9-65f6991620ea-dns-svc\") pod \"8eb72005-8ed5-4b96-aee9-65f6991620ea\" (UID: \"8eb72005-8ed5-4b96-aee9-65f6991620ea\") " Dec 05 12:14:23 crc kubenswrapper[4763]: I1205 12:14:23.442128 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8eb72005-8ed5-4b96-aee9-65f6991620ea-ovsdbserver-nb\") pod \"8eb72005-8ed5-4b96-aee9-65f6991620ea\" (UID: \"8eb72005-8ed5-4b96-aee9-65f6991620ea\") " Dec 05 12:14:23 crc kubenswrapper[4763]: I1205 12:14:23.446810 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8eb72005-8ed5-4b96-aee9-65f6991620ea-kube-api-access-nmppz" (OuterVolumeSpecName: "kube-api-access-nmppz") pod "8eb72005-8ed5-4b96-aee9-65f6991620ea" (UID: "8eb72005-8ed5-4b96-aee9-65f6991620ea"). InnerVolumeSpecName "kube-api-access-nmppz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:14:23 crc kubenswrapper[4763]: I1205 12:14:23.495267 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8eb72005-8ed5-4b96-aee9-65f6991620ea-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8eb72005-8ed5-4b96-aee9-65f6991620ea" (UID: "8eb72005-8ed5-4b96-aee9-65f6991620ea"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:14:23 crc kubenswrapper[4763]: I1205 12:14:23.499517 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8eb72005-8ed5-4b96-aee9-65f6991620ea-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8eb72005-8ed5-4b96-aee9-65f6991620ea" (UID: "8eb72005-8ed5-4b96-aee9-65f6991620ea"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:14:23 crc kubenswrapper[4763]: I1205 12:14:23.500601 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8eb72005-8ed5-4b96-aee9-65f6991620ea-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8eb72005-8ed5-4b96-aee9-65f6991620ea" (UID: "8eb72005-8ed5-4b96-aee9-65f6991620ea"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:14:23 crc kubenswrapper[4763]: I1205 12:14:23.504881 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8eb72005-8ed5-4b96-aee9-65f6991620ea-config" (OuterVolumeSpecName: "config") pod "8eb72005-8ed5-4b96-aee9-65f6991620ea" (UID: "8eb72005-8ed5-4b96-aee9-65f6991620ea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:14:23 crc kubenswrapper[4763]: I1205 12:14:23.505815 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8eb72005-8ed5-4b96-aee9-65f6991620ea-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "8eb72005-8ed5-4b96-aee9-65f6991620ea" (UID: "8eb72005-8ed5-4b96-aee9-65f6991620ea"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:14:23 crc kubenswrapper[4763]: I1205 12:14:23.518662 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8eb72005-8ed5-4b96-aee9-65f6991620ea-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8eb72005-8ed5-4b96-aee9-65f6991620ea" (UID: "8eb72005-8ed5-4b96-aee9-65f6991620ea"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:14:23 crc kubenswrapper[4763]: I1205 12:14:23.544634 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8eb72005-8ed5-4b96-aee9-65f6991620ea-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 12:14:23 crc kubenswrapper[4763]: I1205 12:14:23.544668 4763 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8eb72005-8ed5-4b96-aee9-65f6991620ea-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 12:14:23 crc kubenswrapper[4763]: I1205 12:14:23.544679 4763 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8eb72005-8ed5-4b96-aee9-65f6991620ea-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 05 12:14:23 crc kubenswrapper[4763]: I1205 12:14:23.544689 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8eb72005-8ed5-4b96-aee9-65f6991620ea-config\") on node \"crc\" DevicePath \"\"" Dec 05 12:14:23 crc kubenswrapper[4763]: I1205 12:14:23.544699 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmppz\" (UniqueName: \"kubernetes.io/projected/8eb72005-8ed5-4b96-aee9-65f6991620ea-kube-api-access-nmppz\") on node \"crc\" DevicePath \"\"" Dec 05 12:14:23 crc kubenswrapper[4763]: I1205 12:14:23.544708 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8eb72005-8ed5-4b96-aee9-65f6991620ea-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 12:14:23 crc kubenswrapper[4763]: I1205 12:14:23.544716 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8eb72005-8ed5-4b96-aee9-65f6991620ea-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 12:14:23 crc kubenswrapper[4763]: I1205 12:14:23.678306 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-m2pwl" event={"ID":"8eb72005-8ed5-4b96-aee9-65f6991620ea","Type":"ContainerDied","Data":"1d7828f15a2d80a8aae0d6cacf0398f6cec65d19917be949f84f6fe9cf439d39"} Dec 05 12:14:23 crc kubenswrapper[4763]: I1205 12:14:23.678374 4763 scope.go:117] "RemoveContainer" containerID="e0c66f866788c615f5fdc63c8dfe43e940715c25f149e12bf9a43d5d2b4d4711" Dec 05 12:14:23 crc kubenswrapper[4763]: I1205 12:14:23.684206 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-m2pwl" Dec 05 12:14:23 crc kubenswrapper[4763]: I1205 12:14:23.737573 4763 scope.go:117] "RemoveContainer" containerID="0281a0db50e37b8fb0c94c9d0cdbb071965471e7e959125b74a9fa57f5f1616a" Dec 05 12:14:23 crc kubenswrapper[4763]: I1205 12:14:23.815888 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-m2pwl"] Dec 05 12:14:23 crc kubenswrapper[4763]: I1205 12:14:23.827733 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-m2pwl"] Dec 05 12:14:25 crc kubenswrapper[4763]: I1205 12:14:25.794640 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8eb72005-8ed5-4b96-aee9-65f6991620ea" path="/var/lib/kubelet/pods/8eb72005-8ed5-4b96-aee9-65f6991620ea/volumes" Dec 05 12:14:27 crc kubenswrapper[4763]: I1205 12:14:27.630118 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6dl6d"] Dec 05 12:14:27 crc kubenswrapper[4763]: E1205 12:14:27.630932 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eb72005-8ed5-4b96-aee9-65f6991620ea" containerName="init" Dec 05 12:14:27 crc kubenswrapper[4763]: I1205 12:14:27.630949 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eb72005-8ed5-4b96-aee9-65f6991620ea" containerName="init" Dec 05 12:14:27 crc kubenswrapper[4763]: E1205 12:14:27.630996 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7f9c67e-8dbf-4604-9407-ba5199add7e2" containerName="init" Dec 05 12:14:27 crc kubenswrapper[4763]: I1205 12:14:27.631005 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7f9c67e-8dbf-4604-9407-ba5199add7e2" containerName="init" Dec 05 12:14:27 crc kubenswrapper[4763]: E1205 12:14:27.631015 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7f9c67e-8dbf-4604-9407-ba5199add7e2" containerName="dnsmasq-dns" Dec 05 12:14:27 crc kubenswrapper[4763]: I1205 12:14:27.631023 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7f9c67e-8dbf-4604-9407-ba5199add7e2" containerName="dnsmasq-dns" Dec 05 12:14:27 crc kubenswrapper[4763]: E1205 12:14:27.631044 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eb72005-8ed5-4b96-aee9-65f6991620ea" containerName="dnsmasq-dns" Dec 05 12:14:27 crc kubenswrapper[4763]: I1205 12:14:27.631052 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eb72005-8ed5-4b96-aee9-65f6991620ea" containerName="dnsmasq-dns" Dec 05 12:14:27 crc kubenswrapper[4763]: I1205 12:14:27.631273 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eb72005-8ed5-4b96-aee9-65f6991620ea" containerName="dnsmasq-dns" Dec 05 12:14:27 crc kubenswrapper[4763]: I1205 12:14:27.631330 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7f9c67e-8dbf-4604-9407-ba5199add7e2" containerName="dnsmasq-dns" Dec 05 12:14:27 crc kubenswrapper[4763]: I1205 12:14:27.633225 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6dl6d" Dec 05 12:14:27 crc kubenswrapper[4763]: I1205 12:14:27.641919 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6dl6d"] Dec 05 12:14:27 crc kubenswrapper[4763]: I1205 12:14:27.735682 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a719aa5-6044-4da3-89dc-6d25ee382b3e-utilities\") pod \"redhat-marketplace-6dl6d\" (UID: \"8a719aa5-6044-4da3-89dc-6d25ee382b3e\") " pod="openshift-marketplace/redhat-marketplace-6dl6d" Dec 05 12:14:27 crc kubenswrapper[4763]: I1205 12:14:27.735804 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gkn4\" (UniqueName: \"kubernetes.io/projected/8a719aa5-6044-4da3-89dc-6d25ee382b3e-kube-api-access-5gkn4\") pod \"redhat-marketplace-6dl6d\" (UID: \"8a719aa5-6044-4da3-89dc-6d25ee382b3e\") " pod="openshift-marketplace/redhat-marketplace-6dl6d" Dec 05 12:14:27 crc kubenswrapper[4763]: I1205 12:14:27.735832 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a719aa5-6044-4da3-89dc-6d25ee382b3e-catalog-content\") pod \"redhat-marketplace-6dl6d\" (UID: \"8a719aa5-6044-4da3-89dc-6d25ee382b3e\") " pod="openshift-marketplace/redhat-marketplace-6dl6d" Dec 05 12:14:27 crc kubenswrapper[4763]: I1205 12:14:27.837785 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gkn4\" (UniqueName: \"kubernetes.io/projected/8a719aa5-6044-4da3-89dc-6d25ee382b3e-kube-api-access-5gkn4\") pod \"redhat-marketplace-6dl6d\" (UID: \"8a719aa5-6044-4da3-89dc-6d25ee382b3e\") " pod="openshift-marketplace/redhat-marketplace-6dl6d" Dec 05 12:14:27 crc kubenswrapper[4763]: I1205 12:14:27.837843 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a719aa5-6044-4da3-89dc-6d25ee382b3e-catalog-content\") pod \"redhat-marketplace-6dl6d\" (UID: \"8a719aa5-6044-4da3-89dc-6d25ee382b3e\") " pod="openshift-marketplace/redhat-marketplace-6dl6d" Dec 05 12:14:27 crc kubenswrapper[4763]: I1205 12:14:27.837940 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a719aa5-6044-4da3-89dc-6d25ee382b3e-utilities\") pod \"redhat-marketplace-6dl6d\" (UID: \"8a719aa5-6044-4da3-89dc-6d25ee382b3e\") " pod="openshift-marketplace/redhat-marketplace-6dl6d" Dec 05 12:14:27 crc kubenswrapper[4763]: I1205 12:14:27.838565 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a719aa5-6044-4da3-89dc-6d25ee382b3e-utilities\") pod \"redhat-marketplace-6dl6d\" (UID: \"8a719aa5-6044-4da3-89dc-6d25ee382b3e\") " pod="openshift-marketplace/redhat-marketplace-6dl6d" Dec 05 12:14:27 crc kubenswrapper[4763]: I1205 12:14:27.838593 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a719aa5-6044-4da3-89dc-6d25ee382b3e-catalog-content\") pod \"redhat-marketplace-6dl6d\" (UID: \"8a719aa5-6044-4da3-89dc-6d25ee382b3e\") " pod="openshift-marketplace/redhat-marketplace-6dl6d" Dec 05 12:14:27 crc kubenswrapper[4763]: I1205 12:14:27.863491 4763 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-5gkn4\" (UniqueName: \"kubernetes.io/projected/8a719aa5-6044-4da3-89dc-6d25ee382b3e-kube-api-access-5gkn4\") pod \"redhat-marketplace-6dl6d\" (UID: \"8a719aa5-6044-4da3-89dc-6d25ee382b3e\") " pod="openshift-marketplace/redhat-marketplace-6dl6d" Dec 05 12:14:27 crc kubenswrapper[4763]: I1205 12:14:27.967907 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6dl6d" Dec 05 12:14:28 crc kubenswrapper[4763]: I1205 12:14:28.457714 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6dl6d"] Dec 05 12:14:28 crc kubenswrapper[4763]: I1205 12:14:28.726148 4763 generic.go:334] "Generic (PLEG): container finished" podID="8a719aa5-6044-4da3-89dc-6d25ee382b3e" containerID="4a3b36e4505f6aac9b48286ce70e735cdf155cedd22b2e849f6ce5086666478c" exitCode=0 Dec 05 12:14:28 crc kubenswrapper[4763]: I1205 12:14:28.726212 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dl6d" event={"ID":"8a719aa5-6044-4da3-89dc-6d25ee382b3e","Type":"ContainerDied","Data":"4a3b36e4505f6aac9b48286ce70e735cdf155cedd22b2e849f6ce5086666478c"} Dec 05 12:14:28 crc kubenswrapper[4763]: I1205 12:14:28.726526 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dl6d" event={"ID":"8a719aa5-6044-4da3-89dc-6d25ee382b3e","Type":"ContainerStarted","Data":"6b4c19310ff0ef6cebc1ed9868ee101af25573d98b7e57711b94b40f76129d5c"} Dec 05 12:14:29 crc kubenswrapper[4763]: I1205 12:14:29.736619 4763 generic.go:334] "Generic (PLEG): container finished" podID="8a719aa5-6044-4da3-89dc-6d25ee382b3e" containerID="0c366ed63572d1a2d7b23c7db754034f400cfe0af0800a8e7bc39f18c9167dbc" exitCode=0 Dec 05 12:14:29 crc kubenswrapper[4763]: I1205 12:14:29.736735 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dl6d" event={"ID":"8a719aa5-6044-4da3-89dc-6d25ee382b3e","Type":"ContainerDied","Data":"0c366ed63572d1a2d7b23c7db754034f400cfe0af0800a8e7bc39f18c9167dbc"} Dec 05 12:14:30 crc kubenswrapper[4763]: I1205 12:14:30.750398 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dl6d" event={"ID":"8a719aa5-6044-4da3-89dc-6d25ee382b3e","Type":"ContainerStarted","Data":"b24313d6053798a1df938b75d3f2b48dd1fbef4b97f55d200d1f882780470285"} Dec 05 12:14:30 crc kubenswrapper[4763]: I1205 12:14:30.778305 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6dl6d" podStartSLOduration=2.328368023 podStartE2EDuration="3.778282569s" podCreationTimestamp="2025-12-05 12:14:27 +0000 UTC" firstStartedPulling="2025-12-05 12:14:28.727947276 +0000 UTC m=+1553.220661999" lastFinishedPulling="2025-12-05 12:14:30.177861822 +0000 UTC m=+1554.670576545" observedRunningTime="2025-12-05 12:14:30.768147544 +0000 UTC m=+1555.260862267" watchObservedRunningTime="2025-12-05 12:14:30.778282569 +0000 UTC m=+1555.270997282" Dec 05 12:14:33 crc kubenswrapper[4763]: I1205 12:14:33.781297 4763 generic.go:334] "Generic (PLEG): container finished" podID="c2041e23-d29c-4a1a-9787-aa0e19c9f764" containerID="b38f6af3c86ef860d12e1fc0f5ac44d693cd73614f4c6c926ccd03a799d9b90f" exitCode=0 Dec 05 12:14:33 crc kubenswrapper[4763]: I1205 12:14:33.781350 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"c2041e23-d29c-4a1a-9787-aa0e19c9f764","Type":"ContainerDied","Data":"b38f6af3c86ef860d12e1fc0f5ac44d693cd73614f4c6c926ccd03a799d9b90f"} Dec 05 12:14:33 crc kubenswrapper[4763]: I1205 12:14:33.784443 4763 generic.go:334] "Generic (PLEG): container finished" podID="0c59087a-448f-41c2-a85b-6ccd0ddbecc1" containerID="0b7a3c867b54e7a96fdf3df72dc3b131df4361b5dac542c569a1b2ef6c7a5be4" exitCode=0 Dec 05 12:14:33 crc kubenswrapper[4763]: I1205 12:14:33.796705 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0c59087a-448f-41c2-a85b-6ccd0ddbecc1","Type":"ContainerDied","Data":"0b7a3c867b54e7a96fdf3df72dc3b131df4361b5dac542c569a1b2ef6c7a5be4"} Dec 05 12:14:34 crc kubenswrapper[4763]: I1205 12:14:34.795071 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0c59087a-448f-41c2-a85b-6ccd0ddbecc1","Type":"ContainerStarted","Data":"c3ba2df23eca084f18b8e22812acede6ba43681d77a17e05aac63aef0674923f"} Dec 05 12:14:34 crc kubenswrapper[4763]: I1205 12:14:34.795683 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:14:34 crc kubenswrapper[4763]: I1205 12:14:34.797681 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c2041e23-d29c-4a1a-9787-aa0e19c9f764","Type":"ContainerStarted","Data":"279673e3a1f7953fefec0507ef601f01e10ce1d9ca998cc978a5ff655558c710"} Dec 05 12:14:34 crc kubenswrapper[4763]: I1205 12:14:34.798005 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 05 12:14:34 crc kubenswrapper[4763]: I1205 12:14:34.837950 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.837928077 podStartE2EDuration="36.837928077s" podCreationTimestamp="2025-12-05 12:13:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:14:34.824971191 +0000 UTC m=+1559.317685924" watchObservedRunningTime="2025-12-05 12:14:34.837928077 +0000 UTC m=+1559.330642810" Dec 05 12:14:34 crc kubenswrapper[4763]: I1205 12:14:34.856343 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.856322848 podStartE2EDuration="37.856322848s" podCreationTimestamp="2025-12-05 12:13:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:14:34.850244593 +0000 UTC m=+1559.342959326" watchObservedRunningTime="2025-12-05 12:14:34.856322848 +0000 UTC m=+1559.349037571" Dec 05 12:14:37 crc kubenswrapper[4763]: I1205 12:14:37.544115 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 12:14:37 crc kubenswrapper[4763]: I1205 12:14:37.544434 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 12:14:37 
crc kubenswrapper[4763]: I1205 12:14:37.976420 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6dl6d" Dec 05 12:14:37 crc kubenswrapper[4763]: I1205 12:14:37.976822 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6dl6d" Dec 05 12:14:38 crc kubenswrapper[4763]: I1205 12:14:38.025330 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6dl6d" Dec 05 12:14:38 crc kubenswrapper[4763]: I1205 12:14:38.896336 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6dl6d" Dec 05 12:14:38 crc kubenswrapper[4763]: I1205 12:14:38.944934 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6dl6d"] Dec 05 12:14:40 crc kubenswrapper[4763]: I1205 12:14:40.854448 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6dl6d" podUID="8a719aa5-6044-4da3-89dc-6d25ee382b3e" containerName="registry-server" containerID="cri-o://b24313d6053798a1df938b75d3f2b48dd1fbef4b97f55d200d1f882780470285" gracePeriod=2 Dec 05 12:14:40 crc kubenswrapper[4763]: I1205 12:14:40.954481 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4ghbz"] Dec 05 12:14:40 crc kubenswrapper[4763]: I1205 12:14:40.956052 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4ghbz" Dec 05 12:14:40 crc kubenswrapper[4763]: I1205 12:14:40.958512 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 12:14:40 crc kubenswrapper[4763]: I1205 12:14:40.959247 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 12:14:40 crc kubenswrapper[4763]: I1205 12:14:40.966419 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 12:14:40 crc kubenswrapper[4763]: I1205 12:14:40.981090 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-s7px5" Dec 05 12:14:41 crc kubenswrapper[4763]: I1205 12:14:41.003271 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd34c478-732e-49a0-ab4a-c35fdf054b3c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4ghbz\" (UID: \"fd34c478-732e-49a0-ab4a-c35fdf054b3c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4ghbz" Dec 05 12:14:41 crc kubenswrapper[4763]: I1205 12:14:41.003316 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd6wn\" (UniqueName: \"kubernetes.io/projected/fd34c478-732e-49a0-ab4a-c35fdf054b3c-kube-api-access-hd6wn\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4ghbz\" (UID: \"fd34c478-732e-49a0-ab4a-c35fdf054b3c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4ghbz" Dec 05 12:14:41 crc kubenswrapper[4763]: I1205 12:14:41.003479 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/fd34c478-732e-49a0-ab4a-c35fdf054b3c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4ghbz\" (UID: \"fd34c478-732e-49a0-ab4a-c35fdf054b3c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4ghbz" Dec 05 12:14:41 crc kubenswrapper[4763]: I1205 12:14:41.003519 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd34c478-732e-49a0-ab4a-c35fdf054b3c-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4ghbz\" (UID: \"fd34c478-732e-49a0-ab4a-c35fdf054b3c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4ghbz" Dec 05 12:14:41 crc kubenswrapper[4763]: I1205 12:14:41.003982 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4ghbz"] Dec 05 12:14:41 crc kubenswrapper[4763]: I1205 12:14:41.104831 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd34c478-732e-49a0-ab4a-c35fdf054b3c-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4ghbz\" (UID: \"fd34c478-732e-49a0-ab4a-c35fdf054b3c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4ghbz" Dec 05 12:14:41 crc kubenswrapper[4763]: I1205 12:14:41.105072 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd34c478-732e-49a0-ab4a-c35fdf054b3c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4ghbz\" (UID: \"fd34c478-732e-49a0-ab4a-c35fdf054b3c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4ghbz" Dec 05 12:14:41 crc kubenswrapper[4763]: I1205 12:14:41.105154 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd6wn\" (UniqueName: \"kubernetes.io/projected/fd34c478-732e-49a0-ab4a-c35fdf054b3c-kube-api-access-hd6wn\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4ghbz\" (UID: \"fd34c478-732e-49a0-ab4a-c35fdf054b3c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4ghbz" Dec 05 12:14:41 crc kubenswrapper[4763]: I1205 12:14:41.105317 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd34c478-732e-49a0-ab4a-c35fdf054b3c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4ghbz\" (UID: \"fd34c478-732e-49a0-ab4a-c35fdf054b3c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4ghbz" Dec 05 12:14:41 crc kubenswrapper[4763]: I1205 12:14:41.126130 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd34c478-732e-49a0-ab4a-c35fdf054b3c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4ghbz\" (UID: \"fd34c478-732e-49a0-ab4a-c35fdf054b3c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4ghbz" Dec 05 12:14:41 crc kubenswrapper[4763]: I1205 12:14:41.143658 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd34c478-732e-49a0-ab4a-c35fdf054b3c-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4ghbz\" (UID: \"fd34c478-732e-49a0-ab4a-c35fdf054b3c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4ghbz" Dec 05 12:14:41 crc kubenswrapper[4763]: I1205 
12:14:41.152338 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd34c478-732e-49a0-ab4a-c35fdf054b3c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4ghbz\" (UID: \"fd34c478-732e-49a0-ab4a-c35fdf054b3c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4ghbz" Dec 05 12:14:41 crc kubenswrapper[4763]: I1205 12:14:41.157478 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd6wn\" (UniqueName: \"kubernetes.io/projected/fd34c478-732e-49a0-ab4a-c35fdf054b3c-kube-api-access-hd6wn\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4ghbz\" (UID: \"fd34c478-732e-49a0-ab4a-c35fdf054b3c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4ghbz" Dec 05 12:14:41 crc kubenswrapper[4763]: I1205 12:14:41.404398 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4ghbz" Dec 05 12:14:41 crc kubenswrapper[4763]: I1205 12:14:41.575648 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6dl6d" Dec 05 12:14:41 crc kubenswrapper[4763]: I1205 12:14:41.616610 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a719aa5-6044-4da3-89dc-6d25ee382b3e-utilities\") pod \"8a719aa5-6044-4da3-89dc-6d25ee382b3e\" (UID: \"8a719aa5-6044-4da3-89dc-6d25ee382b3e\") " Dec 05 12:14:41 crc kubenswrapper[4763]: I1205 12:14:41.616869 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gkn4\" (UniqueName: \"kubernetes.io/projected/8a719aa5-6044-4da3-89dc-6d25ee382b3e-kube-api-access-5gkn4\") pod \"8a719aa5-6044-4da3-89dc-6d25ee382b3e\" (UID: \"8a719aa5-6044-4da3-89dc-6d25ee382b3e\") " Dec 05 12:14:41 crc kubenswrapper[4763]: I1205 12:14:41.617004 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a719aa5-6044-4da3-89dc-6d25ee382b3e-catalog-content\") pod \"8a719aa5-6044-4da3-89dc-6d25ee382b3e\" (UID: \"8a719aa5-6044-4da3-89dc-6d25ee382b3e\") " Dec 05 12:14:41 crc kubenswrapper[4763]: I1205 12:14:41.620837 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a719aa5-6044-4da3-89dc-6d25ee382b3e-utilities" (OuterVolumeSpecName: "utilities") pod "8a719aa5-6044-4da3-89dc-6d25ee382b3e" (UID: "8a719aa5-6044-4da3-89dc-6d25ee382b3e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:14:41 crc kubenswrapper[4763]: I1205 12:14:41.631855 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a719aa5-6044-4da3-89dc-6d25ee382b3e-kube-api-access-5gkn4" (OuterVolumeSpecName: "kube-api-access-5gkn4") pod "8a719aa5-6044-4da3-89dc-6d25ee382b3e" (UID: "8a719aa5-6044-4da3-89dc-6d25ee382b3e"). InnerVolumeSpecName "kube-api-access-5gkn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:14:41 crc kubenswrapper[4763]: I1205 12:14:41.656423 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a719aa5-6044-4da3-89dc-6d25ee382b3e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8a719aa5-6044-4da3-89dc-6d25ee382b3e" (UID: "8a719aa5-6044-4da3-89dc-6d25ee382b3e"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:14:41 crc kubenswrapper[4763]: I1205 12:14:41.719824 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gkn4\" (UniqueName: \"kubernetes.io/projected/8a719aa5-6044-4da3-89dc-6d25ee382b3e-kube-api-access-5gkn4\") on node \"crc\" DevicePath \"\"" Dec 05 12:14:41 crc kubenswrapper[4763]: I1205 12:14:41.719871 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a719aa5-6044-4da3-89dc-6d25ee382b3e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 12:14:41 crc kubenswrapper[4763]: I1205 12:14:41.719884 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a719aa5-6044-4da3-89dc-6d25ee382b3e-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 12:14:41 crc kubenswrapper[4763]: I1205 12:14:41.871581 4763 generic.go:334] "Generic (PLEG): container finished" podID="8a719aa5-6044-4da3-89dc-6d25ee382b3e" containerID="b24313d6053798a1df938b75d3f2b48dd1fbef4b97f55d200d1f882780470285" exitCode=0 Dec 05 12:14:41 crc kubenswrapper[4763]: I1205 12:14:41.871632 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dl6d" event={"ID":"8a719aa5-6044-4da3-89dc-6d25ee382b3e","Type":"ContainerDied","Data":"b24313d6053798a1df938b75d3f2b48dd1fbef4b97f55d200d1f882780470285"} Dec 05 12:14:41 crc kubenswrapper[4763]: I1205 12:14:41.871652 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6dl6d" Dec 05 12:14:41 crc kubenswrapper[4763]: I1205 12:14:41.871670 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dl6d" event={"ID":"8a719aa5-6044-4da3-89dc-6d25ee382b3e","Type":"ContainerDied","Data":"6b4c19310ff0ef6cebc1ed9868ee101af25573d98b7e57711b94b40f76129d5c"} Dec 05 12:14:41 crc kubenswrapper[4763]: I1205 12:14:41.871713 4763 scope.go:117] "RemoveContainer" containerID="b24313d6053798a1df938b75d3f2b48dd1fbef4b97f55d200d1f882780470285" Dec 05 12:14:41 crc kubenswrapper[4763]: I1205 12:14:41.900823 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6dl6d"] Dec 05 12:14:41 crc kubenswrapper[4763]: I1205 12:14:41.903446 4763 scope.go:117] "RemoveContainer" containerID="0c366ed63572d1a2d7b23c7db754034f400cfe0af0800a8e7bc39f18c9167dbc" Dec 05 12:14:41 crc kubenswrapper[4763]: I1205 12:14:41.916054 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6dl6d"] Dec 05 12:14:41 crc kubenswrapper[4763]: I1205 12:14:41.934838 4763 scope.go:117] "RemoveContainer" containerID="4a3b36e4505f6aac9b48286ce70e735cdf155cedd22b2e849f6ce5086666478c" Dec 05 12:14:41 crc kubenswrapper[4763]: I1205 12:14:41.963203 4763 scope.go:117] "RemoveContainer" containerID="b24313d6053798a1df938b75d3f2b48dd1fbef4b97f55d200d1f882780470285" Dec 05 12:14:41 crc kubenswrapper[4763]: E1205 12:14:41.968428 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b24313d6053798a1df938b75d3f2b48dd1fbef4b97f55d200d1f882780470285\": container with ID starting with b24313d6053798a1df938b75d3f2b48dd1fbef4b97f55d200d1f882780470285 not found: ID does not exist" containerID="b24313d6053798a1df938b75d3f2b48dd1fbef4b97f55d200d1f882780470285" Dec 05 12:14:41 crc 
Dec 05 12:14:41 crc kubenswrapper[4763]: I1205 12:14:41.968523 4763 scope.go:117] "RemoveContainer" containerID="0c366ed63572d1a2d7b23c7db754034f400cfe0af0800a8e7bc39f18c9167dbc"
Dec 05 12:14:41 crc kubenswrapper[4763]: E1205 12:14:41.969025 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c366ed63572d1a2d7b23c7db754034f400cfe0af0800a8e7bc39f18c9167dbc\": container with ID starting with 0c366ed63572d1a2d7b23c7db754034f400cfe0af0800a8e7bc39f18c9167dbc not found: ID does not exist" containerID="0c366ed63572d1a2d7b23c7db754034f400cfe0af0800a8e7bc39f18c9167dbc"
Dec 05 12:14:41 crc kubenswrapper[4763]: I1205 12:14:41.969079 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c366ed63572d1a2d7b23c7db754034f400cfe0af0800a8e7bc39f18c9167dbc"} err="failed to get container status \"0c366ed63572d1a2d7b23c7db754034f400cfe0af0800a8e7bc39f18c9167dbc\": rpc error: code = NotFound desc = could not find container \"0c366ed63572d1a2d7b23c7db754034f400cfe0af0800a8e7bc39f18c9167dbc\": container with ID starting with 0c366ed63572d1a2d7b23c7db754034f400cfe0af0800a8e7bc39f18c9167dbc not found: ID does not exist"
Dec 05 12:14:41 crc kubenswrapper[4763]: I1205 12:14:41.969108 4763 scope.go:117] "RemoveContainer" containerID="4a3b36e4505f6aac9b48286ce70e735cdf155cedd22b2e849f6ce5086666478c"
Dec 05 12:14:41 crc kubenswrapper[4763]: E1205 12:14:41.975262 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a3b36e4505f6aac9b48286ce70e735cdf155cedd22b2e849f6ce5086666478c\": container with ID starting with 4a3b36e4505f6aac9b48286ce70e735cdf155cedd22b2e849f6ce5086666478c not found: ID does not exist" containerID="4a3b36e4505f6aac9b48286ce70e735cdf155cedd22b2e849f6ce5086666478c"
Dec 05 12:14:41 crc kubenswrapper[4763]: I1205 12:14:41.975317 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a3b36e4505f6aac9b48286ce70e735cdf155cedd22b2e849f6ce5086666478c"} err="failed to get container status \"4a3b36e4505f6aac9b48286ce70e735cdf155cedd22b2e849f6ce5086666478c\": rpc error: code = NotFound desc = could not find container \"4a3b36e4505f6aac9b48286ce70e735cdf155cedd22b2e849f6ce5086666478c\": container with ID starting with 4a3b36e4505f6aac9b48286ce70e735cdf155cedd22b2e849f6ce5086666478c not found: ID does not exist"
Dec 05 12:14:42 crc kubenswrapper[4763]: I1205 12:14:42.073703 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4ghbz"]
Dec 05 12:14:42 crc kubenswrapper[4763]: I1205 12:14:42.888331 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4ghbz"
event={"ID":"fd34c478-732e-49a0-ab4a-c35fdf054b3c","Type":"ContainerStarted","Data":"ca94f52673596f53a2cfaa6871dc7679dbe96b8dea4007be78933f74431bd7fb"} Dec 05 12:14:43 crc kubenswrapper[4763]: I1205 12:14:43.801448 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a719aa5-6044-4da3-89dc-6d25ee382b3e" path="/var/lib/kubelet/pods/8a719aa5-6044-4da3-89dc-6d25ee382b3e/volumes" Dec 05 12:14:48 crc kubenswrapper[4763]: I1205 12:14:48.166470 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 05 12:14:48 crc kubenswrapper[4763]: I1205 12:14:48.840980 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 05 12:14:54 crc kubenswrapper[4763]: I1205 12:14:54.014537 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4ghbz" event={"ID":"fd34c478-732e-49a0-ab4a-c35fdf054b3c","Type":"ContainerStarted","Data":"83d6eeb8fb0c08f5dd129b0f3b2020e06341d3135b38307a131b9fbf2d628289"} Dec 05 12:14:54 crc kubenswrapper[4763]: I1205 12:14:54.042803 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4ghbz" podStartSLOduration=2.524713719 podStartE2EDuration="14.042779625s" podCreationTimestamp="2025-12-05 12:14:40 +0000 UTC" firstStartedPulling="2025-12-05 12:14:42.080470869 +0000 UTC m=+1566.573185592" lastFinishedPulling="2025-12-05 12:14:53.598536775 +0000 UTC m=+1578.091251498" observedRunningTime="2025-12-05 12:14:54.032104772 +0000 UTC m=+1578.524819505" watchObservedRunningTime="2025-12-05 12:14:54.042779625 +0000 UTC m=+1578.535494348" Dec 05 12:15:00 crc kubenswrapper[4763]: I1205 12:15:00.154271 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415615-rh6tb"] Dec 05 12:15:00 crc kubenswrapper[4763]: E1205 12:15:00.155392 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a719aa5-6044-4da3-89dc-6d25ee382b3e" containerName="extract-content" Dec 05 12:15:00 crc kubenswrapper[4763]: I1205 12:15:00.155409 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a719aa5-6044-4da3-89dc-6d25ee382b3e" containerName="extract-content" Dec 05 12:15:00 crc kubenswrapper[4763]: E1205 12:15:00.155446 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a719aa5-6044-4da3-89dc-6d25ee382b3e" containerName="extract-utilities" Dec 05 12:15:00 crc kubenswrapper[4763]: I1205 12:15:00.155464 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a719aa5-6044-4da3-89dc-6d25ee382b3e" containerName="extract-utilities" Dec 05 12:15:00 crc kubenswrapper[4763]: E1205 12:15:00.155486 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a719aa5-6044-4da3-89dc-6d25ee382b3e" containerName="registry-server" Dec 05 12:15:00 crc kubenswrapper[4763]: I1205 12:15:00.155494 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a719aa5-6044-4da3-89dc-6d25ee382b3e" containerName="registry-server" Dec 05 12:15:00 crc kubenswrapper[4763]: I1205 12:15:00.155781 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a719aa5-6044-4da3-89dc-6d25ee382b3e" containerName="registry-server" Dec 05 12:15:00 crc kubenswrapper[4763]: I1205 12:15:00.156702 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415615-rh6tb" Dec 05 12:15:00 crc kubenswrapper[4763]: I1205 12:15:00.163045 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 12:15:00 crc kubenswrapper[4763]: I1205 12:15:00.164245 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 12:15:00 crc kubenswrapper[4763]: I1205 12:15:00.165188 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415615-rh6tb"] Dec 05 12:15:00 crc kubenswrapper[4763]: I1205 12:15:00.329742 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b1906b4-d122-45a8-9527-42266fa59f7c-secret-volume\") pod \"collect-profiles-29415615-rh6tb\" (UID: \"9b1906b4-d122-45a8-9527-42266fa59f7c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415615-rh6tb" Dec 05 12:15:00 crc kubenswrapper[4763]: I1205 12:15:00.329861 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b1906b4-d122-45a8-9527-42266fa59f7c-config-volume\") pod \"collect-profiles-29415615-rh6tb\" (UID: \"9b1906b4-d122-45a8-9527-42266fa59f7c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415615-rh6tb" Dec 05 12:15:00 crc kubenswrapper[4763]: I1205 12:15:00.330055 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pv6z\" (UniqueName: \"kubernetes.io/projected/9b1906b4-d122-45a8-9527-42266fa59f7c-kube-api-access-8pv6z\") pod \"collect-profiles-29415615-rh6tb\" (UID: \"9b1906b4-d122-45a8-9527-42266fa59f7c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415615-rh6tb" Dec 05 12:15:00 crc kubenswrapper[4763]: I1205 12:15:00.432781 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b1906b4-d122-45a8-9527-42266fa59f7c-secret-volume\") pod \"collect-profiles-29415615-rh6tb\" (UID: \"9b1906b4-d122-45a8-9527-42266fa59f7c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415615-rh6tb" Dec 05 12:15:00 crc kubenswrapper[4763]: I1205 12:15:00.432897 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b1906b4-d122-45a8-9527-42266fa59f7c-config-volume\") pod \"collect-profiles-29415615-rh6tb\" (UID: \"9b1906b4-d122-45a8-9527-42266fa59f7c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415615-rh6tb" Dec 05 12:15:00 crc kubenswrapper[4763]: I1205 12:15:00.432966 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pv6z\" (UniqueName: \"kubernetes.io/projected/9b1906b4-d122-45a8-9527-42266fa59f7c-kube-api-access-8pv6z\") pod \"collect-profiles-29415615-rh6tb\" (UID: \"9b1906b4-d122-45a8-9527-42266fa59f7c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415615-rh6tb" Dec 05 12:15:00 crc kubenswrapper[4763]: I1205 12:15:00.434520 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b1906b4-d122-45a8-9527-42266fa59f7c-config-volume\") pod 
\"collect-profiles-29415615-rh6tb\" (UID: \"9b1906b4-d122-45a8-9527-42266fa59f7c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415615-rh6tb" Dec 05 12:15:00 crc kubenswrapper[4763]: I1205 12:15:00.439465 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b1906b4-d122-45a8-9527-42266fa59f7c-secret-volume\") pod \"collect-profiles-29415615-rh6tb\" (UID: \"9b1906b4-d122-45a8-9527-42266fa59f7c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415615-rh6tb" Dec 05 12:15:00 crc kubenswrapper[4763]: I1205 12:15:00.451502 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pv6z\" (UniqueName: \"kubernetes.io/projected/9b1906b4-d122-45a8-9527-42266fa59f7c-kube-api-access-8pv6z\") pod \"collect-profiles-29415615-rh6tb\" (UID: \"9b1906b4-d122-45a8-9527-42266fa59f7c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415615-rh6tb" Dec 05 12:15:00 crc kubenswrapper[4763]: I1205 12:15:00.493807 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415615-rh6tb" Dec 05 12:15:00 crc kubenswrapper[4763]: I1205 12:15:00.976623 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415615-rh6tb"] Dec 05 12:15:01 crc kubenswrapper[4763]: I1205 12:15:01.082978 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415615-rh6tb" event={"ID":"9b1906b4-d122-45a8-9527-42266fa59f7c","Type":"ContainerStarted","Data":"78fde81df52ccfddbef9fe77b635c842f01acd0a6b5d3b17590d8987c57faf81"} Dec 05 12:15:02 crc kubenswrapper[4763]: I1205 12:15:02.108304 4763 generic.go:334] "Generic (PLEG): container finished" podID="9b1906b4-d122-45a8-9527-42266fa59f7c" containerID="d11b14a9e0f61e87c20f232dc92de7635106a4b26d9029763740a89ddaa98086" exitCode=0 Dec 05 12:15:02 crc kubenswrapper[4763]: I1205 12:15:02.108419 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415615-rh6tb" event={"ID":"9b1906b4-d122-45a8-9527-42266fa59f7c","Type":"ContainerDied","Data":"d11b14a9e0f61e87c20f232dc92de7635106a4b26d9029763740a89ddaa98086"} Dec 05 12:15:03 crc kubenswrapper[4763]: I1205 12:15:03.666307 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415615-rh6tb" Dec 05 12:15:03 crc kubenswrapper[4763]: I1205 12:15:03.803589 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b1906b4-d122-45a8-9527-42266fa59f7c-secret-volume\") pod \"9b1906b4-d122-45a8-9527-42266fa59f7c\" (UID: \"9b1906b4-d122-45a8-9527-42266fa59f7c\") " Dec 05 12:15:03 crc kubenswrapper[4763]: I1205 12:15:03.803719 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pv6z\" (UniqueName: \"kubernetes.io/projected/9b1906b4-d122-45a8-9527-42266fa59f7c-kube-api-access-8pv6z\") pod \"9b1906b4-d122-45a8-9527-42266fa59f7c\" (UID: \"9b1906b4-d122-45a8-9527-42266fa59f7c\") " Dec 05 12:15:03 crc kubenswrapper[4763]: I1205 12:15:03.803939 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b1906b4-d122-45a8-9527-42266fa59f7c-config-volume\") pod \"9b1906b4-d122-45a8-9527-42266fa59f7c\" (UID: \"9b1906b4-d122-45a8-9527-42266fa59f7c\") " Dec 05 12:15:03 crc kubenswrapper[4763]: I1205 12:15:03.804964 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b1906b4-d122-45a8-9527-42266fa59f7c-config-volume" (OuterVolumeSpecName: "config-volume") pod "9b1906b4-d122-45a8-9527-42266fa59f7c" (UID: "9b1906b4-d122-45a8-9527-42266fa59f7c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:15:03 crc kubenswrapper[4763]: I1205 12:15:03.805588 4763 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b1906b4-d122-45a8-9527-42266fa59f7c-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 12:15:03 crc kubenswrapper[4763]: I1205 12:15:03.812026 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b1906b4-d122-45a8-9527-42266fa59f7c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9b1906b4-d122-45a8-9527-42266fa59f7c" (UID: "9b1906b4-d122-45a8-9527-42266fa59f7c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:15:03 crc kubenswrapper[4763]: I1205 12:15:03.813409 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b1906b4-d122-45a8-9527-42266fa59f7c-kube-api-access-8pv6z" (OuterVolumeSpecName: "kube-api-access-8pv6z") pod "9b1906b4-d122-45a8-9527-42266fa59f7c" (UID: "9b1906b4-d122-45a8-9527-42266fa59f7c"). InnerVolumeSpecName "kube-api-access-8pv6z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:15:03 crc kubenswrapper[4763]: I1205 12:15:03.924989 4763 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b1906b4-d122-45a8-9527-42266fa59f7c-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 12:15:03 crc kubenswrapper[4763]: I1205 12:15:03.925052 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pv6z\" (UniqueName: \"kubernetes.io/projected/9b1906b4-d122-45a8-9527-42266fa59f7c-kube-api-access-8pv6z\") on node \"crc\" DevicePath \"\"" Dec 05 12:15:04 crc kubenswrapper[4763]: I1205 12:15:04.130773 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415615-rh6tb" event={"ID":"9b1906b4-d122-45a8-9527-42266fa59f7c","Type":"ContainerDied","Data":"78fde81df52ccfddbef9fe77b635c842f01acd0a6b5d3b17590d8987c57faf81"} Dec 05 12:15:04 crc kubenswrapper[4763]: I1205 12:15:04.130820 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78fde81df52ccfddbef9fe77b635c842f01acd0a6b5d3b17590d8987c57faf81" Dec 05 12:15:04 crc kubenswrapper[4763]: I1205 12:15:04.130890 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415615-rh6tb" Dec 05 12:15:06 crc kubenswrapper[4763]: I1205 12:15:06.150596 4763 generic.go:334] "Generic (PLEG): container finished" podID="fd34c478-732e-49a0-ab4a-c35fdf054b3c" containerID="83d6eeb8fb0c08f5dd129b0f3b2020e06341d3135b38307a131b9fbf2d628289" exitCode=0 Dec 05 12:15:06 crc kubenswrapper[4763]: I1205 12:15:06.150794 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4ghbz" event={"ID":"fd34c478-732e-49a0-ab4a-c35fdf054b3c","Type":"ContainerDied","Data":"83d6eeb8fb0c08f5dd129b0f3b2020e06341d3135b38307a131b9fbf2d628289"} Dec 05 12:15:07 crc kubenswrapper[4763]: I1205 12:15:07.544157 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 12:15:07 crc kubenswrapper[4763]: I1205 12:15:07.544750 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 12:15:07 crc kubenswrapper[4763]: I1205 12:15:07.606543 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4ghbz" Dec 05 12:15:07 crc kubenswrapper[4763]: I1205 12:15:07.700530 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd34c478-732e-49a0-ab4a-c35fdf054b3c-repo-setup-combined-ca-bundle\") pod \"fd34c478-732e-49a0-ab4a-c35fdf054b3c\" (UID: \"fd34c478-732e-49a0-ab4a-c35fdf054b3c\") " Dec 05 12:15:07 crc kubenswrapper[4763]: I1205 12:15:07.700590 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hd6wn\" (UniqueName: \"kubernetes.io/projected/fd34c478-732e-49a0-ab4a-c35fdf054b3c-kube-api-access-hd6wn\") pod \"fd34c478-732e-49a0-ab4a-c35fdf054b3c\" (UID: \"fd34c478-732e-49a0-ab4a-c35fdf054b3c\") " Dec 05 12:15:07 crc kubenswrapper[4763]: I1205 12:15:07.700695 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd34c478-732e-49a0-ab4a-c35fdf054b3c-inventory\") pod \"fd34c478-732e-49a0-ab4a-c35fdf054b3c\" (UID: \"fd34c478-732e-49a0-ab4a-c35fdf054b3c\") " Dec 05 12:15:07 crc kubenswrapper[4763]: I1205 12:15:07.702995 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd34c478-732e-49a0-ab4a-c35fdf054b3c-ssh-key\") pod \"fd34c478-732e-49a0-ab4a-c35fdf054b3c\" (UID: \"fd34c478-732e-49a0-ab4a-c35fdf054b3c\") " Dec 05 12:15:07 crc kubenswrapper[4763]: I1205 12:15:07.706415 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd34c478-732e-49a0-ab4a-c35fdf054b3c-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "fd34c478-732e-49a0-ab4a-c35fdf054b3c" (UID: "fd34c478-732e-49a0-ab4a-c35fdf054b3c"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:15:07 crc kubenswrapper[4763]: I1205 12:15:07.720466 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd34c478-732e-49a0-ab4a-c35fdf054b3c-kube-api-access-hd6wn" (OuterVolumeSpecName: "kube-api-access-hd6wn") pod "fd34c478-732e-49a0-ab4a-c35fdf054b3c" (UID: "fd34c478-732e-49a0-ab4a-c35fdf054b3c"). InnerVolumeSpecName "kube-api-access-hd6wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:15:07 crc kubenswrapper[4763]: I1205 12:15:07.734677 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd34c478-732e-49a0-ab4a-c35fdf054b3c-inventory" (OuterVolumeSpecName: "inventory") pod "fd34c478-732e-49a0-ab4a-c35fdf054b3c" (UID: "fd34c478-732e-49a0-ab4a-c35fdf054b3c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:15:07 crc kubenswrapper[4763]: I1205 12:15:07.736527 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd34c478-732e-49a0-ab4a-c35fdf054b3c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fd34c478-732e-49a0-ab4a-c35fdf054b3c" (UID: "fd34c478-732e-49a0-ab4a-c35fdf054b3c"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:15:07 crc kubenswrapper[4763]: I1205 12:15:07.806043 4763 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd34c478-732e-49a0-ab4a-c35fdf054b3c-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:15:07 crc kubenswrapper[4763]: I1205 12:15:07.806096 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hd6wn\" (UniqueName: \"kubernetes.io/projected/fd34c478-732e-49a0-ab4a-c35fdf054b3c-kube-api-access-hd6wn\") on node \"crc\" DevicePath \"\"" Dec 05 12:15:07 crc kubenswrapper[4763]: I1205 12:15:07.806116 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd34c478-732e-49a0-ab4a-c35fdf054b3c-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 12:15:07 crc kubenswrapper[4763]: I1205 12:15:07.806135 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd34c478-732e-49a0-ab4a-c35fdf054b3c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 12:15:08 crc kubenswrapper[4763]: I1205 12:15:08.172606 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4ghbz" event={"ID":"fd34c478-732e-49a0-ab4a-c35fdf054b3c","Type":"ContainerDied","Data":"ca94f52673596f53a2cfaa6871dc7679dbe96b8dea4007be78933f74431bd7fb"} Dec 05 12:15:08 crc kubenswrapper[4763]: I1205 12:15:08.172894 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca94f52673596f53a2cfaa6871dc7679dbe96b8dea4007be78933f74431bd7fb" Dec 05 12:15:08 crc kubenswrapper[4763]: I1205 12:15:08.172704 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4ghbz" Dec 05 12:15:08 crc kubenswrapper[4763]: I1205 12:15:08.334739 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-56jk4"] Dec 05 12:15:08 crc kubenswrapper[4763]: E1205 12:15:08.335487 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b1906b4-d122-45a8-9527-42266fa59f7c" containerName="collect-profiles" Dec 05 12:15:08 crc kubenswrapper[4763]: I1205 12:15:08.335511 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b1906b4-d122-45a8-9527-42266fa59f7c" containerName="collect-profiles" Dec 05 12:15:08 crc kubenswrapper[4763]: E1205 12:15:08.335535 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd34c478-732e-49a0-ab4a-c35fdf054b3c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 05 12:15:08 crc kubenswrapper[4763]: I1205 12:15:08.335544 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd34c478-732e-49a0-ab4a-c35fdf054b3c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 05 12:15:08 crc kubenswrapper[4763]: I1205 12:15:08.336201 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b1906b4-d122-45a8-9527-42266fa59f7c" containerName="collect-profiles" Dec 05 12:15:08 crc kubenswrapper[4763]: I1205 12:15:08.336281 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd34c478-732e-49a0-ab4a-c35fdf054b3c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 05 12:15:08 crc kubenswrapper[4763]: I1205 12:15:08.337330 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-56jk4" Dec 05 12:15:08 crc kubenswrapper[4763]: I1205 12:15:08.340722 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 12:15:08 crc kubenswrapper[4763]: I1205 12:15:08.340936 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 12:15:08 crc kubenswrapper[4763]: I1205 12:15:08.341940 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 12:15:08 crc kubenswrapper[4763]: I1205 12:15:08.342237 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-s7px5" Dec 05 12:15:08 crc kubenswrapper[4763]: I1205 12:15:08.356968 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-56jk4"] Dec 05 12:15:08 crc kubenswrapper[4763]: I1205 12:15:08.520908 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/783891d5-537e-4f2f-b3ee-326588a913f6-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-56jk4\" (UID: \"783891d5-537e-4f2f-b3ee-326588a913f6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-56jk4" Dec 05 12:15:08 crc kubenswrapper[4763]: I1205 12:15:08.521013 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hl9f\" (UniqueName: \"kubernetes.io/projected/783891d5-537e-4f2f-b3ee-326588a913f6-kube-api-access-9hl9f\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-56jk4\" (UID: \"783891d5-537e-4f2f-b3ee-326588a913f6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-56jk4" Dec 05 12:15:08 crc kubenswrapper[4763]: I1205 12:15:08.521149 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/783891d5-537e-4f2f-b3ee-326588a913f6-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-56jk4\" (UID: \"783891d5-537e-4f2f-b3ee-326588a913f6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-56jk4" Dec 05 12:15:08 crc kubenswrapper[4763]: I1205 12:15:08.623497 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/783891d5-537e-4f2f-b3ee-326588a913f6-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-56jk4\" (UID: \"783891d5-537e-4f2f-b3ee-326588a913f6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-56jk4" Dec 05 12:15:08 crc kubenswrapper[4763]: I1205 12:15:08.623676 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/783891d5-537e-4f2f-b3ee-326588a913f6-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-56jk4\" (UID: \"783891d5-537e-4f2f-b3ee-326588a913f6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-56jk4" Dec 05 12:15:08 crc kubenswrapper[4763]: I1205 12:15:08.623739 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hl9f\" (UniqueName: \"kubernetes.io/projected/783891d5-537e-4f2f-b3ee-326588a913f6-kube-api-access-9hl9f\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-56jk4\" (UID: \"783891d5-537e-4f2f-b3ee-326588a913f6\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-56jk4" Dec 05 12:15:08 crc kubenswrapper[4763]: I1205 12:15:08.630013 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/783891d5-537e-4f2f-b3ee-326588a913f6-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-56jk4\" (UID: \"783891d5-537e-4f2f-b3ee-326588a913f6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-56jk4" Dec 05 12:15:08 crc kubenswrapper[4763]: I1205 12:15:08.641074 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/783891d5-537e-4f2f-b3ee-326588a913f6-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-56jk4\" (UID: \"783891d5-537e-4f2f-b3ee-326588a913f6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-56jk4" Dec 05 12:15:08 crc kubenswrapper[4763]: I1205 12:15:08.641927 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hl9f\" (UniqueName: \"kubernetes.io/projected/783891d5-537e-4f2f-b3ee-326588a913f6-kube-api-access-9hl9f\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-56jk4\" (UID: \"783891d5-537e-4f2f-b3ee-326588a913f6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-56jk4" Dec 05 12:15:08 crc kubenswrapper[4763]: I1205 12:15:08.695093 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-56jk4" Dec 05 12:15:09 crc kubenswrapper[4763]: I1205 12:15:09.225219 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-56jk4"] Dec 05 12:15:10 crc kubenswrapper[4763]: I1205 12:15:10.191381 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-56jk4" event={"ID":"783891d5-537e-4f2f-b3ee-326588a913f6","Type":"ContainerStarted","Data":"66db8e8d2edbdd48f574fdaeee6e5c63661d12e2c0b278de5c899b53bd5b4720"} Dec 05 12:15:10 crc kubenswrapper[4763]: I1205 12:15:10.191924 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-56jk4" event={"ID":"783891d5-537e-4f2f-b3ee-326588a913f6","Type":"ContainerStarted","Data":"cd25b796f99f1a707e98f988b4948ffaae3d41e428816aa423ae0cb8e327f1ec"} Dec 05 12:15:10 crc kubenswrapper[4763]: I1205 12:15:10.209997 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-56jk4" podStartSLOduration=1.657172308 podStartE2EDuration="2.209976936s" podCreationTimestamp="2025-12-05 12:15:08 +0000 UTC" firstStartedPulling="2025-12-05 12:15:09.230870814 +0000 UTC m=+1593.723585537" lastFinishedPulling="2025-12-05 12:15:09.783675442 +0000 UTC m=+1594.276390165" observedRunningTime="2025-12-05 12:15:10.207200466 +0000 UTC m=+1594.699915189" watchObservedRunningTime="2025-12-05 12:15:10.209976936 +0000 UTC m=+1594.702691659" Dec 05 12:15:13 crc kubenswrapper[4763]: I1205 12:15:13.221222 4763 generic.go:334] "Generic (PLEG): container finished" podID="783891d5-537e-4f2f-b3ee-326588a913f6" containerID="66db8e8d2edbdd48f574fdaeee6e5c63661d12e2c0b278de5c899b53bd5b4720" exitCode=0 Dec 05 12:15:13 crc kubenswrapper[4763]: I1205 12:15:13.221303 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-56jk4" 
event={"ID":"783891d5-537e-4f2f-b3ee-326588a913f6","Type":"ContainerDied","Data":"66db8e8d2edbdd48f574fdaeee6e5c63661d12e2c0b278de5c899b53bd5b4720"} Dec 05 12:15:13 crc kubenswrapper[4763]: I1205 12:15:13.475460 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b8txf"] Dec 05 12:15:13 crc kubenswrapper[4763]: I1205 12:15:13.478023 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b8txf" Dec 05 12:15:13 crc kubenswrapper[4763]: I1205 12:15:13.486893 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b8txf"] Dec 05 12:15:13 crc kubenswrapper[4763]: I1205 12:15:13.616604 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf4b6edc-23e2-403b-afdb-65badf3ed399-utilities\") pod \"certified-operators-b8txf\" (UID: \"cf4b6edc-23e2-403b-afdb-65badf3ed399\") " pod="openshift-marketplace/certified-operators-b8txf" Dec 05 12:15:13 crc kubenswrapper[4763]: I1205 12:15:13.616748 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf4b6edc-23e2-403b-afdb-65badf3ed399-catalog-content\") pod \"certified-operators-b8txf\" (UID: \"cf4b6edc-23e2-403b-afdb-65badf3ed399\") " pod="openshift-marketplace/certified-operators-b8txf" Dec 05 12:15:13 crc kubenswrapper[4763]: I1205 12:15:13.616817 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z7n5\" (UniqueName: \"kubernetes.io/projected/cf4b6edc-23e2-403b-afdb-65badf3ed399-kube-api-access-5z7n5\") pod \"certified-operators-b8txf\" (UID: \"cf4b6edc-23e2-403b-afdb-65badf3ed399\") " pod="openshift-marketplace/certified-operators-b8txf" Dec 05 12:15:13 crc kubenswrapper[4763]: I1205 12:15:13.718478 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf4b6edc-23e2-403b-afdb-65badf3ed399-utilities\") pod \"certified-operators-b8txf\" (UID: \"cf4b6edc-23e2-403b-afdb-65badf3ed399\") " pod="openshift-marketplace/certified-operators-b8txf" Dec 05 12:15:13 crc kubenswrapper[4763]: I1205 12:15:13.718588 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf4b6edc-23e2-403b-afdb-65badf3ed399-catalog-content\") pod \"certified-operators-b8txf\" (UID: \"cf4b6edc-23e2-403b-afdb-65badf3ed399\") " pod="openshift-marketplace/certified-operators-b8txf" Dec 05 12:15:13 crc kubenswrapper[4763]: I1205 12:15:13.718615 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5z7n5\" (UniqueName: \"kubernetes.io/projected/cf4b6edc-23e2-403b-afdb-65badf3ed399-kube-api-access-5z7n5\") pod \"certified-operators-b8txf\" (UID: \"cf4b6edc-23e2-403b-afdb-65badf3ed399\") " pod="openshift-marketplace/certified-operators-b8txf" Dec 05 12:15:13 crc kubenswrapper[4763]: I1205 12:15:13.719136 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf4b6edc-23e2-403b-afdb-65badf3ed399-utilities\") pod \"certified-operators-b8txf\" (UID: \"cf4b6edc-23e2-403b-afdb-65badf3ed399\") " pod="openshift-marketplace/certified-operators-b8txf" Dec 05 12:15:13 crc kubenswrapper[4763]: I1205 
12:15:13.719204 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf4b6edc-23e2-403b-afdb-65badf3ed399-catalog-content\") pod \"certified-operators-b8txf\" (UID: \"cf4b6edc-23e2-403b-afdb-65badf3ed399\") " pod="openshift-marketplace/certified-operators-b8txf" Dec 05 12:15:13 crc kubenswrapper[4763]: I1205 12:15:13.737953 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z7n5\" (UniqueName: \"kubernetes.io/projected/cf4b6edc-23e2-403b-afdb-65badf3ed399-kube-api-access-5z7n5\") pod \"certified-operators-b8txf\" (UID: \"cf4b6edc-23e2-403b-afdb-65badf3ed399\") " pod="openshift-marketplace/certified-operators-b8txf" Dec 05 12:15:13 crc kubenswrapper[4763]: I1205 12:15:13.807933 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b8txf" Dec 05 12:15:14 crc kubenswrapper[4763]: I1205 12:15:14.316609 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b8txf"] Dec 05 12:15:14 crc kubenswrapper[4763]: W1205 12:15:14.327975 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf4b6edc_23e2_403b_afdb_65badf3ed399.slice/crio-7bd62ce18f42a2b60bee68fb18a9935272c40af5c4df4d24c0d54a66bd178a5d WatchSource:0}: Error finding container 7bd62ce18f42a2b60bee68fb18a9935272c40af5c4df4d24c0d54a66bd178a5d: Status 404 returned error can't find the container with id 7bd62ce18f42a2b60bee68fb18a9935272c40af5c4df4d24c0d54a66bd178a5d Dec 05 12:15:14 crc kubenswrapper[4763]: I1205 12:15:14.564431 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-56jk4" Dec 05 12:15:14 crc kubenswrapper[4763]: I1205 12:15:14.741480 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/783891d5-537e-4f2f-b3ee-326588a913f6-inventory\") pod \"783891d5-537e-4f2f-b3ee-326588a913f6\" (UID: \"783891d5-537e-4f2f-b3ee-326588a913f6\") " Dec 05 12:15:14 crc kubenswrapper[4763]: I1205 12:15:14.741579 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/783891d5-537e-4f2f-b3ee-326588a913f6-ssh-key\") pod \"783891d5-537e-4f2f-b3ee-326588a913f6\" (UID: \"783891d5-537e-4f2f-b3ee-326588a913f6\") " Dec 05 12:15:14 crc kubenswrapper[4763]: I1205 12:15:14.741598 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hl9f\" (UniqueName: \"kubernetes.io/projected/783891d5-537e-4f2f-b3ee-326588a913f6-kube-api-access-9hl9f\") pod \"783891d5-537e-4f2f-b3ee-326588a913f6\" (UID: \"783891d5-537e-4f2f-b3ee-326588a913f6\") " Dec 05 12:15:14 crc kubenswrapper[4763]: I1205 12:15:14.750244 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/783891d5-537e-4f2f-b3ee-326588a913f6-kube-api-access-9hl9f" (OuterVolumeSpecName: "kube-api-access-9hl9f") pod "783891d5-537e-4f2f-b3ee-326588a913f6" (UID: "783891d5-537e-4f2f-b3ee-326588a913f6"). InnerVolumeSpecName "kube-api-access-9hl9f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:15:14 crc kubenswrapper[4763]: I1205 12:15:14.775941 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/783891d5-537e-4f2f-b3ee-326588a913f6-inventory" (OuterVolumeSpecName: "inventory") pod "783891d5-537e-4f2f-b3ee-326588a913f6" (UID: "783891d5-537e-4f2f-b3ee-326588a913f6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:15:14 crc kubenswrapper[4763]: I1205 12:15:14.786016 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/783891d5-537e-4f2f-b3ee-326588a913f6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "783891d5-537e-4f2f-b3ee-326588a913f6" (UID: "783891d5-537e-4f2f-b3ee-326588a913f6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:15:14 crc kubenswrapper[4763]: I1205 12:15:14.844376 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/783891d5-537e-4f2f-b3ee-326588a913f6-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 12:15:14 crc kubenswrapper[4763]: I1205 12:15:14.844447 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hl9f\" (UniqueName: \"kubernetes.io/projected/783891d5-537e-4f2f-b3ee-326588a913f6-kube-api-access-9hl9f\") on node \"crc\" DevicePath \"\"" Dec 05 12:15:14 crc kubenswrapper[4763]: I1205 12:15:14.844462 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/783891d5-537e-4f2f-b3ee-326588a913f6-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 12:15:15 crc kubenswrapper[4763]: I1205 12:15:15.241001 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-56jk4" event={"ID":"783891d5-537e-4f2f-b3ee-326588a913f6","Type":"ContainerDied","Data":"cd25b796f99f1a707e98f988b4948ffaae3d41e428816aa423ae0cb8e327f1ec"} Dec 05 12:15:15 crc kubenswrapper[4763]: I1205 12:15:15.241033 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-56jk4" Dec 05 12:15:15 crc kubenswrapper[4763]: I1205 12:15:15.241058 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd25b796f99f1a707e98f988b4948ffaae3d41e428816aa423ae0cb8e327f1ec" Dec 05 12:15:15 crc kubenswrapper[4763]: I1205 12:15:15.244137 4763 generic.go:334] "Generic (PLEG): container finished" podID="cf4b6edc-23e2-403b-afdb-65badf3ed399" containerID="6c0a7cad637c54eef2495abbcf51610942c5f6ccd8f2f2c0f225b642640f6196" exitCode=0 Dec 05 12:15:15 crc kubenswrapper[4763]: I1205 12:15:15.244174 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b8txf" event={"ID":"cf4b6edc-23e2-403b-afdb-65badf3ed399","Type":"ContainerDied","Data":"6c0a7cad637c54eef2495abbcf51610942c5f6ccd8f2f2c0f225b642640f6196"} Dec 05 12:15:15 crc kubenswrapper[4763]: I1205 12:15:15.244196 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b8txf" event={"ID":"cf4b6edc-23e2-403b-afdb-65badf3ed399","Type":"ContainerStarted","Data":"7bd62ce18f42a2b60bee68fb18a9935272c40af5c4df4d24c0d54a66bd178a5d"} Dec 05 12:15:15 crc kubenswrapper[4763]: I1205 12:15:15.343065 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f5cp2"] Dec 05 12:15:15 crc kubenswrapper[4763]: E1205 12:15:15.343690 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="783891d5-537e-4f2f-b3ee-326588a913f6" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 05 12:15:15 crc kubenswrapper[4763]: I1205 12:15:15.343713 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="783891d5-537e-4f2f-b3ee-326588a913f6" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 05 12:15:15 crc kubenswrapper[4763]: I1205 12:15:15.343947 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="783891d5-537e-4f2f-b3ee-326588a913f6" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 05 12:15:15 crc kubenswrapper[4763]: I1205 12:15:15.344740 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f5cp2"] Dec 05 12:15:15 crc kubenswrapper[4763]: I1205 12:15:15.344862 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f5cp2" Dec 05 12:15:15 crc kubenswrapper[4763]: I1205 12:15:15.385056 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 12:15:15 crc kubenswrapper[4763]: I1205 12:15:15.385456 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 12:15:15 crc kubenswrapper[4763]: I1205 12:15:15.385695 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 12:15:15 crc kubenswrapper[4763]: I1205 12:15:15.385704 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-s7px5" Dec 05 12:15:15 crc kubenswrapper[4763]: I1205 12:15:15.397191 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a28052ee-43d2-4618-a981-ef115a2c3a00-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-f5cp2\" (UID: \"a28052ee-43d2-4618-a981-ef115a2c3a00\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f5cp2" Dec 05 12:15:15 crc kubenswrapper[4763]: I1205 12:15:15.397381 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4rw7\" (UniqueName: \"kubernetes.io/projected/a28052ee-43d2-4618-a981-ef115a2c3a00-kube-api-access-g4rw7\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-f5cp2\" (UID: \"a28052ee-43d2-4618-a981-ef115a2c3a00\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f5cp2" Dec 05 12:15:15 crc kubenswrapper[4763]: I1205 12:15:15.397483 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a28052ee-43d2-4618-a981-ef115a2c3a00-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-f5cp2\" (UID: \"a28052ee-43d2-4618-a981-ef115a2c3a00\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f5cp2" Dec 05 12:15:15 crc kubenswrapper[4763]: I1205 12:15:15.397544 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a28052ee-43d2-4618-a981-ef115a2c3a00-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-f5cp2\" (UID: \"a28052ee-43d2-4618-a981-ef115a2c3a00\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f5cp2" Dec 05 12:15:15 crc kubenswrapper[4763]: I1205 12:15:15.499361 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a28052ee-43d2-4618-a981-ef115a2c3a00-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-f5cp2\" (UID: \"a28052ee-43d2-4618-a981-ef115a2c3a00\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f5cp2" Dec 05 12:15:15 crc kubenswrapper[4763]: I1205 12:15:15.499431 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a28052ee-43d2-4618-a981-ef115a2c3a00-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-f5cp2\" (UID: \"a28052ee-43d2-4618-a981-ef115a2c3a00\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f5cp2" Dec 05 12:15:15 crc kubenswrapper[4763]: I1205 12:15:15.499505 4763 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a28052ee-43d2-4618-a981-ef115a2c3a00-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-f5cp2\" (UID: \"a28052ee-43d2-4618-a981-ef115a2c3a00\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f5cp2" Dec 05 12:15:15 crc kubenswrapper[4763]: I1205 12:15:15.499586 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4rw7\" (UniqueName: \"kubernetes.io/projected/a28052ee-43d2-4618-a981-ef115a2c3a00-kube-api-access-g4rw7\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-f5cp2\" (UID: \"a28052ee-43d2-4618-a981-ef115a2c3a00\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f5cp2" Dec 05 12:15:15 crc kubenswrapper[4763]: I1205 12:15:15.504598 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a28052ee-43d2-4618-a981-ef115a2c3a00-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-f5cp2\" (UID: \"a28052ee-43d2-4618-a981-ef115a2c3a00\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f5cp2" Dec 05 12:15:15 crc kubenswrapper[4763]: I1205 12:15:15.505389 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a28052ee-43d2-4618-a981-ef115a2c3a00-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-f5cp2\" (UID: \"a28052ee-43d2-4618-a981-ef115a2c3a00\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f5cp2" Dec 05 12:15:15 crc kubenswrapper[4763]: I1205 12:15:15.505532 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a28052ee-43d2-4618-a981-ef115a2c3a00-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-f5cp2\" (UID: \"a28052ee-43d2-4618-a981-ef115a2c3a00\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f5cp2" Dec 05 12:15:15 crc kubenswrapper[4763]: I1205 12:15:15.517159 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4rw7\" (UniqueName: \"kubernetes.io/projected/a28052ee-43d2-4618-a981-ef115a2c3a00-kube-api-access-g4rw7\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-f5cp2\" (UID: \"a28052ee-43d2-4618-a981-ef115a2c3a00\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f5cp2" Dec 05 12:15:15 crc kubenswrapper[4763]: I1205 12:15:15.717711 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f5cp2" Dec 05 12:15:16 crc kubenswrapper[4763]: W1205 12:15:16.300859 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda28052ee_43d2_4618_a981_ef115a2c3a00.slice/crio-3ee2819f4c393c0220ca57ee2b064ddce9dbf716fad58ac48cd307423813aee8 WatchSource:0}: Error finding container 3ee2819f4c393c0220ca57ee2b064ddce9dbf716fad58ac48cd307423813aee8: Status 404 returned error can't find the container with id 3ee2819f4c393c0220ca57ee2b064ddce9dbf716fad58ac48cd307423813aee8 Dec 05 12:15:16 crc kubenswrapper[4763]: I1205 12:15:16.320578 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f5cp2"] Dec 05 12:15:17 crc kubenswrapper[4763]: I1205 12:15:17.266230 4763 generic.go:334] "Generic (PLEG): container finished" podID="cf4b6edc-23e2-403b-afdb-65badf3ed399" containerID="563bcb373d2c8708b30ee66bbf5b9f44a65ac312e9d3091341af24bcb12d6d23" exitCode=0 Dec 05 12:15:17 crc kubenswrapper[4763]: I1205 12:15:17.266312 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b8txf" event={"ID":"cf4b6edc-23e2-403b-afdb-65badf3ed399","Type":"ContainerDied","Data":"563bcb373d2c8708b30ee66bbf5b9f44a65ac312e9d3091341af24bcb12d6d23"} Dec 05 12:15:17 crc kubenswrapper[4763]: I1205 12:15:17.273664 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f5cp2" event={"ID":"a28052ee-43d2-4618-a981-ef115a2c3a00","Type":"ContainerStarted","Data":"bc63af83d4a5da3763d8adb700a542d245d7333fc33c8ba2b7b5c505a59e572f"} Dec 05 12:15:17 crc kubenswrapper[4763]: I1205 12:15:17.273720 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f5cp2" event={"ID":"a28052ee-43d2-4618-a981-ef115a2c3a00","Type":"ContainerStarted","Data":"3ee2819f4c393c0220ca57ee2b064ddce9dbf716fad58ac48cd307423813aee8"} Dec 05 12:15:17 crc kubenswrapper[4763]: I1205 12:15:17.302725 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f5cp2" podStartSLOduration=1.8263065219999999 podStartE2EDuration="2.302700895s" podCreationTimestamp="2025-12-05 12:15:15 +0000 UTC" firstStartedPulling="2025-12-05 12:15:16.303965392 +0000 UTC m=+1600.796680135" lastFinishedPulling="2025-12-05 12:15:16.780359785 +0000 UTC m=+1601.273074508" observedRunningTime="2025-12-05 12:15:17.297980603 +0000 UTC m=+1601.790695326" watchObservedRunningTime="2025-12-05 12:15:17.302700895 +0000 UTC m=+1601.795415618" Dec 05 12:15:20 crc kubenswrapper[4763]: I1205 12:15:20.304306 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b8txf" event={"ID":"cf4b6edc-23e2-403b-afdb-65badf3ed399","Type":"ContainerStarted","Data":"e38b27dcb1fcecb6b7e2a909ebc0fc82d5dc28812f652ae30eba6bcac9a7f9fb"} Dec 05 12:15:20 crc kubenswrapper[4763]: I1205 12:15:20.324172 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b8txf" podStartSLOduration=4.31965548 podStartE2EDuration="7.324150893s" podCreationTimestamp="2025-12-05 12:15:13 +0000 UTC" firstStartedPulling="2025-12-05 12:15:15.24640884 +0000 UTC m=+1599.739123573" lastFinishedPulling="2025-12-05 12:15:18.250904263 +0000 UTC m=+1602.743618986" observedRunningTime="2025-12-05 
12:15:20.321615801 +0000 UTC m=+1604.814330534" watchObservedRunningTime="2025-12-05 12:15:20.324150893 +0000 UTC m=+1604.816865616" Dec 05 12:15:23 crc kubenswrapper[4763]: I1205 12:15:23.808444 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b8txf" Dec 05 12:15:23 crc kubenswrapper[4763]: I1205 12:15:23.808748 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b8txf" Dec 05 12:15:23 crc kubenswrapper[4763]: I1205 12:15:23.872021 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b8txf" Dec 05 12:15:24 crc kubenswrapper[4763]: I1205 12:15:24.395008 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b8txf" Dec 05 12:15:24 crc kubenswrapper[4763]: I1205 12:15:24.446335 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b8txf"] Dec 05 12:15:26 crc kubenswrapper[4763]: I1205 12:15:26.366787 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b8txf" podUID="cf4b6edc-23e2-403b-afdb-65badf3ed399" containerName="registry-server" containerID="cri-o://e38b27dcb1fcecb6b7e2a909ebc0fc82d5dc28812f652ae30eba6bcac9a7f9fb" gracePeriod=2 Dec 05 12:15:26 crc kubenswrapper[4763]: I1205 12:15:26.805450 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b8txf" Dec 05 12:15:26 crc kubenswrapper[4763]: I1205 12:15:26.935355 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5z7n5\" (UniqueName: \"kubernetes.io/projected/cf4b6edc-23e2-403b-afdb-65badf3ed399-kube-api-access-5z7n5\") pod \"cf4b6edc-23e2-403b-afdb-65badf3ed399\" (UID: \"cf4b6edc-23e2-403b-afdb-65badf3ed399\") " Dec 05 12:15:26 crc kubenswrapper[4763]: I1205 12:15:26.935799 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf4b6edc-23e2-403b-afdb-65badf3ed399-catalog-content\") pod \"cf4b6edc-23e2-403b-afdb-65badf3ed399\" (UID: \"cf4b6edc-23e2-403b-afdb-65badf3ed399\") " Dec 05 12:15:26 crc kubenswrapper[4763]: I1205 12:15:26.935872 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf4b6edc-23e2-403b-afdb-65badf3ed399-utilities\") pod \"cf4b6edc-23e2-403b-afdb-65badf3ed399\" (UID: \"cf4b6edc-23e2-403b-afdb-65badf3ed399\") " Dec 05 12:15:26 crc kubenswrapper[4763]: I1205 12:15:26.937577 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf4b6edc-23e2-403b-afdb-65badf3ed399-utilities" (OuterVolumeSpecName: "utilities") pod "cf4b6edc-23e2-403b-afdb-65badf3ed399" (UID: "cf4b6edc-23e2-403b-afdb-65badf3ed399"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:15:26 crc kubenswrapper[4763]: I1205 12:15:26.958255 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf4b6edc-23e2-403b-afdb-65badf3ed399-kube-api-access-5z7n5" (OuterVolumeSpecName: "kube-api-access-5z7n5") pod "cf4b6edc-23e2-403b-afdb-65badf3ed399" (UID: "cf4b6edc-23e2-403b-afdb-65badf3ed399"). InnerVolumeSpecName "kube-api-access-5z7n5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:15:26 crc kubenswrapper[4763]: I1205 12:15:26.995898 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf4b6edc-23e2-403b-afdb-65badf3ed399-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cf4b6edc-23e2-403b-afdb-65badf3ed399" (UID: "cf4b6edc-23e2-403b-afdb-65badf3ed399"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:15:27 crc kubenswrapper[4763]: I1205 12:15:27.038988 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf4b6edc-23e2-403b-afdb-65badf3ed399-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 12:15:27 crc kubenswrapper[4763]: I1205 12:15:27.039037 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf4b6edc-23e2-403b-afdb-65badf3ed399-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 12:15:27 crc kubenswrapper[4763]: I1205 12:15:27.039051 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5z7n5\" (UniqueName: \"kubernetes.io/projected/cf4b6edc-23e2-403b-afdb-65badf3ed399-kube-api-access-5z7n5\") on node \"crc\" DevicePath \"\"" Dec 05 12:15:27 crc kubenswrapper[4763]: I1205 12:15:27.378697 4763 generic.go:334] "Generic (PLEG): container finished" podID="cf4b6edc-23e2-403b-afdb-65badf3ed399" containerID="e38b27dcb1fcecb6b7e2a909ebc0fc82d5dc28812f652ae30eba6bcac9a7f9fb" exitCode=0 Dec 05 12:15:27 crc kubenswrapper[4763]: I1205 12:15:27.378746 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b8txf" event={"ID":"cf4b6edc-23e2-403b-afdb-65badf3ed399","Type":"ContainerDied","Data":"e38b27dcb1fcecb6b7e2a909ebc0fc82d5dc28812f652ae30eba6bcac9a7f9fb"} Dec 05 12:15:27 crc kubenswrapper[4763]: I1205 12:15:27.378802 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b8txf" event={"ID":"cf4b6edc-23e2-403b-afdb-65badf3ed399","Type":"ContainerDied","Data":"7bd62ce18f42a2b60bee68fb18a9935272c40af5c4df4d24c0d54a66bd178a5d"} Dec 05 12:15:27 crc kubenswrapper[4763]: I1205 12:15:27.378819 4763 scope.go:117] "RemoveContainer" containerID="e38b27dcb1fcecb6b7e2a909ebc0fc82d5dc28812f652ae30eba6bcac9a7f9fb" Dec 05 12:15:27 crc kubenswrapper[4763]: I1205 12:15:27.378828 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b8txf" Dec 05 12:15:27 crc kubenswrapper[4763]: I1205 12:15:27.401967 4763 scope.go:117] "RemoveContainer" containerID="563bcb373d2c8708b30ee66bbf5b9f44a65ac312e9d3091341af24bcb12d6d23" Dec 05 12:15:27 crc kubenswrapper[4763]: I1205 12:15:27.418192 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b8txf"] Dec 05 12:15:27 crc kubenswrapper[4763]: I1205 12:15:27.430342 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-b8txf"] Dec 05 12:15:27 crc kubenswrapper[4763]: I1205 12:15:27.444803 4763 scope.go:117] "RemoveContainer" containerID="6c0a7cad637c54eef2495abbcf51610942c5f6ccd8f2f2c0f225b642640f6196" Dec 05 12:15:27 crc kubenswrapper[4763]: I1205 12:15:27.470430 4763 scope.go:117] "RemoveContainer" containerID="e38b27dcb1fcecb6b7e2a909ebc0fc82d5dc28812f652ae30eba6bcac9a7f9fb" Dec 05 12:15:27 crc kubenswrapper[4763]: E1205 12:15:27.471467 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e38b27dcb1fcecb6b7e2a909ebc0fc82d5dc28812f652ae30eba6bcac9a7f9fb\": container with ID starting with e38b27dcb1fcecb6b7e2a909ebc0fc82d5dc28812f652ae30eba6bcac9a7f9fb not found: ID does not exist" containerID="e38b27dcb1fcecb6b7e2a909ebc0fc82d5dc28812f652ae30eba6bcac9a7f9fb" Dec 05 12:15:27 crc kubenswrapper[4763]: I1205 12:15:27.471500 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e38b27dcb1fcecb6b7e2a909ebc0fc82d5dc28812f652ae30eba6bcac9a7f9fb"} err="failed to get container status \"e38b27dcb1fcecb6b7e2a909ebc0fc82d5dc28812f652ae30eba6bcac9a7f9fb\": rpc error: code = NotFound desc = could not find container \"e38b27dcb1fcecb6b7e2a909ebc0fc82d5dc28812f652ae30eba6bcac9a7f9fb\": container with ID starting with e38b27dcb1fcecb6b7e2a909ebc0fc82d5dc28812f652ae30eba6bcac9a7f9fb not found: ID does not exist" Dec 05 12:15:27 crc kubenswrapper[4763]: I1205 12:15:27.471520 4763 scope.go:117] "RemoveContainer" containerID="563bcb373d2c8708b30ee66bbf5b9f44a65ac312e9d3091341af24bcb12d6d23" Dec 05 12:15:27 crc kubenswrapper[4763]: E1205 12:15:27.472047 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"563bcb373d2c8708b30ee66bbf5b9f44a65ac312e9d3091341af24bcb12d6d23\": container with ID starting with 563bcb373d2c8708b30ee66bbf5b9f44a65ac312e9d3091341af24bcb12d6d23 not found: ID does not exist" containerID="563bcb373d2c8708b30ee66bbf5b9f44a65ac312e9d3091341af24bcb12d6d23" Dec 05 12:15:27 crc kubenswrapper[4763]: I1205 12:15:27.472079 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"563bcb373d2c8708b30ee66bbf5b9f44a65ac312e9d3091341af24bcb12d6d23"} err="failed to get container status \"563bcb373d2c8708b30ee66bbf5b9f44a65ac312e9d3091341af24bcb12d6d23\": rpc error: code = NotFound desc = could not find container \"563bcb373d2c8708b30ee66bbf5b9f44a65ac312e9d3091341af24bcb12d6d23\": container with ID starting with 563bcb373d2c8708b30ee66bbf5b9f44a65ac312e9d3091341af24bcb12d6d23 not found: ID does not exist" Dec 05 12:15:27 crc kubenswrapper[4763]: I1205 12:15:27.472097 4763 scope.go:117] "RemoveContainer" containerID="6c0a7cad637c54eef2495abbcf51610942c5f6ccd8f2f2c0f225b642640f6196" Dec 05 12:15:27 crc kubenswrapper[4763]: E1205 12:15:27.472438 4763 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"6c0a7cad637c54eef2495abbcf51610942c5f6ccd8f2f2c0f225b642640f6196\": container with ID starting with 6c0a7cad637c54eef2495abbcf51610942c5f6ccd8f2f2c0f225b642640f6196 not found: ID does not exist" containerID="6c0a7cad637c54eef2495abbcf51610942c5f6ccd8f2f2c0f225b642640f6196" Dec 05 12:15:27 crc kubenswrapper[4763]: I1205 12:15:27.472472 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c0a7cad637c54eef2495abbcf51610942c5f6ccd8f2f2c0f225b642640f6196"} err="failed to get container status \"6c0a7cad637c54eef2495abbcf51610942c5f6ccd8f2f2c0f225b642640f6196\": rpc error: code = NotFound desc = could not find container \"6c0a7cad637c54eef2495abbcf51610942c5f6ccd8f2f2c0f225b642640f6196\": container with ID starting with 6c0a7cad637c54eef2495abbcf51610942c5f6ccd8f2f2c0f225b642640f6196 not found: ID does not exist" Dec 05 12:15:27 crc kubenswrapper[4763]: I1205 12:15:27.797120 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf4b6edc-23e2-403b-afdb-65badf3ed399" path="/var/lib/kubelet/pods/cf4b6edc-23e2-403b-afdb-65badf3ed399/volumes" Dec 05 12:15:37 crc kubenswrapper[4763]: I1205 12:15:37.544418 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 12:15:37 crc kubenswrapper[4763]: I1205 12:15:37.545595 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 12:15:37 crc kubenswrapper[4763]: I1205 12:15:37.545680 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" Dec 05 12:15:37 crc kubenswrapper[4763]: I1205 12:15:37.547116 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"94256af3aa58cdf57218e8ce9dc9bf5828d72b63abcb945a060e96b526de29bf"} pod="openshift-machine-config-operator/machine-config-daemon-xpgln" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 12:15:37 crc kubenswrapper[4763]: I1205 12:15:37.547203 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" containerID="cri-o://94256af3aa58cdf57218e8ce9dc9bf5828d72b63abcb945a060e96b526de29bf" gracePeriod=600 Dec 05 12:15:37 crc kubenswrapper[4763]: E1205 12:15:37.669323 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:15:38 crc kubenswrapper[4763]: I1205 12:15:38.485154 4763 generic.go:334] 
"Generic (PLEG): container finished" podID="96338136-6831-49d0-9eb9-77d1205c6afb" containerID="94256af3aa58cdf57218e8ce9dc9bf5828d72b63abcb945a060e96b526de29bf" exitCode=0 Dec 05 12:15:38 crc kubenswrapper[4763]: I1205 12:15:38.485232 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerDied","Data":"94256af3aa58cdf57218e8ce9dc9bf5828d72b63abcb945a060e96b526de29bf"} Dec 05 12:15:38 crc kubenswrapper[4763]: I1205 12:15:38.485735 4763 scope.go:117] "RemoveContainer" containerID="5c8f5e57fa75e813c5cdc2f19d0235194d315983bfff446fbbe3434d7a817539" Dec 05 12:15:38 crc kubenswrapper[4763]: I1205 12:15:38.486401 4763 scope.go:117] "RemoveContainer" containerID="94256af3aa58cdf57218e8ce9dc9bf5828d72b63abcb945a060e96b526de29bf" Dec 05 12:15:38 crc kubenswrapper[4763]: E1205 12:15:38.486699 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:15:43 crc kubenswrapper[4763]: I1205 12:15:43.485874 4763 scope.go:117] "RemoveContainer" containerID="1cf3dcc349b15d3a63b8082eea8f18ad7b5e6394db73867be341fb3080e88c6c" Dec 05 12:15:43 crc kubenswrapper[4763]: I1205 12:15:43.516436 4763 scope.go:117] "RemoveContainer" containerID="af09e34a4d4036881af595378a01e6869cd5e9c873a6faf16623dcd01918e2bb" Dec 05 12:15:43 crc kubenswrapper[4763]: I1205 12:15:43.584985 4763 scope.go:117] "RemoveContainer" containerID="782737352cff36ff63ed021324bca8ccf3ce1fe162fe80c46e3a48b895be618b" Dec 05 12:15:52 crc kubenswrapper[4763]: I1205 12:15:52.784978 4763 scope.go:117] "RemoveContainer" containerID="94256af3aa58cdf57218e8ce9dc9bf5828d72b63abcb945a060e96b526de29bf" Dec 05 12:15:52 crc kubenswrapper[4763]: E1205 12:15:52.785890 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:16:07 crc kubenswrapper[4763]: I1205 12:16:07.786876 4763 scope.go:117] "RemoveContainer" containerID="94256af3aa58cdf57218e8ce9dc9bf5828d72b63abcb945a060e96b526de29bf" Dec 05 12:16:07 crc kubenswrapper[4763]: E1205 12:16:07.787634 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:16:22 crc kubenswrapper[4763]: I1205 12:16:22.784099 4763 scope.go:117] "RemoveContainer" containerID="94256af3aa58cdf57218e8ce9dc9bf5828d72b63abcb945a060e96b526de29bf" Dec 05 12:16:22 crc kubenswrapper[4763]: E1205 12:16:22.784746 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:16:35 crc kubenswrapper[4763]: I1205 12:16:35.784926 4763 scope.go:117] "RemoveContainer" containerID="94256af3aa58cdf57218e8ce9dc9bf5828d72b63abcb945a060e96b526de29bf" Dec 05 12:16:35 crc kubenswrapper[4763]: E1205 12:16:35.785960 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:16:43 crc kubenswrapper[4763]: I1205 12:16:43.720265 4763 scope.go:117] "RemoveContainer" containerID="ac9a0ca04d43a2b4e83535243429f1e5e917c180a3768e368cbb113ac1b89f13" Dec 05 12:16:43 crc kubenswrapper[4763]: I1205 12:16:43.757975 4763 scope.go:117] "RemoveContainer" containerID="28da7ff0bc1ab19482e76d0dcbf645dff76f55c5d6c01a33e0beea80dab912ec" Dec 05 12:16:50 crc kubenswrapper[4763]: I1205 12:16:50.784947 4763 scope.go:117] "RemoveContainer" containerID="94256af3aa58cdf57218e8ce9dc9bf5828d72b63abcb945a060e96b526de29bf" Dec 05 12:16:50 crc kubenswrapper[4763]: E1205 12:16:50.785645 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:17:05 crc kubenswrapper[4763]: I1205 12:17:05.793902 4763 scope.go:117] "RemoveContainer" containerID="94256af3aa58cdf57218e8ce9dc9bf5828d72b63abcb945a060e96b526de29bf" Dec 05 12:17:05 crc kubenswrapper[4763]: E1205 12:17:05.794664 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:17:17 crc kubenswrapper[4763]: I1205 12:17:17.784713 4763 scope.go:117] "RemoveContainer" containerID="94256af3aa58cdf57218e8ce9dc9bf5828d72b63abcb945a060e96b526de29bf" Dec 05 12:17:17 crc kubenswrapper[4763]: E1205 12:17:17.786310 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:17:28 crc kubenswrapper[4763]: I1205 12:17:28.784615 4763 scope.go:117] "RemoveContainer" 
containerID="94256af3aa58cdf57218e8ce9dc9bf5828d72b63abcb945a060e96b526de29bf" Dec 05 12:17:28 crc kubenswrapper[4763]: E1205 12:17:28.785580 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:17:39 crc kubenswrapper[4763]: I1205 12:17:39.783549 4763 scope.go:117] "RemoveContainer" containerID="94256af3aa58cdf57218e8ce9dc9bf5828d72b63abcb945a060e96b526de29bf" Dec 05 12:17:39 crc kubenswrapper[4763]: E1205 12:17:39.784314 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:17:51 crc kubenswrapper[4763]: I1205 12:17:51.784218 4763 scope.go:117] "RemoveContainer" containerID="94256af3aa58cdf57218e8ce9dc9bf5828d72b63abcb945a060e96b526de29bf" Dec 05 12:17:51 crc kubenswrapper[4763]: E1205 12:17:51.785034 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:18:03 crc kubenswrapper[4763]: I1205 12:18:03.784896 4763 scope.go:117] "RemoveContainer" containerID="94256af3aa58cdf57218e8ce9dc9bf5828d72b63abcb945a060e96b526de29bf" Dec 05 12:18:03 crc kubenswrapper[4763]: E1205 12:18:03.785651 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:18:14 crc kubenswrapper[4763]: I1205 12:18:14.785343 4763 scope.go:117] "RemoveContainer" containerID="94256af3aa58cdf57218e8ce9dc9bf5828d72b63abcb945a060e96b526de29bf" Dec 05 12:18:14 crc kubenswrapper[4763]: E1205 12:18:14.786287 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:18:27 crc kubenswrapper[4763]: I1205 12:18:27.784271 4763 scope.go:117] "RemoveContainer" containerID="94256af3aa58cdf57218e8ce9dc9bf5828d72b63abcb945a060e96b526de29bf" Dec 05 12:18:27 crc kubenswrapper[4763]: E1205 12:18:27.785211 4763 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:18:42 crc kubenswrapper[4763]: I1205 12:18:42.783570 4763 scope.go:117] "RemoveContainer" containerID="94256af3aa58cdf57218e8ce9dc9bf5828d72b63abcb945a060e96b526de29bf" Dec 05 12:18:42 crc kubenswrapper[4763]: E1205 12:18:42.784357 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:18:47 crc kubenswrapper[4763]: I1205 12:18:47.326907 4763 generic.go:334] "Generic (PLEG): container finished" podID="a28052ee-43d2-4618-a981-ef115a2c3a00" containerID="bc63af83d4a5da3763d8adb700a542d245d7333fc33c8ba2b7b5c505a59e572f" exitCode=0 Dec 05 12:18:47 crc kubenswrapper[4763]: I1205 12:18:47.327091 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f5cp2" event={"ID":"a28052ee-43d2-4618-a981-ef115a2c3a00","Type":"ContainerDied","Data":"bc63af83d4a5da3763d8adb700a542d245d7333fc33c8ba2b7b5c505a59e572f"} Dec 05 12:18:48 crc kubenswrapper[4763]: I1205 12:18:48.849353 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f5cp2" Dec 05 12:18:49 crc kubenswrapper[4763]: I1205 12:18:49.010458 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a28052ee-43d2-4618-a981-ef115a2c3a00-ssh-key\") pod \"a28052ee-43d2-4618-a981-ef115a2c3a00\" (UID: \"a28052ee-43d2-4618-a981-ef115a2c3a00\") " Dec 05 12:18:49 crc kubenswrapper[4763]: I1205 12:18:49.010586 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a28052ee-43d2-4618-a981-ef115a2c3a00-inventory\") pod \"a28052ee-43d2-4618-a981-ef115a2c3a00\" (UID: \"a28052ee-43d2-4618-a981-ef115a2c3a00\") " Dec 05 12:18:49 crc kubenswrapper[4763]: I1205 12:18:49.010711 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a28052ee-43d2-4618-a981-ef115a2c3a00-bootstrap-combined-ca-bundle\") pod \"a28052ee-43d2-4618-a981-ef115a2c3a00\" (UID: \"a28052ee-43d2-4618-a981-ef115a2c3a00\") " Dec 05 12:18:49 crc kubenswrapper[4763]: I1205 12:18:49.010880 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4rw7\" (UniqueName: \"kubernetes.io/projected/a28052ee-43d2-4618-a981-ef115a2c3a00-kube-api-access-g4rw7\") pod \"a28052ee-43d2-4618-a981-ef115a2c3a00\" (UID: \"a28052ee-43d2-4618-a981-ef115a2c3a00\") " Dec 05 12:18:49 crc kubenswrapper[4763]: I1205 12:18:49.016950 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a28052ee-43d2-4618-a981-ef115a2c3a00-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: 
"bootstrap-combined-ca-bundle") pod "a28052ee-43d2-4618-a981-ef115a2c3a00" (UID: "a28052ee-43d2-4618-a981-ef115a2c3a00"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:18:49 crc kubenswrapper[4763]: I1205 12:18:49.018153 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a28052ee-43d2-4618-a981-ef115a2c3a00-kube-api-access-g4rw7" (OuterVolumeSpecName: "kube-api-access-g4rw7") pod "a28052ee-43d2-4618-a981-ef115a2c3a00" (UID: "a28052ee-43d2-4618-a981-ef115a2c3a00"). InnerVolumeSpecName "kube-api-access-g4rw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:18:49 crc kubenswrapper[4763]: I1205 12:18:49.050931 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a28052ee-43d2-4618-a981-ef115a2c3a00-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a28052ee-43d2-4618-a981-ef115a2c3a00" (UID: "a28052ee-43d2-4618-a981-ef115a2c3a00"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:18:49 crc kubenswrapper[4763]: I1205 12:18:49.055927 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a28052ee-43d2-4618-a981-ef115a2c3a00-inventory" (OuterVolumeSpecName: "inventory") pod "a28052ee-43d2-4618-a981-ef115a2c3a00" (UID: "a28052ee-43d2-4618-a981-ef115a2c3a00"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:18:49 crc kubenswrapper[4763]: I1205 12:18:49.113429 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a28052ee-43d2-4618-a981-ef115a2c3a00-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 12:18:49 crc kubenswrapper[4763]: I1205 12:18:49.113473 4763 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a28052ee-43d2-4618-a981-ef115a2c3a00-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:18:49 crc kubenswrapper[4763]: I1205 12:18:49.113490 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4rw7\" (UniqueName: \"kubernetes.io/projected/a28052ee-43d2-4618-a981-ef115a2c3a00-kube-api-access-g4rw7\") on node \"crc\" DevicePath \"\"" Dec 05 12:18:49 crc kubenswrapper[4763]: I1205 12:18:49.113502 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a28052ee-43d2-4618-a981-ef115a2c3a00-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 12:18:49 crc kubenswrapper[4763]: I1205 12:18:49.345062 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f5cp2" event={"ID":"a28052ee-43d2-4618-a981-ef115a2c3a00","Type":"ContainerDied","Data":"3ee2819f4c393c0220ca57ee2b064ddce9dbf716fad58ac48cd307423813aee8"} Dec 05 12:18:49 crc kubenswrapper[4763]: I1205 12:18:49.345109 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f5cp2" Dec 05 12:18:49 crc kubenswrapper[4763]: I1205 12:18:49.345159 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ee2819f4c393c0220ca57ee2b064ddce9dbf716fad58ac48cd307423813aee8" Dec 05 12:18:49 crc kubenswrapper[4763]: I1205 12:18:49.441186 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qlwfh"] Dec 05 12:18:49 crc kubenswrapper[4763]: E1205 12:18:49.441599 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf4b6edc-23e2-403b-afdb-65badf3ed399" containerName="extract-utilities" Dec 05 12:18:49 crc kubenswrapper[4763]: I1205 12:18:49.441619 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf4b6edc-23e2-403b-afdb-65badf3ed399" containerName="extract-utilities" Dec 05 12:18:49 crc kubenswrapper[4763]: E1205 12:18:49.441642 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a28052ee-43d2-4618-a981-ef115a2c3a00" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 05 12:18:49 crc kubenswrapper[4763]: I1205 12:18:49.441652 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a28052ee-43d2-4618-a981-ef115a2c3a00" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 05 12:18:49 crc kubenswrapper[4763]: E1205 12:18:49.441694 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf4b6edc-23e2-403b-afdb-65badf3ed399" containerName="registry-server" Dec 05 12:18:49 crc kubenswrapper[4763]: I1205 12:18:49.441703 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf4b6edc-23e2-403b-afdb-65badf3ed399" containerName="registry-server" Dec 05 12:18:49 crc kubenswrapper[4763]: E1205 12:18:49.441717 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf4b6edc-23e2-403b-afdb-65badf3ed399" containerName="extract-content" Dec 05 12:18:49 crc kubenswrapper[4763]: I1205 12:18:49.441725 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf4b6edc-23e2-403b-afdb-65badf3ed399" containerName="extract-content" Dec 05 12:18:49 crc kubenswrapper[4763]: I1205 12:18:49.441983 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf4b6edc-23e2-403b-afdb-65badf3ed399" containerName="registry-server" Dec 05 12:18:49 crc kubenswrapper[4763]: I1205 12:18:49.442009 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a28052ee-43d2-4618-a981-ef115a2c3a00" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 05 12:18:49 crc kubenswrapper[4763]: I1205 12:18:49.442721 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qlwfh" Dec 05 12:18:49 crc kubenswrapper[4763]: I1205 12:18:49.448493 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 12:18:49 crc kubenswrapper[4763]: I1205 12:18:49.448592 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 12:18:49 crc kubenswrapper[4763]: I1205 12:18:49.448752 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 12:18:49 crc kubenswrapper[4763]: I1205 12:18:49.448842 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-s7px5" Dec 05 12:18:49 crc kubenswrapper[4763]: I1205 12:18:49.619957 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8c7e581d-5684-4557-96f8-5502a00e1da1-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-qlwfh\" (UID: \"8c7e581d-5684-4557-96f8-5502a00e1da1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qlwfh" Dec 05 12:18:49 crc kubenswrapper[4763]: I1205 12:18:49.620130 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c94n\" (UniqueName: \"kubernetes.io/projected/8c7e581d-5684-4557-96f8-5502a00e1da1-kube-api-access-7c94n\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-qlwfh\" (UID: \"8c7e581d-5684-4557-96f8-5502a00e1da1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qlwfh" Dec 05 12:18:49 crc kubenswrapper[4763]: I1205 12:18:49.620300 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c7e581d-5684-4557-96f8-5502a00e1da1-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-qlwfh\" (UID: \"8c7e581d-5684-4557-96f8-5502a00e1da1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qlwfh" Dec 05 12:18:49 crc kubenswrapper[4763]: I1205 12:18:49.667448 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qlwfh"] Dec 05 12:18:49 crc kubenswrapper[4763]: I1205 12:18:49.727394 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c94n\" (UniqueName: \"kubernetes.io/projected/8c7e581d-5684-4557-96f8-5502a00e1da1-kube-api-access-7c94n\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-qlwfh\" (UID: \"8c7e581d-5684-4557-96f8-5502a00e1da1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qlwfh" Dec 05 12:18:49 crc kubenswrapper[4763]: I1205 12:18:49.727888 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c7e581d-5684-4557-96f8-5502a00e1da1-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-qlwfh\" (UID: \"8c7e581d-5684-4557-96f8-5502a00e1da1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qlwfh" Dec 05 12:18:49 crc kubenswrapper[4763]: I1205 12:18:49.728559 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8c7e581d-5684-4557-96f8-5502a00e1da1-ssh-key\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-qlwfh\" (UID: \"8c7e581d-5684-4557-96f8-5502a00e1da1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qlwfh" Dec 05 12:18:49 crc kubenswrapper[4763]: I1205 12:18:49.731717 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c7e581d-5684-4557-96f8-5502a00e1da1-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-qlwfh\" (UID: \"8c7e581d-5684-4557-96f8-5502a00e1da1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qlwfh" Dec 05 12:18:49 crc kubenswrapper[4763]: I1205 12:18:49.736900 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8c7e581d-5684-4557-96f8-5502a00e1da1-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-qlwfh\" (UID: \"8c7e581d-5684-4557-96f8-5502a00e1da1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qlwfh" Dec 05 12:18:49 crc kubenswrapper[4763]: I1205 12:18:49.747612 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c94n\" (UniqueName: \"kubernetes.io/projected/8c7e581d-5684-4557-96f8-5502a00e1da1-kube-api-access-7c94n\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-qlwfh\" (UID: \"8c7e581d-5684-4557-96f8-5502a00e1da1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qlwfh" Dec 05 12:18:49 crc kubenswrapper[4763]: I1205 12:18:49.761471 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qlwfh" Dec 05 12:18:50 crc kubenswrapper[4763]: I1205 12:18:50.316499 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 12:18:50 crc kubenswrapper[4763]: I1205 12:18:50.321265 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qlwfh"] Dec 05 12:18:50 crc kubenswrapper[4763]: I1205 12:18:50.355926 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qlwfh" event={"ID":"8c7e581d-5684-4557-96f8-5502a00e1da1","Type":"ContainerStarted","Data":"653ac2ae013101ce077f1cfea69bacc39e3d9f3c57b94ea31ccad9c30cb635c1"} Dec 05 12:18:51 crc kubenswrapper[4763]: I1205 12:18:51.366956 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qlwfh" event={"ID":"8c7e581d-5684-4557-96f8-5502a00e1da1","Type":"ContainerStarted","Data":"646a53a31e697f5ba648b8b0d85626915a325fbf6e9d74693cc6276e939e356b"} Dec 05 12:18:51 crc kubenswrapper[4763]: I1205 12:18:51.393250 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qlwfh" podStartSLOduration=1.940645454 podStartE2EDuration="2.393230049s" podCreationTimestamp="2025-12-05 12:18:49 +0000 UTC" firstStartedPulling="2025-12-05 12:18:50.316274055 +0000 UTC m=+1814.808988778" lastFinishedPulling="2025-12-05 12:18:50.76885865 +0000 UTC m=+1815.261573373" observedRunningTime="2025-12-05 12:18:51.382648752 +0000 UTC m=+1815.875363475" watchObservedRunningTime="2025-12-05 12:18:51.393230049 +0000 UTC m=+1815.885944772" Dec 05 12:18:53 crc kubenswrapper[4763]: I1205 12:18:53.784642 4763 scope.go:117] "RemoveContainer" 
containerID="94256af3aa58cdf57218e8ce9dc9bf5828d72b63abcb945a060e96b526de29bf" Dec 05 12:18:53 crc kubenswrapper[4763]: E1205 12:18:53.785126 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:18:56 crc kubenswrapper[4763]: I1205 12:18:56.052959 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-j9qhw"] Dec 05 12:18:56 crc kubenswrapper[4763]: I1205 12:18:56.063062 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-6476-account-create-update-g45vx"] Dec 05 12:18:56 crc kubenswrapper[4763]: I1205 12:18:56.076862 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-vxwjr"] Dec 05 12:18:56 crc kubenswrapper[4763]: I1205 12:18:56.087530 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-6476-account-create-update-g45vx"] Dec 05 12:18:56 crc kubenswrapper[4763]: I1205 12:18:56.096533 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-j9qhw"] Dec 05 12:18:56 crc kubenswrapper[4763]: I1205 12:18:56.106427 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-create-s9b6d"] Dec 05 12:18:56 crc kubenswrapper[4763]: I1205 12:18:56.115417 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-fc84-account-create-update-7nn47"] Dec 05 12:18:56 crc kubenswrapper[4763]: I1205 12:18:56.127464 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-phxtr"] Dec 05 12:18:56 crc kubenswrapper[4763]: I1205 12:18:56.137081 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-vxwjr"] Dec 05 12:18:56 crc kubenswrapper[4763]: I1205 12:18:56.144663 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-phxtr"] Dec 05 12:18:56 crc kubenswrapper[4763]: I1205 12:18:56.152600 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-fc84-account-create-update-7nn47"] Dec 05 12:18:56 crc kubenswrapper[4763]: I1205 12:18:56.163095 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-create-s9b6d"] Dec 05 12:18:57 crc kubenswrapper[4763]: I1205 12:18:57.030081 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-01df-account-create-update-9485v"] Dec 05 12:18:57 crc kubenswrapper[4763]: I1205 12:18:57.041355 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-37d5-account-create-update-d25r6"] Dec 05 12:18:57 crc kubenswrapper[4763]: I1205 12:18:57.053248 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-01df-account-create-update-9485v"] Dec 05 12:18:57 crc kubenswrapper[4763]: I1205 12:18:57.065373 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-37d5-account-create-update-d25r6"] Dec 05 12:18:57 crc kubenswrapper[4763]: I1205 12:18:57.805355 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ad4cf77-6dbb-4cd3-b404-01f3d5752403" path="/var/lib/kubelet/pods/3ad4cf77-6dbb-4cd3-b404-01f3d5752403/volumes" Dec 05 12:18:57 crc kubenswrapper[4763]: 
I1205 12:18:57.807347 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74cdf9d1-2d3a-4822-8353-508112d2bf7d" path="/var/lib/kubelet/pods/74cdf9d1-2d3a-4822-8353-508112d2bf7d/volumes" Dec 05 12:18:57 crc kubenswrapper[4763]: I1205 12:18:57.808906 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="752707c1-f306-4d60-bd81-7c77a2df4e4f" path="/var/lib/kubelet/pods/752707c1-f306-4d60-bd81-7c77a2df4e4f/volumes" Dec 05 12:18:57 crc kubenswrapper[4763]: I1205 12:18:57.810310 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e8af298-3905-4251-b35c-77f7a535aafb" path="/var/lib/kubelet/pods/8e8af298-3905-4251-b35c-77f7a535aafb/volumes" Dec 05 12:18:57 crc kubenswrapper[4763]: I1205 12:18:57.813534 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad6e2524-1d54-4e4e-834d-2176cb504743" path="/var/lib/kubelet/pods/ad6e2524-1d54-4e4e-834d-2176cb504743/volumes" Dec 05 12:18:57 crc kubenswrapper[4763]: I1205 12:18:57.814932 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6b89b04-af02-4afb-bdb4-99c97bcfc9e0" path="/var/lib/kubelet/pods/b6b89b04-af02-4afb-bdb4-99c97bcfc9e0/volumes" Dec 05 12:18:57 crc kubenswrapper[4763]: I1205 12:18:57.815901 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c72e8b76-2c58-49de-af50-45474900f16f" path="/var/lib/kubelet/pods/c72e8b76-2c58-49de-af50-45474900f16f/volumes" Dec 05 12:18:57 crc kubenswrapper[4763]: I1205 12:18:57.816555 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e09da9a3-6f2b-4b62-8953-acb7ea6a258c" path="/var/lib/kubelet/pods/e09da9a3-6f2b-4b62-8953-acb7ea6a258c/volumes" Dec 05 12:19:04 crc kubenswrapper[4763]: I1205 12:19:04.784496 4763 scope.go:117] "RemoveContainer" containerID="94256af3aa58cdf57218e8ce9dc9bf5828d72b63abcb945a060e96b526de29bf" Dec 05 12:19:04 crc kubenswrapper[4763]: E1205 12:19:04.784996 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:19:17 crc kubenswrapper[4763]: I1205 12:19:17.047163 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-9c79-account-create-update-n7z54"] Dec 05 12:19:17 crc kubenswrapper[4763]: I1205 12:19:17.060900 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-9c79-account-create-update-n7z54"] Dec 05 12:19:17 crc kubenswrapper[4763]: I1205 12:19:17.784644 4763 scope.go:117] "RemoveContainer" containerID="94256af3aa58cdf57218e8ce9dc9bf5828d72b63abcb945a060e96b526de29bf" Dec 05 12:19:17 crc kubenswrapper[4763]: E1205 12:19:17.784982 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:19:17 crc kubenswrapper[4763]: I1205 12:19:17.795369 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d64503e5-6fd1-49fa-b025-7c00f8b245c3" path="/var/lib/kubelet/pods/d64503e5-6fd1-49fa-b025-7c00f8b245c3/volumes" Dec 05 12:19:21 crc kubenswrapper[4763]: I1205 12:19:21.037275 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-m5zs5"] Dec 05 12:19:21 crc kubenswrapper[4763]: I1205 12:19:21.045847 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7548-account-create-update-2cptw"] Dec 05 12:19:21 crc kubenswrapper[4763]: I1205 12:19:21.055780 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-d754-account-create-update-n6glx"] Dec 05 12:19:21 crc kubenswrapper[4763]: I1205 12:19:21.065547 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-d754-account-create-update-n6glx"] Dec 05 12:19:21 crc kubenswrapper[4763]: I1205 12:19:21.077820 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-fdrsd"] Dec 05 12:19:21 crc kubenswrapper[4763]: I1205 12:19:21.095870 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7548-account-create-update-2cptw"] Dec 05 12:19:21 crc kubenswrapper[4763]: I1205 12:19:21.106258 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-m5zs5"] Dec 05 12:19:21 crc kubenswrapper[4763]: I1205 12:19:21.115677 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-fdrsd"] Dec 05 12:19:21 crc kubenswrapper[4763]: I1205 12:19:21.796913 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cdb2955-6d03-4169-8765-22f61729881f" path="/var/lib/kubelet/pods/5cdb2955-6d03-4169-8765-22f61729881f/volumes" Dec 05 12:19:21 crc kubenswrapper[4763]: I1205 12:19:21.797458 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fec4184-203d-48c4-bf8a-39529d6d08ce" path="/var/lib/kubelet/pods/5fec4184-203d-48c4-bf8a-39529d6d08ce/volumes" Dec 05 12:19:21 crc kubenswrapper[4763]: I1205 12:19:21.798080 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74c2f44d-7371-42a1-b73b-2e68ba45adf4" path="/var/lib/kubelet/pods/74c2f44d-7371-42a1-b73b-2e68ba45adf4/volumes" Dec 05 12:19:21 crc kubenswrapper[4763]: I1205 12:19:21.798935 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e051d182-dc55-4454-95aa-558c2e183c88" path="/var/lib/kubelet/pods/e051d182-dc55-4454-95aa-558c2e183c88/volumes" Dec 05 12:19:26 crc kubenswrapper[4763]: I1205 12:19:26.036137 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-s2dcm"] Dec 05 12:19:26 crc kubenswrapper[4763]: I1205 12:19:26.046019 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-s2dcm"] Dec 05 12:19:27 crc kubenswrapper[4763]: I1205 12:19:27.801229 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7b68be3-b684-41b3-9cb0-6ae8f6f998f3" path="/var/lib/kubelet/pods/a7b68be3-b684-41b3-9cb0-6ae8f6f998f3/volumes" Dec 05 12:19:32 crc kubenswrapper[4763]: I1205 12:19:32.785331 4763 scope.go:117] "RemoveContainer" containerID="94256af3aa58cdf57218e8ce9dc9bf5828d72b63abcb945a060e96b526de29bf" Dec 05 12:19:32 crc kubenswrapper[4763]: E1205 12:19:32.786844 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:19:38 crc kubenswrapper[4763]: I1205 12:19:38.034534 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-d8c46"] Dec 05 12:19:38 crc kubenswrapper[4763]: I1205 12:19:38.046673 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-d8c46"] Dec 05 12:19:39 crc kubenswrapper[4763]: I1205 12:19:39.795042 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7464b1d7-23f8-4450-a41e-1208f89c1fe4" path="/var/lib/kubelet/pods/7464b1d7-23f8-4450-a41e-1208f89c1fe4/volumes" Dec 05 12:19:43 crc kubenswrapper[4763]: I1205 12:19:43.912032 4763 scope.go:117] "RemoveContainer" containerID="83b5ac1a78712d269f1d141bf38dc6972407fa66b10e8a8e74a3c544f116da69" Dec 05 12:19:43 crc kubenswrapper[4763]: I1205 12:19:43.943868 4763 scope.go:117] "RemoveContainer" containerID="b37db4a707dd304747aa3e3a87a820f4ca28a2adfc885c6c0c7414daff659a18" Dec 05 12:19:43 crc kubenswrapper[4763]: I1205 12:19:43.997785 4763 scope.go:117] "RemoveContainer" containerID="201149cd29cb763761c5882b79c4b30e8667a3b48f04fb8e287ba4a84cd99a61" Dec 05 12:19:44 crc kubenswrapper[4763]: I1205 12:19:44.046539 4763 scope.go:117] "RemoveContainer" containerID="a3e9f709cbd920a47d757c5a4440af53da3bb50269b92d536c578329fb7a8ee1" Dec 05 12:19:44 crc kubenswrapper[4763]: I1205 12:19:44.097692 4763 scope.go:117] "RemoveContainer" containerID="27e555347270a2b412989123148265e3d228bff992c120098ef9c73377f0e826" Dec 05 12:19:44 crc kubenswrapper[4763]: I1205 12:19:44.157203 4763 scope.go:117] "RemoveContainer" containerID="96ded869b827b21f0d156220c2684fb377363684aa42c30ccd5a432c948389f9" Dec 05 12:19:44 crc kubenswrapper[4763]: I1205 12:19:44.193340 4763 scope.go:117] "RemoveContainer" containerID="f289628ba70dec2b7550756b7acf56cedc1caced9552e8da4e7759a2bf2ab5cf" Dec 05 12:19:44 crc kubenswrapper[4763]: I1205 12:19:44.218374 4763 scope.go:117] "RemoveContainer" containerID="abb865d962bbe48de053d972465a8cc2d32cc4b9093d72fd8daf4004e00b2abf" Dec 05 12:19:44 crc kubenswrapper[4763]: I1205 12:19:44.240153 4763 scope.go:117] "RemoveContainer" containerID="c44db6b55f8f94c4b0660d5143fadba09775e0a72ef44566ee26011bd4287cd8" Dec 05 12:19:44 crc kubenswrapper[4763]: I1205 12:19:44.261357 4763 scope.go:117] "RemoveContainer" containerID="3d75b35f9f557f50be0c25d534acfcb5408c8076f7f8ed83462a0d41dcdf6fe9" Dec 05 12:19:44 crc kubenswrapper[4763]: I1205 12:19:44.281429 4763 scope.go:117] "RemoveContainer" containerID="cdaa9c5da9e74fcc5fd493a87c06dd6f8a834444310999188ed4381c90c0ae4f" Dec 05 12:19:44 crc kubenswrapper[4763]: I1205 12:19:44.309001 4763 scope.go:117] "RemoveContainer" containerID="ba1988e2a85e403e1c48f58e086de951a0e1e20cecfbd24326a7eb4ccb0a4a9c" Dec 05 12:19:44 crc kubenswrapper[4763]: I1205 12:19:44.330023 4763 scope.go:117] "RemoveContainer" containerID="a89ad19746161315407ad87fb235afc0f76c7ea32f73853346ad2dd4d21985ae" Dec 05 12:19:44 crc kubenswrapper[4763]: I1205 12:19:44.352377 4763 scope.go:117] "RemoveContainer" containerID="21fa3b64b385872573ce93d0868fee7e62e4de13d2252ff5ec4066471e7743a5" Dec 05 12:19:44 crc kubenswrapper[4763]: I1205 12:19:44.373685 4763 scope.go:117] "RemoveContainer" containerID="5efa931585839bf4c48d07944cf9cd309aff2021e9aa21607020150e540929cf"
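The recurring "back-off 5m0s restarting failed container" errors for machine-config-daemon-xpgln are the kubelet's crash-loop back-off at its cap: per the Kubernetes documentation, restart delays start at 10s and double on each failed restart up to a five-minute ceiling, resetting after ten minutes of clean running. A sketch of that schedule, assuming those documented defaults:

```python
# Kubelet crash-loop restart delays: 10s doubling to a 5-minute cap,
# matching the "back-off 5m0s" seen in the records above.
delay, cap = 10, 300
schedule = []
while delay < cap:
    schedule.append(delay)
    delay *= 2
schedule.append(cap)
print(schedule)  # [10, 20, 40, 80, 160, 300]
```

That also explains the rhythm of the log: the pod worker re-attempts roughly every ten to fifteen seconds (12:18:53, 12:19:04, 12:19:17, ...) and is turned away each time until the five-minute window expires, after which the container finally restarts (the ContainerStarted event at 12:20:50 further down).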
"RemoveContainer" containerID="94256af3aa58cdf57218e8ce9dc9bf5828d72b63abcb945a060e96b526de29bf" Dec 05 12:19:45 crc kubenswrapper[4763]: E1205 12:19:45.799283 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:19:51 crc kubenswrapper[4763]: I1205 12:19:51.036871 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-vh57r"] Dec 05 12:19:51 crc kubenswrapper[4763]: I1205 12:19:51.047928 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-vh57r"] Dec 05 12:19:51 crc kubenswrapper[4763]: I1205 12:19:51.819084 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0af45fe0-0c3c-4394-82fb-334e1f6e7cb1" path="/var/lib/kubelet/pods/0af45fe0-0c3c-4394-82fb-334e1f6e7cb1/volumes" Dec 05 12:20:00 crc kubenswrapper[4763]: I1205 12:20:00.783890 4763 scope.go:117] "RemoveContainer" containerID="94256af3aa58cdf57218e8ce9dc9bf5828d72b63abcb945a060e96b526de29bf" Dec 05 12:20:00 crc kubenswrapper[4763]: E1205 12:20:00.784847 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:20:13 crc kubenswrapper[4763]: I1205 12:20:13.784200 4763 scope.go:117] "RemoveContainer" containerID="94256af3aa58cdf57218e8ce9dc9bf5828d72b63abcb945a060e96b526de29bf" Dec 05 12:20:13 crc kubenswrapper[4763]: E1205 12:20:13.785188 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:20:15 crc kubenswrapper[4763]: I1205 12:20:15.045777 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-x6zwz"] Dec 05 12:20:15 crc kubenswrapper[4763]: I1205 12:20:15.058363 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-x6zwz"] Dec 05 12:20:15 crc kubenswrapper[4763]: I1205 12:20:15.795162 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="274fe292-e3f0-432c-9947-3bca5514f6d9" path="/var/lib/kubelet/pods/274fe292-e3f0-432c-9947-3bca5514f6d9/volumes" Dec 05 12:20:16 crc kubenswrapper[4763]: I1205 12:20:16.031573 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-nvqn2"] Dec 05 12:20:16 crc kubenswrapper[4763]: I1205 12:20:16.041454 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-nvqn2"] Dec 05 12:20:17 crc kubenswrapper[4763]: I1205 12:20:17.796716 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2bf0108-5266-4f52-8803-39a842ddc777" 
path="/var/lib/kubelet/pods/b2bf0108-5266-4f52-8803-39a842ddc777/volumes" Dec 05 12:20:24 crc kubenswrapper[4763]: I1205 12:20:24.028449 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-sync-wcz88"] Dec 05 12:20:24 crc kubenswrapper[4763]: I1205 12:20:24.040101 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-sync-wcz88"] Dec 05 12:20:25 crc kubenswrapper[4763]: I1205 12:20:25.028805 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-tg56v"] Dec 05 12:20:25 crc kubenswrapper[4763]: I1205 12:20:25.038022 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-tg56v"] Dec 05 12:20:25 crc kubenswrapper[4763]: I1205 12:20:25.790052 4763 scope.go:117] "RemoveContainer" containerID="94256af3aa58cdf57218e8ce9dc9bf5828d72b63abcb945a060e96b526de29bf" Dec 05 12:20:25 crc kubenswrapper[4763]: E1205 12:20:25.790401 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:20:25 crc kubenswrapper[4763]: I1205 12:20:25.798283 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4046982d-ad27-468f-897a-167692d9ae49" path="/var/lib/kubelet/pods/4046982d-ad27-468f-897a-167692d9ae49/volumes" Dec 05 12:20:25 crc kubenswrapper[4763]: I1205 12:20:25.799444 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d08538d-45d6-4f05-81a6-60ecc26dc593" path="/var/lib/kubelet/pods/5d08538d-45d6-4f05-81a6-60ecc26dc593/volumes" Dec 05 12:20:33 crc kubenswrapper[4763]: I1205 12:20:33.048053 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-wgf85"] Dec 05 12:20:33 crc kubenswrapper[4763]: I1205 12:20:33.062799 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-wgf85"] Dec 05 12:20:33 crc kubenswrapper[4763]: I1205 12:20:33.805217 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f3d2c51-7840-4854-af9e-e0da6c484074" path="/var/lib/kubelet/pods/1f3d2c51-7840-4854-af9e-e0da6c484074/volumes" Dec 05 12:20:36 crc kubenswrapper[4763]: I1205 12:20:36.784271 4763 scope.go:117] "RemoveContainer" containerID="94256af3aa58cdf57218e8ce9dc9bf5828d72b63abcb945a060e96b526de29bf" Dec 05 12:20:36 crc kubenswrapper[4763]: E1205 12:20:36.784737 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:20:42 crc kubenswrapper[4763]: I1205 12:20:42.456154 4763 generic.go:334] "Generic (PLEG): container finished" podID="8c7e581d-5684-4557-96f8-5502a00e1da1" containerID="646a53a31e697f5ba648b8b0d85626915a325fbf6e9d74693cc6276e939e356b" exitCode=0 Dec 05 12:20:42 crc kubenswrapper[4763]: I1205 12:20:42.456800 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qlwfh" event={"ID":"8c7e581d-5684-4557-96f8-5502a00e1da1","Type":"ContainerDied","Data":"646a53a31e697f5ba648b8b0d85626915a325fbf6e9d74693cc6276e939e356b"} Dec 05 12:20:43 crc kubenswrapper[4763]: I1205 12:20:43.864749 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qlwfh" Dec 05 12:20:43 crc kubenswrapper[4763]: I1205 12:20:43.947583 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8c7e581d-5684-4557-96f8-5502a00e1da1-ssh-key\") pod \"8c7e581d-5684-4557-96f8-5502a00e1da1\" (UID: \"8c7e581d-5684-4557-96f8-5502a00e1da1\") " Dec 05 12:20:43 crc kubenswrapper[4763]: I1205 12:20:43.947635 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c94n\" (UniqueName: \"kubernetes.io/projected/8c7e581d-5684-4557-96f8-5502a00e1da1-kube-api-access-7c94n\") pod \"8c7e581d-5684-4557-96f8-5502a00e1da1\" (UID: \"8c7e581d-5684-4557-96f8-5502a00e1da1\") " Dec 05 12:20:43 crc kubenswrapper[4763]: I1205 12:20:43.947698 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c7e581d-5684-4557-96f8-5502a00e1da1-inventory\") pod \"8c7e581d-5684-4557-96f8-5502a00e1da1\" (UID: \"8c7e581d-5684-4557-96f8-5502a00e1da1\") " Dec 05 12:20:43 crc kubenswrapper[4763]: I1205 12:20:43.952906 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c7e581d-5684-4557-96f8-5502a00e1da1-kube-api-access-7c94n" (OuterVolumeSpecName: "kube-api-access-7c94n") pod "8c7e581d-5684-4557-96f8-5502a00e1da1" (UID: "8c7e581d-5684-4557-96f8-5502a00e1da1"). InnerVolumeSpecName "kube-api-access-7c94n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:20:43 crc kubenswrapper[4763]: I1205 12:20:43.974135 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c7e581d-5684-4557-96f8-5502a00e1da1-inventory" (OuterVolumeSpecName: "inventory") pod "8c7e581d-5684-4557-96f8-5502a00e1da1" (UID: "8c7e581d-5684-4557-96f8-5502a00e1da1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:20:43 crc kubenswrapper[4763]: I1205 12:20:43.974714 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c7e581d-5684-4557-96f8-5502a00e1da1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8c7e581d-5684-4557-96f8-5502a00e1da1" (UID: "8c7e581d-5684-4557-96f8-5502a00e1da1"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:20:44 crc kubenswrapper[4763]: I1205 12:20:44.049112 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8c7e581d-5684-4557-96f8-5502a00e1da1-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 12:20:44 crc kubenswrapper[4763]: I1205 12:20:44.049172 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c94n\" (UniqueName: \"kubernetes.io/projected/8c7e581d-5684-4557-96f8-5502a00e1da1-kube-api-access-7c94n\") on node \"crc\" DevicePath \"\"" Dec 05 12:20:44 crc kubenswrapper[4763]: I1205 12:20:44.049186 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c7e581d-5684-4557-96f8-5502a00e1da1-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 12:20:44 crc kubenswrapper[4763]: I1205 12:20:44.477980 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qlwfh" event={"ID":"8c7e581d-5684-4557-96f8-5502a00e1da1","Type":"ContainerDied","Data":"653ac2ae013101ce077f1cfea69bacc39e3d9f3c57b94ea31ccad9c30cb635c1"} Dec 05 12:20:44 crc kubenswrapper[4763]: I1205 12:20:44.478066 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="653ac2ae013101ce077f1cfea69bacc39e3d9f3c57b94ea31ccad9c30cb635c1" Dec 05 12:20:44 crc kubenswrapper[4763]: I1205 12:20:44.478081 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qlwfh" Dec 05 12:20:44 crc kubenswrapper[4763]: I1205 12:20:44.589384 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rl4hr"] Dec 05 12:20:44 crc kubenswrapper[4763]: E1205 12:20:44.589892 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c7e581d-5684-4557-96f8-5502a00e1da1" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 05 12:20:44 crc kubenswrapper[4763]: I1205 12:20:44.589916 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c7e581d-5684-4557-96f8-5502a00e1da1" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 05 12:20:44 crc kubenswrapper[4763]: I1205 12:20:44.590197 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c7e581d-5684-4557-96f8-5502a00e1da1" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 05 12:20:44 crc kubenswrapper[4763]: I1205 12:20:44.591040 4763 util.go:30] "No sandbox for pod can be found. 
Dec 05 12:20:44 crc kubenswrapper[4763]: I1205 12:20:44.591040 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rl4hr" Dec 05 12:20:44 crc kubenswrapper[4763]: I1205 12:20:44.593521 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 12:20:44 crc kubenswrapper[4763]: I1205 12:20:44.593717 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-s7px5" Dec 05 12:20:44 crc kubenswrapper[4763]: I1205 12:20:44.594325 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 12:20:44 crc kubenswrapper[4763]: I1205 12:20:44.595201 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 12:20:44 crc kubenswrapper[4763]: I1205 12:20:44.599188 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rl4hr"] Dec 05 12:20:44 crc kubenswrapper[4763]: I1205 12:20:44.670029 4763 scope.go:117] "RemoveContainer" containerID="abbd6d796f26bf34ba2f4470ded8d6e07e7ff1c8a25b2cb78e5219f8ae665d17" Dec 05 12:20:44 crc kubenswrapper[4763]: I1205 12:20:44.705563 4763 scope.go:117] "RemoveContainer" containerID="ad8e2eed26b275208329fff0ef5c26c0ab920119e69cc3f7221aa6ac9f83c297" Dec 05 12:20:44 crc kubenswrapper[4763]: I1205 12:20:44.761207 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d9w9\" (UniqueName: \"kubernetes.io/projected/e3bafcb4-8ef9-4670-8202-f5c61d6d4c33-kube-api-access-9d9w9\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-rl4hr\" (UID: \"e3bafcb4-8ef9-4670-8202-f5c61d6d4c33\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rl4hr" Dec 05 12:20:44 crc kubenswrapper[4763]: I1205 12:20:44.761275 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3bafcb4-8ef9-4670-8202-f5c61d6d4c33-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-rl4hr\" (UID: \"e3bafcb4-8ef9-4670-8202-f5c61d6d4c33\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rl4hr" Dec 05 12:20:44 crc kubenswrapper[4763]: I1205 12:20:44.761396 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3bafcb4-8ef9-4670-8202-f5c61d6d4c33-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-rl4hr\" (UID: \"e3bafcb4-8ef9-4670-8202-f5c61d6d4c33\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rl4hr" Dec 05 12:20:44 crc kubenswrapper[4763]: I1205 12:20:44.789485 4763 scope.go:117] "RemoveContainer" containerID="d16f6dd1276dd9c47eb7962e5ac06677fb1281f7d620832bb907af2b23032ab4" Dec 05 12:20:44 crc kubenswrapper[4763]: I1205 12:20:44.842397 4763 scope.go:117] "RemoveContainer" containerID="7b00072b9f7e2f54d96cbe80dd6024fdf58ccdadb25eedc84fe24bc238c8501e" Dec 05 12:20:44 crc kubenswrapper[4763]: I1205 12:20:44.863072 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3bafcb4-8ef9-4670-8202-f5c61d6d4c33-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-rl4hr\" (UID: \"e3bafcb4-8ef9-4670-8202-f5c61d6d4c33\") "
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rl4hr" Dec 05 12:20:44 crc kubenswrapper[4763]: I1205 12:20:44.863241 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d9w9\" (UniqueName: \"kubernetes.io/projected/e3bafcb4-8ef9-4670-8202-f5c61d6d4c33-kube-api-access-9d9w9\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-rl4hr\" (UID: \"e3bafcb4-8ef9-4670-8202-f5c61d6d4c33\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rl4hr" Dec 05 12:20:44 crc kubenswrapper[4763]: I1205 12:20:44.863292 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3bafcb4-8ef9-4670-8202-f5c61d6d4c33-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-rl4hr\" (UID: \"e3bafcb4-8ef9-4670-8202-f5c61d6d4c33\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rl4hr" Dec 05 12:20:44 crc kubenswrapper[4763]: I1205 12:20:44.867585 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3bafcb4-8ef9-4670-8202-f5c61d6d4c33-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-rl4hr\" (UID: \"e3bafcb4-8ef9-4670-8202-f5c61d6d4c33\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rl4hr" Dec 05 12:20:44 crc kubenswrapper[4763]: I1205 12:20:44.872261 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3bafcb4-8ef9-4670-8202-f5c61d6d4c33-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-rl4hr\" (UID: \"e3bafcb4-8ef9-4670-8202-f5c61d6d4c33\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rl4hr" Dec 05 12:20:44 crc kubenswrapper[4763]: I1205 12:20:44.881323 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d9w9\" (UniqueName: \"kubernetes.io/projected/e3bafcb4-8ef9-4670-8202-f5c61d6d4c33-kube-api-access-9d9w9\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-rl4hr\" (UID: \"e3bafcb4-8ef9-4670-8202-f5c61d6d4c33\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rl4hr" Dec 05 12:20:44 crc kubenswrapper[4763]: I1205 12:20:44.916376 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rl4hr" Dec 05 12:20:45 crc kubenswrapper[4763]: I1205 12:20:45.037192 4763 scope.go:117] "RemoveContainer" containerID="36ce7c10d382b350c9957a9b4b25836f63146f81b92abf70b2d1054582c8f727" Dec 05 12:20:45 crc kubenswrapper[4763]: I1205 12:20:45.055004 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-cprq9"] Dec 05 12:20:45 crc kubenswrapper[4763]: I1205 12:20:45.062959 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-cprq9"] Dec 05 12:20:45 crc kubenswrapper[4763]: I1205 12:20:45.090041 4763 scope.go:117] "RemoveContainer" containerID="09c82ff3921155c0ca1d533fd930c760c4a5164fa414656cdc21fd21d92026dd" Dec 05 12:20:45 crc kubenswrapper[4763]: I1205 12:20:45.478029 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rl4hr"] Dec 05 12:20:45 crc kubenswrapper[4763]: I1205 12:20:45.488670 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rl4hr" event={"ID":"e3bafcb4-8ef9-4670-8202-f5c61d6d4c33","Type":"ContainerStarted","Data":"96acbb9a24515f1f17a585ccc3ab0721727d8c9458d143e1b873cc1692b1c2be"} Dec 05 12:20:45 crc kubenswrapper[4763]: I1205 12:20:45.796960 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10d49525-ec1b-4c52-8221-f3f0bb57e574" path="/var/lib/kubelet/pods/10d49525-ec1b-4c52-8221-f3f0bb57e574/volumes" Dec 05 12:20:46 crc kubenswrapper[4763]: I1205 12:20:46.500275 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rl4hr" event={"ID":"e3bafcb4-8ef9-4670-8202-f5c61d6d4c33","Type":"ContainerStarted","Data":"cb12166e0eeee5b0b51e7edc736068b374e80623069de7678f48d678673da607"} Dec 05 12:20:46 crc kubenswrapper[4763]: I1205 12:20:46.520718 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rl4hr" podStartSLOduration=2.031280384 podStartE2EDuration="2.520699675s" podCreationTimestamp="2025-12-05 12:20:44 +0000 UTC" firstStartedPulling="2025-12-05 12:20:45.477958103 +0000 UTC m=+1929.970672826" lastFinishedPulling="2025-12-05 12:20:45.967377394 +0000 UTC m=+1930.460092117" observedRunningTime="2025-12-05 12:20:46.518913576 +0000 UTC m=+1931.011628329" watchObservedRunningTime="2025-12-05 12:20:46.520699675 +0000 UTC m=+1931.013414398" Dec 05 12:20:49 crc kubenswrapper[4763]: I1205 12:20:49.788333 4763 scope.go:117] "RemoveContainer" containerID="94256af3aa58cdf57218e8ce9dc9bf5828d72b63abcb945a060e96b526de29bf" Dec 05 12:20:50 crc kubenswrapper[4763]: I1205 12:20:50.965136 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerStarted","Data":"76e250e7fae7bad440d61476e103ba45c9e512b4480887904c80c0da7acc1264"} Dec 05 12:21:18 crc kubenswrapper[4763]: I1205 12:21:18.046457 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-55e0-account-create-update-kmpcw"] Dec 05 12:21:18 crc kubenswrapper[4763]: I1205 12:21:18.057698 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-55e0-account-create-update-kmpcw"] Dec 05 12:21:19 crc kubenswrapper[4763]: I1205 12:21:19.027033 4763 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/nova-cell0-db-create-j9pcn"] Dec 05 12:21:19 crc kubenswrapper[4763]: I1205 12:21:19.034606 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-j9pcn"] Dec 05 12:21:19 crc kubenswrapper[4763]: I1205 12:21:19.801747 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e08d98c-50ec-4dd9-a454-5cd1c19c4067" path="/var/lib/kubelet/pods/2e08d98c-50ec-4dd9-a454-5cd1c19c4067/volumes" Dec 05 12:21:19 crc kubenswrapper[4763]: I1205 12:21:19.803218 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aec4d7da-472f-449f-8ef2-0515e74f614a" path="/var/lib/kubelet/pods/aec4d7da-472f-449f-8ef2-0515e74f614a/volumes" Dec 05 12:21:20 crc kubenswrapper[4763]: I1205 12:21:20.033597 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-6qwfq"] Dec 05 12:21:20 crc kubenswrapper[4763]: I1205 12:21:20.043569 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-1309-account-create-update-55ht7"] Dec 05 12:21:20 crc kubenswrapper[4763]: I1205 12:21:20.053431 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-8n8v7"] Dec 05 12:21:20 crc kubenswrapper[4763]: I1205 12:21:20.063782 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-3545-account-create-update-k6q46"] Dec 05 12:21:20 crc kubenswrapper[4763]: I1205 12:21:20.076138 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-6qwfq"] Dec 05 12:21:20 crc kubenswrapper[4763]: I1205 12:21:20.086840 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-1309-account-create-update-55ht7"] Dec 05 12:21:20 crc kubenswrapper[4763]: I1205 12:21:20.097944 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-3545-account-create-update-k6q46"] Dec 05 12:21:20 crc kubenswrapper[4763]: I1205 12:21:20.108494 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-8n8v7"] Dec 05 12:21:21 crc kubenswrapper[4763]: I1205 12:21:21.797026 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="035ebd9d-2632-4e8c-9912-bf071d4a02e6" path="/var/lib/kubelet/pods/035ebd9d-2632-4e8c-9912-bf071d4a02e6/volumes" Dec 05 12:21:21 crc kubenswrapper[4763]: I1205 12:21:21.798016 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ba97fe5-94ae-43d2-b059-524ead71f164" path="/var/lib/kubelet/pods/0ba97fe5-94ae-43d2-b059-524ead71f164/volumes" Dec 05 12:21:21 crc kubenswrapper[4763]: I1205 12:21:21.798653 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c8be5bc-66ad-46a2-867d-965dc226273a" path="/var/lib/kubelet/pods/5c8be5bc-66ad-46a2-867d-965dc226273a/volumes" Dec 05 12:21:21 crc kubenswrapper[4763]: I1205 12:21:21.799306 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab9820ac-9442-45a8-9407-d2abab068843" path="/var/lib/kubelet/pods/ab9820ac-9442-45a8-9407-d2abab068843/volumes" Dec 05 12:21:45 crc kubenswrapper[4763]: I1205 12:21:45.235113 4763 scope.go:117] "RemoveContainer" containerID="aa2d37d003ec398bf7442e39cc4b9c3ef6d25d10bce17106a8f4e87404bdfadd" Dec 05 12:21:45 crc kubenswrapper[4763]: I1205 12:21:45.265198 4763 scope.go:117] "RemoveContainer" containerID="9262c554aa59ae3935c97d723ea55580817b2ae355f9017617a1c80b9275dffa" Dec 05 12:21:45 crc kubenswrapper[4763]: I1205 12:21:45.329484 4763 scope.go:117] "RemoveContainer" 
containerID="cf1f7f2890fe946d91809df395e226ff714951c2e65eb1fb669a67743b62a238" Dec 05 12:21:45 crc kubenswrapper[4763]: I1205 12:21:45.411876 4763 scope.go:117] "RemoveContainer" containerID="17c8faa85f0b36d302516e57a0aae40135ec8ca871dc1750896db9d4b6734031" Dec 05 12:21:45 crc kubenswrapper[4763]: I1205 12:21:45.471922 4763 scope.go:117] "RemoveContainer" containerID="9eb76be718c1b0dadda12f48e28116e1573a11c11d9ec171fe421d6538f25f86" Dec 05 12:21:45 crc kubenswrapper[4763]: I1205 12:21:45.528429 4763 scope.go:117] "RemoveContainer" containerID="f4a411ab04d9ba77a8a5e5ea540d662384dde9c8fb0dacec6de6ff7955d78ced" Dec 05 12:21:45 crc kubenswrapper[4763]: I1205 12:21:45.570092 4763 scope.go:117] "RemoveContainer" containerID="18f458293751772312c57f261e262add7679e4dc1d64c7bc01771dc598bcbdf5" Dec 05 12:21:48 crc kubenswrapper[4763]: I1205 12:21:48.043193 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kf95k"] Dec 05 12:21:48 crc kubenswrapper[4763]: I1205 12:21:48.058856 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kf95k"] Dec 05 12:21:49 crc kubenswrapper[4763]: I1205 12:21:49.803893 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6e35966-e928-4029-a553-d2624cbf0fd1" path="/var/lib/kubelet/pods/a6e35966-e928-4029-a553-d2624cbf0fd1/volumes" Dec 05 12:21:58 crc kubenswrapper[4763]: I1205 12:21:58.332705 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-h4c2m"] Dec 05 12:21:58 crc kubenswrapper[4763]: I1205 12:21:58.340211 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h4c2m" Dec 05 12:21:58 crc kubenswrapper[4763]: I1205 12:21:58.346205 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h4c2m"] Dec 05 12:21:58 crc kubenswrapper[4763]: I1205 12:21:58.433920 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74d3d7b4-7773-4fb4-9123-e5a75eafc92e-utilities\") pod \"redhat-operators-h4c2m\" (UID: \"74d3d7b4-7773-4fb4-9123-e5a75eafc92e\") " pod="openshift-marketplace/redhat-operators-h4c2m" Dec 05 12:21:58 crc kubenswrapper[4763]: I1205 12:21:58.434217 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74d3d7b4-7773-4fb4-9123-e5a75eafc92e-catalog-content\") pod \"redhat-operators-h4c2m\" (UID: \"74d3d7b4-7773-4fb4-9123-e5a75eafc92e\") " pod="openshift-marketplace/redhat-operators-h4c2m" Dec 05 12:21:58 crc kubenswrapper[4763]: I1205 12:21:58.434398 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2r52\" (UniqueName: \"kubernetes.io/projected/74d3d7b4-7773-4fb4-9123-e5a75eafc92e-kube-api-access-v2r52\") pod \"redhat-operators-h4c2m\" (UID: \"74d3d7b4-7773-4fb4-9123-e5a75eafc92e\") " pod="openshift-marketplace/redhat-operators-h4c2m" Dec 05 12:21:58 crc kubenswrapper[4763]: I1205 12:21:58.536702 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2r52\" (UniqueName: \"kubernetes.io/projected/74d3d7b4-7773-4fb4-9123-e5a75eafc92e-kube-api-access-v2r52\") pod \"redhat-operators-h4c2m\" (UID: \"74d3d7b4-7773-4fb4-9123-e5a75eafc92e\") " pod="openshift-marketplace/redhat-operators-h4c2m" Dec 05 
12:21:58 crc kubenswrapper[4763]: I1205 12:21:58.537294 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74d3d7b4-7773-4fb4-9123-e5a75eafc92e-utilities\") pod \"redhat-operators-h4c2m\" (UID: \"74d3d7b4-7773-4fb4-9123-e5a75eafc92e\") " pod="openshift-marketplace/redhat-operators-h4c2m" Dec 05 12:21:58 crc kubenswrapper[4763]: I1205 12:21:58.537593 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74d3d7b4-7773-4fb4-9123-e5a75eafc92e-catalog-content\") pod \"redhat-operators-h4c2m\" (UID: \"74d3d7b4-7773-4fb4-9123-e5a75eafc92e\") " pod="openshift-marketplace/redhat-operators-h4c2m" Dec 05 12:21:58 crc kubenswrapper[4763]: I1205 12:21:58.537905 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74d3d7b4-7773-4fb4-9123-e5a75eafc92e-utilities\") pod \"redhat-operators-h4c2m\" (UID: \"74d3d7b4-7773-4fb4-9123-e5a75eafc92e\") " pod="openshift-marketplace/redhat-operators-h4c2m" Dec 05 12:21:58 crc kubenswrapper[4763]: I1205 12:21:58.537964 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74d3d7b4-7773-4fb4-9123-e5a75eafc92e-catalog-content\") pod \"redhat-operators-h4c2m\" (UID: \"74d3d7b4-7773-4fb4-9123-e5a75eafc92e\") " pod="openshift-marketplace/redhat-operators-h4c2m" Dec 05 12:21:58 crc kubenswrapper[4763]: I1205 12:21:58.556871 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2r52\" (UniqueName: \"kubernetes.io/projected/74d3d7b4-7773-4fb4-9123-e5a75eafc92e-kube-api-access-v2r52\") pod \"redhat-operators-h4c2m\" (UID: \"74d3d7b4-7773-4fb4-9123-e5a75eafc92e\") " pod="openshift-marketplace/redhat-operators-h4c2m" Dec 05 12:21:58 crc kubenswrapper[4763]: I1205 12:21:58.670668 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h4c2m" Dec 05 12:21:59 crc kubenswrapper[4763]: I1205 12:21:59.151273 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h4c2m"] Dec 05 12:21:59 crc kubenswrapper[4763]: I1205 12:21:59.609336 4763 generic.go:334] "Generic (PLEG): container finished" podID="74d3d7b4-7773-4fb4-9123-e5a75eafc92e" containerID="6a6f01f0704ad3228d07ce2741f4ec735ed698a9121d04e3c1db38ca226c8e1b" exitCode=0 Dec 05 12:21:59 crc kubenswrapper[4763]: I1205 12:21:59.609396 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4c2m" event={"ID":"74d3d7b4-7773-4fb4-9123-e5a75eafc92e","Type":"ContainerDied","Data":"6a6f01f0704ad3228d07ce2741f4ec735ed698a9121d04e3c1db38ca226c8e1b"} Dec 05 12:21:59 crc kubenswrapper[4763]: I1205 12:21:59.609740 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4c2m" event={"ID":"74d3d7b4-7773-4fb4-9123-e5a75eafc92e","Type":"ContainerStarted","Data":"6a907a9da39627b5ece064f4460d5bd909c98a345e465c89be06a94579681903"} Dec 05 12:22:00 crc kubenswrapper[4763]: I1205 12:22:00.622166 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4c2m" event={"ID":"74d3d7b4-7773-4fb4-9123-e5a75eafc92e","Type":"ContainerStarted","Data":"70f683f5cc37b89a404bfb5507714bc11008438a80e42d157f30d7d609d165ef"} Dec 05 12:22:02 crc kubenswrapper[4763]: I1205 12:22:02.642742 4763 generic.go:334] "Generic (PLEG): container finished" podID="e3bafcb4-8ef9-4670-8202-f5c61d6d4c33" containerID="cb12166e0eeee5b0b51e7edc736068b374e80623069de7678f48d678673da607" exitCode=0 Dec 05 12:22:02 crc kubenswrapper[4763]: I1205 12:22:02.642825 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rl4hr" event={"ID":"e3bafcb4-8ef9-4670-8202-f5c61d6d4c33","Type":"ContainerDied","Data":"cb12166e0eeee5b0b51e7edc736068b374e80623069de7678f48d678673da607"} Dec 05 12:22:03 crc kubenswrapper[4763]: I1205 12:22:03.653556 4763 generic.go:334] "Generic (PLEG): container finished" podID="74d3d7b4-7773-4fb4-9123-e5a75eafc92e" containerID="70f683f5cc37b89a404bfb5507714bc11008438a80e42d157f30d7d609d165ef" exitCode=0 Dec 05 12:22:03 crc kubenswrapper[4763]: I1205 12:22:03.653619 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4c2m" event={"ID":"74d3d7b4-7773-4fb4-9123-e5a75eafc92e","Type":"ContainerDied","Data":"70f683f5cc37b89a404bfb5507714bc11008438a80e42d157f30d7d609d165ef"} Dec 05 12:22:04 crc kubenswrapper[4763]: I1205 12:22:04.073827 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rl4hr" Dec 05 12:22:04 crc kubenswrapper[4763]: I1205 12:22:04.154183 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3bafcb4-8ef9-4670-8202-f5c61d6d4c33-inventory\") pod \"e3bafcb4-8ef9-4670-8202-f5c61d6d4c33\" (UID: \"e3bafcb4-8ef9-4670-8202-f5c61d6d4c33\") " Dec 05 12:22:04 crc kubenswrapper[4763]: I1205 12:22:04.154633 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9d9w9\" (UniqueName: \"kubernetes.io/projected/e3bafcb4-8ef9-4670-8202-f5c61d6d4c33-kube-api-access-9d9w9\") pod \"e3bafcb4-8ef9-4670-8202-f5c61d6d4c33\" (UID: \"e3bafcb4-8ef9-4670-8202-f5c61d6d4c33\") " Dec 05 12:22:04 crc kubenswrapper[4763]: I1205 12:22:04.154717 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3bafcb4-8ef9-4670-8202-f5c61d6d4c33-ssh-key\") pod \"e3bafcb4-8ef9-4670-8202-f5c61d6d4c33\" (UID: \"e3bafcb4-8ef9-4670-8202-f5c61d6d4c33\") " Dec 05 12:22:04 crc kubenswrapper[4763]: I1205 12:22:04.159536 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3bafcb4-8ef9-4670-8202-f5c61d6d4c33-kube-api-access-9d9w9" (OuterVolumeSpecName: "kube-api-access-9d9w9") pod "e3bafcb4-8ef9-4670-8202-f5c61d6d4c33" (UID: "e3bafcb4-8ef9-4670-8202-f5c61d6d4c33"). InnerVolumeSpecName "kube-api-access-9d9w9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:22:04 crc kubenswrapper[4763]: I1205 12:22:04.181054 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3bafcb4-8ef9-4670-8202-f5c61d6d4c33-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e3bafcb4-8ef9-4670-8202-f5c61d6d4c33" (UID: "e3bafcb4-8ef9-4670-8202-f5c61d6d4c33"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:22:04 crc kubenswrapper[4763]: I1205 12:22:04.217868 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3bafcb4-8ef9-4670-8202-f5c61d6d4c33-inventory" (OuterVolumeSpecName: "inventory") pod "e3bafcb4-8ef9-4670-8202-f5c61d6d4c33" (UID: "e3bafcb4-8ef9-4670-8202-f5c61d6d4c33"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:22:04 crc kubenswrapper[4763]: I1205 12:22:04.269378 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3bafcb4-8ef9-4670-8202-f5c61d6d4c33-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 12:22:04 crc kubenswrapper[4763]: I1205 12:22:04.269415 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9d9w9\" (UniqueName: \"kubernetes.io/projected/e3bafcb4-8ef9-4670-8202-f5c61d6d4c33-kube-api-access-9d9w9\") on node \"crc\" DevicePath \"\"" Dec 05 12:22:04 crc kubenswrapper[4763]: I1205 12:22:04.269428 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3bafcb4-8ef9-4670-8202-f5c61d6d4c33-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 12:22:04 crc kubenswrapper[4763]: I1205 12:22:04.666186 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4c2m" event={"ID":"74d3d7b4-7773-4fb4-9123-e5a75eafc92e","Type":"ContainerStarted","Data":"943e915d496bd266a952d6316f9b25eaae120982cd53bf69d51fc3388c30ea2d"} Dec 05 12:22:04 crc kubenswrapper[4763]: I1205 12:22:04.667537 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rl4hr" event={"ID":"e3bafcb4-8ef9-4670-8202-f5c61d6d4c33","Type":"ContainerDied","Data":"96acbb9a24515f1f17a585ccc3ab0721727d8c9458d143e1b873cc1692b1c2be"} Dec 05 12:22:04 crc kubenswrapper[4763]: I1205 12:22:04.667577 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rl4hr" Dec 05 12:22:04 crc kubenswrapper[4763]: I1205 12:22:04.667587 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96acbb9a24515f1f17a585ccc3ab0721727d8c9458d143e1b873cc1692b1c2be" Dec 05 12:22:04 crc kubenswrapper[4763]: I1205 12:22:04.702888 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-h4c2m" podStartSLOduration=1.831531999 podStartE2EDuration="6.702867207s" podCreationTimestamp="2025-12-05 12:21:58 +0000 UTC" firstStartedPulling="2025-12-05 12:21:59.611378779 +0000 UTC m=+2004.104093502" lastFinishedPulling="2025-12-05 12:22:04.482713977 +0000 UTC m=+2008.975428710" observedRunningTime="2025-12-05 12:22:04.688364944 +0000 UTC m=+2009.181079697" watchObservedRunningTime="2025-12-05 12:22:04.702867207 +0000 UTC m=+2009.195581940" Dec 05 12:22:04 crc kubenswrapper[4763]: I1205 12:22:04.753895 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6d8v9"] Dec 05 12:22:04 crc kubenswrapper[4763]: E1205 12:22:04.754271 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3bafcb4-8ef9-4670-8202-f5c61d6d4c33" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 05 12:22:04 crc kubenswrapper[4763]: I1205 12:22:04.754288 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3bafcb4-8ef9-4670-8202-f5c61d6d4c33" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 05 12:22:04 crc kubenswrapper[4763]: I1205 12:22:04.754479 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3bafcb4-8ef9-4670-8202-f5c61d6d4c33" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 05 12:22:04 crc kubenswrapper[4763]: I1205 12:22:04.755143 4763 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6d8v9" Dec 05 12:22:04 crc kubenswrapper[4763]: I1205 12:22:04.758302 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 12:22:04 crc kubenswrapper[4763]: I1205 12:22:04.760080 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 12:22:04 crc kubenswrapper[4763]: I1205 12:22:04.760111 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-s7px5" Dec 05 12:22:04 crc kubenswrapper[4763]: I1205 12:22:04.760265 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 12:22:04 crc kubenswrapper[4763]: I1205 12:22:04.770384 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6d8v9"] Dec 05 12:22:04 crc kubenswrapper[4763]: I1205 12:22:04.905694 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vckjd\" (UniqueName: \"kubernetes.io/projected/1ed4f328-73dd-4e34-91c4-b68898c59d74-kube-api-access-vckjd\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6d8v9\" (UID: \"1ed4f328-73dd-4e34-91c4-b68898c59d74\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6d8v9" Dec 05 12:22:04 crc kubenswrapper[4763]: I1205 12:22:04.906280 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ed4f328-73dd-4e34-91c4-b68898c59d74-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6d8v9\" (UID: \"1ed4f328-73dd-4e34-91c4-b68898c59d74\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6d8v9" Dec 05 12:22:04 crc kubenswrapper[4763]: I1205 12:22:04.906356 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ed4f328-73dd-4e34-91c4-b68898c59d74-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6d8v9\" (UID: \"1ed4f328-73dd-4e34-91c4-b68898c59d74\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6d8v9" Dec 05 12:22:05 crc kubenswrapper[4763]: I1205 12:22:05.007678 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vckjd\" (UniqueName: \"kubernetes.io/projected/1ed4f328-73dd-4e34-91c4-b68898c59d74-kube-api-access-vckjd\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6d8v9\" (UID: \"1ed4f328-73dd-4e34-91c4-b68898c59d74\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6d8v9" Dec 05 12:22:05 crc kubenswrapper[4763]: I1205 12:22:05.007915 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ed4f328-73dd-4e34-91c4-b68898c59d74-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6d8v9\" (UID: \"1ed4f328-73dd-4e34-91c4-b68898c59d74\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6d8v9" Dec 05 12:22:05 crc kubenswrapper[4763]: I1205 12:22:05.007983 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/1ed4f328-73dd-4e34-91c4-b68898c59d74-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6d8v9\" (UID: \"1ed4f328-73dd-4e34-91c4-b68898c59d74\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6d8v9" Dec 05 12:22:05 crc kubenswrapper[4763]: I1205 12:22:05.013964 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ed4f328-73dd-4e34-91c4-b68898c59d74-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6d8v9\" (UID: \"1ed4f328-73dd-4e34-91c4-b68898c59d74\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6d8v9" Dec 05 12:22:05 crc kubenswrapper[4763]: I1205 12:22:05.014738 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ed4f328-73dd-4e34-91c4-b68898c59d74-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6d8v9\" (UID: \"1ed4f328-73dd-4e34-91c4-b68898c59d74\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6d8v9" Dec 05 12:22:05 crc kubenswrapper[4763]: I1205 12:22:05.027829 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vckjd\" (UniqueName: \"kubernetes.io/projected/1ed4f328-73dd-4e34-91c4-b68898c59d74-kube-api-access-vckjd\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6d8v9\" (UID: \"1ed4f328-73dd-4e34-91c4-b68898c59d74\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6d8v9" Dec 05 12:22:05 crc kubenswrapper[4763]: I1205 12:22:05.072394 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6d8v9" Dec 05 12:22:05 crc kubenswrapper[4763]: I1205 12:22:05.599251 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6d8v9"] Dec 05 12:22:05 crc kubenswrapper[4763]: W1205 12:22:05.600471 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ed4f328_73dd_4e34_91c4_b68898c59d74.slice/crio-84d23c2a71b49d5041bc000a7fc3f82bbe201cd828b6bc044f73fe1ae1fe372d WatchSource:0}: Error finding container 84d23c2a71b49d5041bc000a7fc3f82bbe201cd828b6bc044f73fe1ae1fe372d: Status 404 returned error can't find the container with id 84d23c2a71b49d5041bc000a7fc3f82bbe201cd828b6bc044f73fe1ae1fe372d Dec 05 12:22:05 crc kubenswrapper[4763]: I1205 12:22:05.677028 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6d8v9" event={"ID":"1ed4f328-73dd-4e34-91c4-b68898c59d74","Type":"ContainerStarted","Data":"84d23c2a71b49d5041bc000a7fc3f82bbe201cd828b6bc044f73fe1ae1fe372d"} Dec 05 12:22:06 crc kubenswrapper[4763]: I1205 12:22:06.689670 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6d8v9" event={"ID":"1ed4f328-73dd-4e34-91c4-b68898c59d74","Type":"ContainerStarted","Data":"ecff7f50582e3dfb06a61ca1f3fc8689d0b26b861b0d510edc6990970bfcd09d"} Dec 05 12:22:06 crc kubenswrapper[4763]: I1205 12:22:06.715021 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6d8v9" podStartSLOduration=2.225201812 podStartE2EDuration="2.714997593s" podCreationTimestamp="2025-12-05 12:22:04 +0000 UTC" 
firstStartedPulling="2025-12-05 12:22:05.603188152 +0000 UTC m=+2010.095902875" lastFinishedPulling="2025-12-05 12:22:06.092983943 +0000 UTC m=+2010.585698656" observedRunningTime="2025-12-05 12:22:06.705881196 +0000 UTC m=+2011.198595929" watchObservedRunningTime="2025-12-05 12:22:06.714997593 +0000 UTC m=+2011.207712316" Dec 05 12:22:08 crc kubenswrapper[4763]: I1205 12:22:08.670793 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-h4c2m" Dec 05 12:22:08 crc kubenswrapper[4763]: I1205 12:22:08.673110 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-h4c2m" Dec 05 12:22:09 crc kubenswrapper[4763]: I1205 12:22:09.738172 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-h4c2m" podUID="74d3d7b4-7773-4fb4-9123-e5a75eafc92e" containerName="registry-server" probeResult="failure" output=< Dec 05 12:22:09 crc kubenswrapper[4763]: timeout: failed to connect service ":50051" within 1s Dec 05 12:22:09 crc kubenswrapper[4763]: > Dec 05 12:22:12 crc kubenswrapper[4763]: I1205 12:22:12.742039 4763 generic.go:334] "Generic (PLEG): container finished" podID="1ed4f328-73dd-4e34-91c4-b68898c59d74" containerID="ecff7f50582e3dfb06a61ca1f3fc8689d0b26b861b0d510edc6990970bfcd09d" exitCode=0 Dec 05 12:22:12 crc kubenswrapper[4763]: I1205 12:22:12.742113 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6d8v9" event={"ID":"1ed4f328-73dd-4e34-91c4-b68898c59d74","Type":"ContainerDied","Data":"ecff7f50582e3dfb06a61ca1f3fc8689d0b26b861b0d510edc6990970bfcd09d"} Dec 05 12:22:14 crc kubenswrapper[4763]: I1205 12:22:14.055569 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wxg7t"] Dec 05 12:22:14 crc kubenswrapper[4763]: I1205 12:22:14.067293 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wxg7t"] Dec 05 12:22:14 crc kubenswrapper[4763]: I1205 12:22:14.199322 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6d8v9" Dec 05 12:22:14 crc kubenswrapper[4763]: I1205 12:22:14.214779 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vckjd\" (UniqueName: \"kubernetes.io/projected/1ed4f328-73dd-4e34-91c4-b68898c59d74-kube-api-access-vckjd\") pod \"1ed4f328-73dd-4e34-91c4-b68898c59d74\" (UID: \"1ed4f328-73dd-4e34-91c4-b68898c59d74\") " Dec 05 12:22:14 crc kubenswrapper[4763]: I1205 12:22:14.214840 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ed4f328-73dd-4e34-91c4-b68898c59d74-ssh-key\") pod \"1ed4f328-73dd-4e34-91c4-b68898c59d74\" (UID: \"1ed4f328-73dd-4e34-91c4-b68898c59d74\") " Dec 05 12:22:14 crc kubenswrapper[4763]: I1205 12:22:14.214916 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ed4f328-73dd-4e34-91c4-b68898c59d74-inventory\") pod \"1ed4f328-73dd-4e34-91c4-b68898c59d74\" (UID: \"1ed4f328-73dd-4e34-91c4-b68898c59d74\") " Dec 05 12:22:14 crc kubenswrapper[4763]: I1205 12:22:14.241609 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ed4f328-73dd-4e34-91c4-b68898c59d74-kube-api-access-vckjd" (OuterVolumeSpecName: "kube-api-access-vckjd") pod "1ed4f328-73dd-4e34-91c4-b68898c59d74" (UID: "1ed4f328-73dd-4e34-91c4-b68898c59d74"). InnerVolumeSpecName "kube-api-access-vckjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:22:14 crc kubenswrapper[4763]: I1205 12:22:14.250727 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ed4f328-73dd-4e34-91c4-b68898c59d74-inventory" (OuterVolumeSpecName: "inventory") pod "1ed4f328-73dd-4e34-91c4-b68898c59d74" (UID: "1ed4f328-73dd-4e34-91c4-b68898c59d74"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:22:14 crc kubenswrapper[4763]: I1205 12:22:14.267002 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ed4f328-73dd-4e34-91c4-b68898c59d74-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1ed4f328-73dd-4e34-91c4-b68898c59d74" (UID: "1ed4f328-73dd-4e34-91c4-b68898c59d74"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:22:14 crc kubenswrapper[4763]: I1205 12:22:14.316450 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vckjd\" (UniqueName: \"kubernetes.io/projected/1ed4f328-73dd-4e34-91c4-b68898c59d74-kube-api-access-vckjd\") on node \"crc\" DevicePath \"\"" Dec 05 12:22:14 crc kubenswrapper[4763]: I1205 12:22:14.316481 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ed4f328-73dd-4e34-91c4-b68898c59d74-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 12:22:14 crc kubenswrapper[4763]: I1205 12:22:14.316494 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ed4f328-73dd-4e34-91c4-b68898c59d74-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 12:22:14 crc kubenswrapper[4763]: I1205 12:22:14.761655 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6d8v9" event={"ID":"1ed4f328-73dd-4e34-91c4-b68898c59d74","Type":"ContainerDied","Data":"84d23c2a71b49d5041bc000a7fc3f82bbe201cd828b6bc044f73fe1ae1fe372d"} Dec 05 12:22:14 crc kubenswrapper[4763]: I1205 12:22:14.761960 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84d23c2a71b49d5041bc000a7fc3f82bbe201cd828b6bc044f73fe1ae1fe372d" Dec 05 12:22:14 crc kubenswrapper[4763]: I1205 12:22:14.761708 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6d8v9" Dec 05 12:22:14 crc kubenswrapper[4763]: I1205 12:22:14.840352 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-k85hd"] Dec 05 12:22:14 crc kubenswrapper[4763]: E1205 12:22:14.841090 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ed4f328-73dd-4e34-91c4-b68898c59d74" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 05 12:22:14 crc kubenswrapper[4763]: I1205 12:22:14.841122 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed4f328-73dd-4e34-91c4-b68898c59d74" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 05 12:22:14 crc kubenswrapper[4763]: I1205 12:22:14.841445 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ed4f328-73dd-4e34-91c4-b68898c59d74" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 05 12:22:14 crc kubenswrapper[4763]: I1205 12:22:14.842491 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-k85hd" Dec 05 12:22:14 crc kubenswrapper[4763]: I1205 12:22:14.844619 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 12:22:14 crc kubenswrapper[4763]: I1205 12:22:14.844839 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-s7px5" Dec 05 12:22:14 crc kubenswrapper[4763]: I1205 12:22:14.846075 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 12:22:14 crc kubenswrapper[4763]: I1205 12:22:14.849143 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 12:22:14 crc kubenswrapper[4763]: I1205 12:22:14.855542 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-k85hd"] Dec 05 12:22:14 crc kubenswrapper[4763]: I1205 12:22:14.925230 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzjj4\" (UniqueName: \"kubernetes.io/projected/4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b-kube-api-access-qzjj4\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-k85hd\" (UID: \"4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-k85hd" Dec 05 12:22:14 crc kubenswrapper[4763]: I1205 12:22:14.925286 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-k85hd\" (UID: \"4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-k85hd" Dec 05 12:22:14 crc kubenswrapper[4763]: I1205 12:22:14.925557 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-k85hd\" (UID: \"4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-k85hd" Dec 05 12:22:15 crc kubenswrapper[4763]: I1205 12:22:15.026073 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-h96f9"] Dec 05 12:22:15 crc kubenswrapper[4763]: I1205 12:22:15.027708 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzjj4\" (UniqueName: \"kubernetes.io/projected/4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b-kube-api-access-qzjj4\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-k85hd\" (UID: \"4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-k85hd" Dec 05 12:22:15 crc kubenswrapper[4763]: I1205 12:22:15.027850 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-k85hd\" (UID: \"4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-k85hd" Dec 05 12:22:15 crc kubenswrapper[4763]: I1205 12:22:15.027996 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-k85hd\" (UID: \"4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-k85hd" Dec 05 12:22:15 crc kubenswrapper[4763]: I1205 12:22:15.033350 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-k85hd\" (UID: \"4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-k85hd" Dec 05 12:22:15 crc kubenswrapper[4763]: I1205 12:22:15.033420 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-k85hd\" (UID: \"4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-k85hd" Dec 05 12:22:15 crc kubenswrapper[4763]: I1205 12:22:15.037275 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-h96f9"] Dec 05 12:22:15 crc kubenswrapper[4763]: I1205 12:22:15.045334 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzjj4\" (UniqueName: \"kubernetes.io/projected/4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b-kube-api-access-qzjj4\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-k85hd\" (UID: \"4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-k85hd" Dec 05 12:22:15 crc kubenswrapper[4763]: I1205 12:22:15.158205 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-k85hd" Dec 05 12:22:15 crc kubenswrapper[4763]: I1205 12:22:15.474272 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-k85hd"] Dec 05 12:22:15 crc kubenswrapper[4763]: W1205 12:22:15.483160 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b7e81d2_79b9_4368_bb1f_fa3fda3f5f0b.slice/crio-8e9cbf918ecb74d9f80ec1aab423320c2c4985d6c7a5923d1235c0cffb7c8158 WatchSource:0}: Error finding container 8e9cbf918ecb74d9f80ec1aab423320c2c4985d6c7a5923d1235c0cffb7c8158: Status 404 returned error can't find the container with id 8e9cbf918ecb74d9f80ec1aab423320c2c4985d6c7a5923d1235c0cffb7c8158 Dec 05 12:22:15 crc kubenswrapper[4763]: I1205 12:22:15.771424 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-k85hd" event={"ID":"4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b","Type":"ContainerStarted","Data":"8e9cbf918ecb74d9f80ec1aab423320c2c4985d6c7a5923d1235c0cffb7c8158"} Dec 05 12:22:15 crc kubenswrapper[4763]: I1205 12:22:15.794977 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f5732ce-fb71-4847-a977-763074d671f6" path="/var/lib/kubelet/pods/4f5732ce-fb71-4847-a977-763074d671f6/volumes" Dec 05 12:22:15 crc kubenswrapper[4763]: I1205 12:22:15.795822 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57820ba4-dfb8-40a5-be44-26fc6fe01967" path="/var/lib/kubelet/pods/57820ba4-dfb8-40a5-be44-26fc6fe01967/volumes" Dec 05 12:22:16 crc kubenswrapper[4763]: I1205 12:22:16.786588 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-k85hd" event={"ID":"4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b","Type":"ContainerStarted","Data":"b8a76ad8268ae4758ebcec2d46cd7e4e9cd6add446fb379bb379e935a153fc8b"} Dec 05 12:22:16 crc kubenswrapper[4763]: I1205 12:22:16.814457 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-k85hd" podStartSLOduration=2.414801357 podStartE2EDuration="2.814426366s" podCreationTimestamp="2025-12-05 12:22:14 +0000 UTC" firstStartedPulling="2025-12-05 12:22:15.485750234 +0000 UTC m=+2019.978464967" lastFinishedPulling="2025-12-05 12:22:15.885375253 +0000 UTC m=+2020.378089976" observedRunningTime="2025-12-05 12:22:16.807901679 +0000 UTC m=+2021.300616412" watchObservedRunningTime="2025-12-05 12:22:16.814426366 +0000 UTC m=+2021.307141099" Dec 05 12:22:18 crc kubenswrapper[4763]: I1205 12:22:18.716830 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-h4c2m" Dec 05 12:22:18 crc kubenswrapper[4763]: I1205 12:22:18.767544 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-h4c2m" Dec 05 12:22:18 crc kubenswrapper[4763]: I1205 12:22:18.950204 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h4c2m"] Dec 05 12:22:19 crc kubenswrapper[4763]: I1205 12:22:19.810945 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-h4c2m" podUID="74d3d7b4-7773-4fb4-9123-e5a75eafc92e" containerName="registry-server" containerID="cri-o://943e915d496bd266a952d6316f9b25eaae120982cd53bf69d51fc3388c30ea2d" 
gracePeriod=2 Dec 05 12:22:20 crc kubenswrapper[4763]: I1205 12:22:20.288878 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h4c2m" Dec 05 12:22:20 crc kubenswrapper[4763]: I1205 12:22:20.338073 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2r52\" (UniqueName: \"kubernetes.io/projected/74d3d7b4-7773-4fb4-9123-e5a75eafc92e-kube-api-access-v2r52\") pod \"74d3d7b4-7773-4fb4-9123-e5a75eafc92e\" (UID: \"74d3d7b4-7773-4fb4-9123-e5a75eafc92e\") " Dec 05 12:22:20 crc kubenswrapper[4763]: I1205 12:22:20.338522 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74d3d7b4-7773-4fb4-9123-e5a75eafc92e-catalog-content\") pod \"74d3d7b4-7773-4fb4-9123-e5a75eafc92e\" (UID: \"74d3d7b4-7773-4fb4-9123-e5a75eafc92e\") " Dec 05 12:22:20 crc kubenswrapper[4763]: I1205 12:22:20.338645 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74d3d7b4-7773-4fb4-9123-e5a75eafc92e-utilities\") pod \"74d3d7b4-7773-4fb4-9123-e5a75eafc92e\" (UID: \"74d3d7b4-7773-4fb4-9123-e5a75eafc92e\") " Dec 05 12:22:20 crc kubenswrapper[4763]: I1205 12:22:20.340076 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74d3d7b4-7773-4fb4-9123-e5a75eafc92e-utilities" (OuterVolumeSpecName: "utilities") pod "74d3d7b4-7773-4fb4-9123-e5a75eafc92e" (UID: "74d3d7b4-7773-4fb4-9123-e5a75eafc92e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:22:20 crc kubenswrapper[4763]: I1205 12:22:20.348025 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74d3d7b4-7773-4fb4-9123-e5a75eafc92e-kube-api-access-v2r52" (OuterVolumeSpecName: "kube-api-access-v2r52") pod "74d3d7b4-7773-4fb4-9123-e5a75eafc92e" (UID: "74d3d7b4-7773-4fb4-9123-e5a75eafc92e"). InnerVolumeSpecName "kube-api-access-v2r52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:22:20 crc kubenswrapper[4763]: I1205 12:22:20.440503 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2r52\" (UniqueName: \"kubernetes.io/projected/74d3d7b4-7773-4fb4-9123-e5a75eafc92e-kube-api-access-v2r52\") on node \"crc\" DevicePath \"\"" Dec 05 12:22:20 crc kubenswrapper[4763]: I1205 12:22:20.440535 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74d3d7b4-7773-4fb4-9123-e5a75eafc92e-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 12:22:20 crc kubenswrapper[4763]: I1205 12:22:20.458428 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74d3d7b4-7773-4fb4-9123-e5a75eafc92e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "74d3d7b4-7773-4fb4-9123-e5a75eafc92e" (UID: "74d3d7b4-7773-4fb4-9123-e5a75eafc92e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:22:20 crc kubenswrapper[4763]: I1205 12:22:20.542030 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74d3d7b4-7773-4fb4-9123-e5a75eafc92e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 12:22:20 crc kubenswrapper[4763]: I1205 12:22:20.821547 4763 generic.go:334] "Generic (PLEG): container finished" podID="74d3d7b4-7773-4fb4-9123-e5a75eafc92e" containerID="943e915d496bd266a952d6316f9b25eaae120982cd53bf69d51fc3388c30ea2d" exitCode=0 Dec 05 12:22:20 crc kubenswrapper[4763]: I1205 12:22:20.821591 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4c2m" event={"ID":"74d3d7b4-7773-4fb4-9123-e5a75eafc92e","Type":"ContainerDied","Data":"943e915d496bd266a952d6316f9b25eaae120982cd53bf69d51fc3388c30ea2d"} Dec 05 12:22:20 crc kubenswrapper[4763]: I1205 12:22:20.821637 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h4c2m" Dec 05 12:22:20 crc kubenswrapper[4763]: I1205 12:22:20.821649 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4c2m" event={"ID":"74d3d7b4-7773-4fb4-9123-e5a75eafc92e","Type":"ContainerDied","Data":"6a907a9da39627b5ece064f4460d5bd909c98a345e465c89be06a94579681903"} Dec 05 12:22:20 crc kubenswrapper[4763]: I1205 12:22:20.821676 4763 scope.go:117] "RemoveContainer" containerID="943e915d496bd266a952d6316f9b25eaae120982cd53bf69d51fc3388c30ea2d" Dec 05 12:22:20 crc kubenswrapper[4763]: I1205 12:22:20.855735 4763 scope.go:117] "RemoveContainer" containerID="70f683f5cc37b89a404bfb5507714bc11008438a80e42d157f30d7d609d165ef" Dec 05 12:22:20 crc kubenswrapper[4763]: I1205 12:22:20.857884 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h4c2m"] Dec 05 12:22:20 crc kubenswrapper[4763]: I1205 12:22:20.866988 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-h4c2m"] Dec 05 12:22:20 crc kubenswrapper[4763]: I1205 12:22:20.875018 4763 scope.go:117] "RemoveContainer" containerID="6a6f01f0704ad3228d07ce2741f4ec735ed698a9121d04e3c1db38ca226c8e1b" Dec 05 12:22:20 crc kubenswrapper[4763]: I1205 12:22:20.916584 4763 scope.go:117] "RemoveContainer" containerID="943e915d496bd266a952d6316f9b25eaae120982cd53bf69d51fc3388c30ea2d" Dec 05 12:22:20 crc kubenswrapper[4763]: E1205 12:22:20.917130 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"943e915d496bd266a952d6316f9b25eaae120982cd53bf69d51fc3388c30ea2d\": container with ID starting with 943e915d496bd266a952d6316f9b25eaae120982cd53bf69d51fc3388c30ea2d not found: ID does not exist" containerID="943e915d496bd266a952d6316f9b25eaae120982cd53bf69d51fc3388c30ea2d" Dec 05 12:22:20 crc kubenswrapper[4763]: I1205 12:22:20.917179 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"943e915d496bd266a952d6316f9b25eaae120982cd53bf69d51fc3388c30ea2d"} err="failed to get container status \"943e915d496bd266a952d6316f9b25eaae120982cd53bf69d51fc3388c30ea2d\": rpc error: code = NotFound desc = could not find container \"943e915d496bd266a952d6316f9b25eaae120982cd53bf69d51fc3388c30ea2d\": container with ID starting with 943e915d496bd266a952d6316f9b25eaae120982cd53bf69d51fc3388c30ea2d not found: ID does not exist" Dec 05 12:22:20 crc 
kubenswrapper[4763]: I1205 12:22:20.917209 4763 scope.go:117] "RemoveContainer" containerID="70f683f5cc37b89a404bfb5507714bc11008438a80e42d157f30d7d609d165ef" Dec 05 12:22:20 crc kubenswrapper[4763]: E1205 12:22:20.917538 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70f683f5cc37b89a404bfb5507714bc11008438a80e42d157f30d7d609d165ef\": container with ID starting with 70f683f5cc37b89a404bfb5507714bc11008438a80e42d157f30d7d609d165ef not found: ID does not exist" containerID="70f683f5cc37b89a404bfb5507714bc11008438a80e42d157f30d7d609d165ef" Dec 05 12:22:20 crc kubenswrapper[4763]: I1205 12:22:20.917577 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70f683f5cc37b89a404bfb5507714bc11008438a80e42d157f30d7d609d165ef"} err="failed to get container status \"70f683f5cc37b89a404bfb5507714bc11008438a80e42d157f30d7d609d165ef\": rpc error: code = NotFound desc = could not find container \"70f683f5cc37b89a404bfb5507714bc11008438a80e42d157f30d7d609d165ef\": container with ID starting with 70f683f5cc37b89a404bfb5507714bc11008438a80e42d157f30d7d609d165ef not found: ID does not exist" Dec 05 12:22:20 crc kubenswrapper[4763]: I1205 12:22:20.917605 4763 scope.go:117] "RemoveContainer" containerID="6a6f01f0704ad3228d07ce2741f4ec735ed698a9121d04e3c1db38ca226c8e1b" Dec 05 12:22:20 crc kubenswrapper[4763]: E1205 12:22:20.917914 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a6f01f0704ad3228d07ce2741f4ec735ed698a9121d04e3c1db38ca226c8e1b\": container with ID starting with 6a6f01f0704ad3228d07ce2741f4ec735ed698a9121d04e3c1db38ca226c8e1b not found: ID does not exist" containerID="6a6f01f0704ad3228d07ce2741f4ec735ed698a9121d04e3c1db38ca226c8e1b" Dec 05 12:22:20 crc kubenswrapper[4763]: I1205 12:22:20.918178 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a6f01f0704ad3228d07ce2741f4ec735ed698a9121d04e3c1db38ca226c8e1b"} err="failed to get container status \"6a6f01f0704ad3228d07ce2741f4ec735ed698a9121d04e3c1db38ca226c8e1b\": rpc error: code = NotFound desc = could not find container \"6a6f01f0704ad3228d07ce2741f4ec735ed698a9121d04e3c1db38ca226c8e1b\": container with ID starting with 6a6f01f0704ad3228d07ce2741f4ec735ed698a9121d04e3c1db38ca226c8e1b not found: ID does not exist" Dec 05 12:22:21 crc kubenswrapper[4763]: I1205 12:22:21.794738 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74d3d7b4-7773-4fb4-9123-e5a75eafc92e" path="/var/lib/kubelet/pods/74d3d7b4-7773-4fb4-9123-e5a75eafc92e/volumes" Dec 05 12:22:45 crc kubenswrapper[4763]: I1205 12:22:45.713713 4763 scope.go:117] "RemoveContainer" containerID="93c63cdbc34e09e7e9ddd7320caa2e3d56e27ba691af3b7e735782f7c54bbb57" Dec 05 12:22:45 crc kubenswrapper[4763]: I1205 12:22:45.758488 4763 scope.go:117] "RemoveContainer" containerID="8afaf76e12309847b9eb9ad71c6b9943189539680889d714ecd20a188921ccf8" Dec 05 12:22:45 crc kubenswrapper[4763]: I1205 12:22:45.808940 4763 scope.go:117] "RemoveContainer" containerID="cf929ad9c01927e617f8bb295262d53a07116145ca27fc231a08472471745adb" Dec 05 12:22:56 crc kubenswrapper[4763]: I1205 12:22:56.172063 4763 generic.go:334] "Generic (PLEG): container finished" podID="4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b" containerID="b8a76ad8268ae4758ebcec2d46cd7e4e9cd6add446fb379bb379e935a153fc8b" exitCode=0 Dec 05 12:22:56 crc kubenswrapper[4763]: I1205 
12:22:56.172204 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-k85hd" event={"ID":"4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b","Type":"ContainerDied","Data":"b8a76ad8268ae4758ebcec2d46cd7e4e9cd6add446fb379bb379e935a153fc8b"} Dec 05 12:22:57 crc kubenswrapper[4763]: I1205 12:22:57.629152 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-k85hd" Dec 05 12:22:57 crc kubenswrapper[4763]: I1205 12:22:57.735230 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzjj4\" (UniqueName: \"kubernetes.io/projected/4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b-kube-api-access-qzjj4\") pod \"4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b\" (UID: \"4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b\") " Dec 05 12:22:57 crc kubenswrapper[4763]: I1205 12:22:57.735730 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b-inventory\") pod \"4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b\" (UID: \"4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b\") " Dec 05 12:22:57 crc kubenswrapper[4763]: I1205 12:22:57.735752 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b-ssh-key\") pod \"4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b\" (UID: \"4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b\") " Dec 05 12:22:57 crc kubenswrapper[4763]: I1205 12:22:57.741755 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b-kube-api-access-qzjj4" (OuterVolumeSpecName: "kube-api-access-qzjj4") pod "4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b" (UID: "4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b"). InnerVolumeSpecName "kube-api-access-qzjj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:22:57 crc kubenswrapper[4763]: I1205 12:22:57.763336 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b-inventory" (OuterVolumeSpecName: "inventory") pod "4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b" (UID: "4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:22:57 crc kubenswrapper[4763]: I1205 12:22:57.766405 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b" (UID: "4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:22:57 crc kubenswrapper[4763]: I1205 12:22:57.838271 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 12:22:57 crc kubenswrapper[4763]: I1205 12:22:57.838308 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 12:22:57 crc kubenswrapper[4763]: I1205 12:22:57.838323 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzjj4\" (UniqueName: \"kubernetes.io/projected/4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b-kube-api-access-qzjj4\") on node \"crc\" DevicePath \"\"" Dec 05 12:22:58 crc kubenswrapper[4763]: I1205 12:22:58.197488 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-k85hd" event={"ID":"4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b","Type":"ContainerDied","Data":"8e9cbf918ecb74d9f80ec1aab423320c2c4985d6c7a5923d1235c0cffb7c8158"} Dec 05 12:22:58 crc kubenswrapper[4763]: I1205 12:22:58.197549 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e9cbf918ecb74d9f80ec1aab423320c2c4985d6c7a5923d1235c0cffb7c8158" Dec 05 12:22:58 crc kubenswrapper[4763]: I1205 12:22:58.197625 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-k85hd" Dec 05 12:22:58 crc kubenswrapper[4763]: I1205 12:22:58.319593 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gzt69"] Dec 05 12:22:58 crc kubenswrapper[4763]: E1205 12:22:58.320067 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d3d7b4-7773-4fb4-9123-e5a75eafc92e" containerName="registry-server" Dec 05 12:22:58 crc kubenswrapper[4763]: I1205 12:22:58.320087 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d3d7b4-7773-4fb4-9123-e5a75eafc92e" containerName="registry-server" Dec 05 12:22:58 crc kubenswrapper[4763]: E1205 12:22:58.320109 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d3d7b4-7773-4fb4-9123-e5a75eafc92e" containerName="extract-content" Dec 05 12:22:58 crc kubenswrapper[4763]: I1205 12:22:58.320117 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d3d7b4-7773-4fb4-9123-e5a75eafc92e" containerName="extract-content" Dec 05 12:22:58 crc kubenswrapper[4763]: E1205 12:22:58.320141 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d3d7b4-7773-4fb4-9123-e5a75eafc92e" containerName="extract-utilities" Dec 05 12:22:58 crc kubenswrapper[4763]: I1205 12:22:58.320149 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d3d7b4-7773-4fb4-9123-e5a75eafc92e" containerName="extract-utilities" Dec 05 12:22:58 crc kubenswrapper[4763]: E1205 12:22:58.320163 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 05 12:22:58 crc kubenswrapper[4763]: I1205 12:22:58.320173 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 05 12:22:58 crc kubenswrapper[4763]: I1205 12:22:58.320377 4763 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="74d3d7b4-7773-4fb4-9123-e5a75eafc92e" containerName="registry-server" Dec 05 12:22:58 crc kubenswrapper[4763]: I1205 12:22:58.320403 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 05 12:22:58 crc kubenswrapper[4763]: I1205 12:22:58.321135 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gzt69" Dec 05 12:22:58 crc kubenswrapper[4763]: I1205 12:22:58.324001 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 12:22:58 crc kubenswrapper[4763]: I1205 12:22:58.324426 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 12:22:58 crc kubenswrapper[4763]: I1205 12:22:58.324509 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-s7px5" Dec 05 12:22:58 crc kubenswrapper[4763]: I1205 12:22:58.329451 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 12:22:58 crc kubenswrapper[4763]: I1205 12:22:58.332318 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gzt69"] Dec 05 12:22:58 crc kubenswrapper[4763]: I1205 12:22:58.454368 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9169edb0-a8a3-4953-8472-6e496fced2e6-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gzt69\" (UID: \"9169edb0-a8a3-4953-8472-6e496fced2e6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gzt69" Dec 05 12:22:58 crc kubenswrapper[4763]: I1205 12:22:58.454537 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8frv6\" (UniqueName: \"kubernetes.io/projected/9169edb0-a8a3-4953-8472-6e496fced2e6-kube-api-access-8frv6\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gzt69\" (UID: \"9169edb0-a8a3-4953-8472-6e496fced2e6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gzt69" Dec 05 12:22:58 crc kubenswrapper[4763]: I1205 12:22:58.454642 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9169edb0-a8a3-4953-8472-6e496fced2e6-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gzt69\" (UID: \"9169edb0-a8a3-4953-8472-6e496fced2e6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gzt69" Dec 05 12:22:58 crc kubenswrapper[4763]: I1205 12:22:58.557525 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9169edb0-a8a3-4953-8472-6e496fced2e6-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gzt69\" (UID: \"9169edb0-a8a3-4953-8472-6e496fced2e6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gzt69" Dec 05 12:22:58 crc kubenswrapper[4763]: I1205 12:22:58.557599 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8frv6\" (UniqueName: \"kubernetes.io/projected/9169edb0-a8a3-4953-8472-6e496fced2e6-kube-api-access-8frv6\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-gzt69\" (UID: \"9169edb0-a8a3-4953-8472-6e496fced2e6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gzt69" Dec 05 12:22:58 crc kubenswrapper[4763]: I1205 12:22:58.557633 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9169edb0-a8a3-4953-8472-6e496fced2e6-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gzt69\" (UID: \"9169edb0-a8a3-4953-8472-6e496fced2e6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gzt69" Dec 05 12:22:58 crc kubenswrapper[4763]: I1205 12:22:58.562404 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9169edb0-a8a3-4953-8472-6e496fced2e6-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gzt69\" (UID: \"9169edb0-a8a3-4953-8472-6e496fced2e6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gzt69" Dec 05 12:22:58 crc kubenswrapper[4763]: I1205 12:22:58.565235 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9169edb0-a8a3-4953-8472-6e496fced2e6-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gzt69\" (UID: \"9169edb0-a8a3-4953-8472-6e496fced2e6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gzt69" Dec 05 12:22:58 crc kubenswrapper[4763]: I1205 12:22:58.581260 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8frv6\" (UniqueName: \"kubernetes.io/projected/9169edb0-a8a3-4953-8472-6e496fced2e6-kube-api-access-8frv6\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gzt69\" (UID: \"9169edb0-a8a3-4953-8472-6e496fced2e6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gzt69" Dec 05 12:22:58 crc kubenswrapper[4763]: I1205 12:22:58.640429 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gzt69" Dec 05 12:22:59 crc kubenswrapper[4763]: I1205 12:22:59.043540 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-qjz64"] Dec 05 12:22:59 crc kubenswrapper[4763]: I1205 12:22:59.055125 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-qjz64"] Dec 05 12:22:59 crc kubenswrapper[4763]: I1205 12:22:59.168420 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gzt69"] Dec 05 12:22:59 crc kubenswrapper[4763]: W1205 12:22:59.180966 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9169edb0_a8a3_4953_8472_6e496fced2e6.slice/crio-5b24940b8272758aa761fbe0d6b9e7708ce630ee5d2e996d775d7536687eb8e0 WatchSource:0}: Error finding container 5b24940b8272758aa761fbe0d6b9e7708ce630ee5d2e996d775d7536687eb8e0: Status 404 returned error can't find the container with id 5b24940b8272758aa761fbe0d6b9e7708ce630ee5d2e996d775d7536687eb8e0 Dec 05 12:22:59 crc kubenswrapper[4763]: I1205 12:22:59.206909 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gzt69" event={"ID":"9169edb0-a8a3-4953-8472-6e496fced2e6","Type":"ContainerStarted","Data":"5b24940b8272758aa761fbe0d6b9e7708ce630ee5d2e996d775d7536687eb8e0"} Dec 05 12:22:59 crc kubenswrapper[4763]: I1205 12:22:59.795398 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31abbaf4-1fc8-4f73-b549-ec6e262a08d0" path="/var/lib/kubelet/pods/31abbaf4-1fc8-4f73-b549-ec6e262a08d0/volumes" Dec 05 12:23:00 crc kubenswrapper[4763]: I1205 12:23:00.217218 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gzt69" event={"ID":"9169edb0-a8a3-4953-8472-6e496fced2e6","Type":"ContainerStarted","Data":"991406e58b1c779cc1b7e1e2067f1962f9014058ff401919c155bba13ceb5900"} Dec 05 12:23:00 crc kubenswrapper[4763]: I1205 12:23:00.233304 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gzt69" podStartSLOduration=1.589120152 podStartE2EDuration="2.233285912s" podCreationTimestamp="2025-12-05 12:22:58 +0000 UTC" firstStartedPulling="2025-12-05 12:22:59.18384742 +0000 UTC m=+2063.676562143" lastFinishedPulling="2025-12-05 12:22:59.82801318 +0000 UTC m=+2064.320727903" observedRunningTime="2025-12-05 12:23:00.23321624 +0000 UTC m=+2064.725931013" watchObservedRunningTime="2025-12-05 12:23:00.233285912 +0000 UTC m=+2064.726000635" Dec 05 12:23:07 crc kubenswrapper[4763]: I1205 12:23:07.544801 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 12:23:07 crc kubenswrapper[4763]: I1205 12:23:07.547314 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 12:23:37 crc kubenswrapper[4763]: I1205 12:23:37.544188 4763 
patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 12:23:37 crc kubenswrapper[4763]: I1205 12:23:37.544727 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 12:23:45 crc kubenswrapper[4763]: I1205 12:23:45.962524 4763 scope.go:117] "RemoveContainer" containerID="f9da4deeab88d6c2e18623166d865cac62c2ed2c875a0585837ca15e1a656541" Dec 05 12:23:51 crc kubenswrapper[4763]: I1205 12:23:51.145172 4763 generic.go:334] "Generic (PLEG): container finished" podID="9169edb0-a8a3-4953-8472-6e496fced2e6" containerID="991406e58b1c779cc1b7e1e2067f1962f9014058ff401919c155bba13ceb5900" exitCode=0 Dec 05 12:23:51 crc kubenswrapper[4763]: I1205 12:23:51.145249 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gzt69" event={"ID":"9169edb0-a8a3-4953-8472-6e496fced2e6","Type":"ContainerDied","Data":"991406e58b1c779cc1b7e1e2067f1962f9014058ff401919c155bba13ceb5900"} Dec 05 12:23:52 crc kubenswrapper[4763]: I1205 12:23:52.606119 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gzt69" Dec 05 12:23:52 crc kubenswrapper[4763]: I1205 12:23:52.808593 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9169edb0-a8a3-4953-8472-6e496fced2e6-ssh-key\") pod \"9169edb0-a8a3-4953-8472-6e496fced2e6\" (UID: \"9169edb0-a8a3-4953-8472-6e496fced2e6\") " Dec 05 12:23:52 crc kubenswrapper[4763]: I1205 12:23:52.808685 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8frv6\" (UniqueName: \"kubernetes.io/projected/9169edb0-a8a3-4953-8472-6e496fced2e6-kube-api-access-8frv6\") pod \"9169edb0-a8a3-4953-8472-6e496fced2e6\" (UID: \"9169edb0-a8a3-4953-8472-6e496fced2e6\") " Dec 05 12:23:52 crc kubenswrapper[4763]: I1205 12:23:52.809036 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9169edb0-a8a3-4953-8472-6e496fced2e6-inventory\") pod \"9169edb0-a8a3-4953-8472-6e496fced2e6\" (UID: \"9169edb0-a8a3-4953-8472-6e496fced2e6\") " Dec 05 12:23:52 crc kubenswrapper[4763]: I1205 12:23:52.814823 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9169edb0-a8a3-4953-8472-6e496fced2e6-kube-api-access-8frv6" (OuterVolumeSpecName: "kube-api-access-8frv6") pod "9169edb0-a8a3-4953-8472-6e496fced2e6" (UID: "9169edb0-a8a3-4953-8472-6e496fced2e6"). InnerVolumeSpecName "kube-api-access-8frv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:23:52 crc kubenswrapper[4763]: I1205 12:23:52.843270 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9169edb0-a8a3-4953-8472-6e496fced2e6-inventory" (OuterVolumeSpecName: "inventory") pod "9169edb0-a8a3-4953-8472-6e496fced2e6" (UID: "9169edb0-a8a3-4953-8472-6e496fced2e6"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:23:52 crc kubenswrapper[4763]: I1205 12:23:52.860371 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9169edb0-a8a3-4953-8472-6e496fced2e6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9169edb0-a8a3-4953-8472-6e496fced2e6" (UID: "9169edb0-a8a3-4953-8472-6e496fced2e6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:23:52 crc kubenswrapper[4763]: I1205 12:23:52.913155 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9169edb0-a8a3-4953-8472-6e496fced2e6-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 12:23:52 crc kubenswrapper[4763]: I1205 12:23:52.913186 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9169edb0-a8a3-4953-8472-6e496fced2e6-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 12:23:52 crc kubenswrapper[4763]: I1205 12:23:52.913197 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8frv6\" (UniqueName: \"kubernetes.io/projected/9169edb0-a8a3-4953-8472-6e496fced2e6-kube-api-access-8frv6\") on node \"crc\" DevicePath \"\"" Dec 05 12:23:53 crc kubenswrapper[4763]: I1205 12:23:53.164056 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gzt69" event={"ID":"9169edb0-a8a3-4953-8472-6e496fced2e6","Type":"ContainerDied","Data":"5b24940b8272758aa761fbe0d6b9e7708ce630ee5d2e996d775d7536687eb8e0"} Dec 05 12:23:53 crc kubenswrapper[4763]: I1205 12:23:53.164104 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b24940b8272758aa761fbe0d6b9e7708ce630ee5d2e996d775d7536687eb8e0" Dec 05 12:23:53 crc kubenswrapper[4763]: I1205 12:23:53.164130 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gzt69" Dec 05 12:23:53 crc kubenswrapper[4763]: I1205 12:23:53.278268 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-gtkd8"] Dec 05 12:23:53 crc kubenswrapper[4763]: E1205 12:23:53.278845 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9169edb0-a8a3-4953-8472-6e496fced2e6" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 05 12:23:53 crc kubenswrapper[4763]: I1205 12:23:53.278870 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9169edb0-a8a3-4953-8472-6e496fced2e6" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 05 12:23:53 crc kubenswrapper[4763]: I1205 12:23:53.279059 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="9169edb0-a8a3-4953-8472-6e496fced2e6" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 05 12:23:53 crc kubenswrapper[4763]: I1205 12:23:53.279751 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-gtkd8" Dec 05 12:23:53 crc kubenswrapper[4763]: I1205 12:23:53.288436 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-gtkd8"] Dec 05 12:23:53 crc kubenswrapper[4763]: I1205 12:23:53.289084 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 12:23:53 crc kubenswrapper[4763]: I1205 12:23:53.289462 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 12:23:53 crc kubenswrapper[4763]: I1205 12:23:53.289803 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 12:23:53 crc kubenswrapper[4763]: I1205 12:23:53.290140 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-s7px5" Dec 05 12:23:53 crc kubenswrapper[4763]: I1205 12:23:53.422025 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/23d776eb-9b6f-439e-8938-2aea4708e154-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-gtkd8\" (UID: \"23d776eb-9b6f-439e-8938-2aea4708e154\") " pod="openstack/ssh-known-hosts-edpm-deployment-gtkd8" Dec 05 12:23:53 crc kubenswrapper[4763]: I1205 12:23:53.422114 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6n22\" (UniqueName: \"kubernetes.io/projected/23d776eb-9b6f-439e-8938-2aea4708e154-kube-api-access-s6n22\") pod \"ssh-known-hosts-edpm-deployment-gtkd8\" (UID: \"23d776eb-9b6f-439e-8938-2aea4708e154\") " pod="openstack/ssh-known-hosts-edpm-deployment-gtkd8" Dec 05 12:23:53 crc kubenswrapper[4763]: I1205 12:23:53.422176 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/23d776eb-9b6f-439e-8938-2aea4708e154-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-gtkd8\" (UID: \"23d776eb-9b6f-439e-8938-2aea4708e154\") " pod="openstack/ssh-known-hosts-edpm-deployment-gtkd8" Dec 05 12:23:53 crc kubenswrapper[4763]: I1205 12:23:53.525059 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/23d776eb-9b6f-439e-8938-2aea4708e154-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-gtkd8\" (UID: \"23d776eb-9b6f-439e-8938-2aea4708e154\") " pod="openstack/ssh-known-hosts-edpm-deployment-gtkd8" Dec 05 12:23:53 crc kubenswrapper[4763]: I1205 12:23:53.525660 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6n22\" (UniqueName: \"kubernetes.io/projected/23d776eb-9b6f-439e-8938-2aea4708e154-kube-api-access-s6n22\") pod \"ssh-known-hosts-edpm-deployment-gtkd8\" (UID: \"23d776eb-9b6f-439e-8938-2aea4708e154\") " pod="openstack/ssh-known-hosts-edpm-deployment-gtkd8" Dec 05 12:23:53 crc kubenswrapper[4763]: I1205 12:23:53.526028 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/23d776eb-9b6f-439e-8938-2aea4708e154-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-gtkd8\" (UID: \"23d776eb-9b6f-439e-8938-2aea4708e154\") " pod="openstack/ssh-known-hosts-edpm-deployment-gtkd8" Dec 05 12:23:53 crc 
Dec 05 12:23:53 crc kubenswrapper[4763]: I1205 12:23:53.530925 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/23d776eb-9b6f-439e-8938-2aea4708e154-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-gtkd8\" (UID: \"23d776eb-9b6f-439e-8938-2aea4708e154\") " pod="openstack/ssh-known-hosts-edpm-deployment-gtkd8"
Dec 05 12:23:53 crc kubenswrapper[4763]: I1205 12:23:53.540148 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6n22\" (UniqueName: \"kubernetes.io/projected/23d776eb-9b6f-439e-8938-2aea4708e154-kube-api-access-s6n22\") pod \"ssh-known-hosts-edpm-deployment-gtkd8\" (UID: \"23d776eb-9b6f-439e-8938-2aea4708e154\") " pod="openstack/ssh-known-hosts-edpm-deployment-gtkd8"
Dec 05 12:23:53 crc kubenswrapper[4763]: I1205 12:23:53.614245 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-gtkd8"
Dec 05 12:23:54 crc kubenswrapper[4763]: I1205 12:23:54.140017 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-gtkd8"]
Dec 05 12:23:54 crc kubenswrapper[4763]: I1205 12:23:54.140674 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 05 12:23:54 crc kubenswrapper[4763]: I1205 12:23:54.174557 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-gtkd8" event={"ID":"23d776eb-9b6f-439e-8938-2aea4708e154","Type":"ContainerStarted","Data":"c6f0a6902af2944bf938ef4078d6a74f6cab884be4d9abf6338372f5f4a6c1fd"}
Dec 05 12:23:55 crc kubenswrapper[4763]: I1205 12:23:55.184010 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-gtkd8" event={"ID":"23d776eb-9b6f-439e-8938-2aea4708e154","Type":"ContainerStarted","Data":"2197f0c675a0d14fc2a9a716601007de1baa03a997a7b0196a63aca24a9094e5"}
Dec 05 12:23:55 crc kubenswrapper[4763]: I1205 12:23:55.203020 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-gtkd8" podStartSLOduration=1.540313695 podStartE2EDuration="2.202993997s" podCreationTimestamp="2025-12-05 12:23:53 +0000 UTC" firstStartedPulling="2025-12-05 12:23:54.14048719 +0000 UTC m=+2118.633201913" lastFinishedPulling="2025-12-05 12:23:54.803167492 +0000 UTC m=+2119.295882215" observedRunningTime="2025-12-05 12:23:55.198419883 +0000 UTC m=+2119.691134606" watchObservedRunningTime="2025-12-05 12:23:55.202993997 +0000 UTC m=+2119.695708720"
Dec 05 12:24:02 crc kubenswrapper[4763]: I1205 12:24:02.243051 4763 generic.go:334] "Generic (PLEG): container finished" podID="23d776eb-9b6f-439e-8938-2aea4708e154" containerID="2197f0c675a0d14fc2a9a716601007de1baa03a997a7b0196a63aca24a9094e5" exitCode=0
Dec 05 12:24:02 crc kubenswrapper[4763]: I1205 12:24:02.243579 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-gtkd8" event={"ID":"23d776eb-9b6f-439e-8938-2aea4708e154","Type":"ContainerDied","Data":"2197f0c675a0d14fc2a9a716601007de1baa03a997a7b0196a63aca24a9094e5"}
Dec 05 12:24:03 crc kubenswrapper[4763]: I1205 12:24:03.760988 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-gtkd8"
Dec 05 12:24:03 crc kubenswrapper[4763]: I1205 12:24:03.934175 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/23d776eb-9b6f-439e-8938-2aea4708e154-inventory-0\") pod \"23d776eb-9b6f-439e-8938-2aea4708e154\" (UID: \"23d776eb-9b6f-439e-8938-2aea4708e154\") "
Dec 05 12:24:03 crc kubenswrapper[4763]: I1205 12:24:03.934658 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6n22\" (UniqueName: \"kubernetes.io/projected/23d776eb-9b6f-439e-8938-2aea4708e154-kube-api-access-s6n22\") pod \"23d776eb-9b6f-439e-8938-2aea4708e154\" (UID: \"23d776eb-9b6f-439e-8938-2aea4708e154\") "
Dec 05 12:24:03 crc kubenswrapper[4763]: I1205 12:24:03.934722 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/23d776eb-9b6f-439e-8938-2aea4708e154-ssh-key-openstack-edpm-ipam\") pod \"23d776eb-9b6f-439e-8938-2aea4708e154\" (UID: \"23d776eb-9b6f-439e-8938-2aea4708e154\") "
Dec 05 12:24:03 crc kubenswrapper[4763]: I1205 12:24:03.939697 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23d776eb-9b6f-439e-8938-2aea4708e154-kube-api-access-s6n22" (OuterVolumeSpecName: "kube-api-access-s6n22") pod "23d776eb-9b6f-439e-8938-2aea4708e154" (UID: "23d776eb-9b6f-439e-8938-2aea4708e154"). InnerVolumeSpecName "kube-api-access-s6n22". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 12:24:03 crc kubenswrapper[4763]: I1205 12:24:03.962393 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23d776eb-9b6f-439e-8938-2aea4708e154-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "23d776eb-9b6f-439e-8938-2aea4708e154" (UID: "23d776eb-9b6f-439e-8938-2aea4708e154"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 12:24:03 crc kubenswrapper[4763]: I1205 12:24:03.963634 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23d776eb-9b6f-439e-8938-2aea4708e154-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "23d776eb-9b6f-439e-8938-2aea4708e154" (UID: "23d776eb-9b6f-439e-8938-2aea4708e154"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:24:04 crc kubenswrapper[4763]: I1205 12:24:04.036480 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6n22\" (UniqueName: \"kubernetes.io/projected/23d776eb-9b6f-439e-8938-2aea4708e154-kube-api-access-s6n22\") on node \"crc\" DevicePath \"\"" Dec 05 12:24:04 crc kubenswrapper[4763]: I1205 12:24:04.036512 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/23d776eb-9b6f-439e-8938-2aea4708e154-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 05 12:24:04 crc kubenswrapper[4763]: I1205 12:24:04.036523 4763 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/23d776eb-9b6f-439e-8938-2aea4708e154-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 05 12:24:04 crc kubenswrapper[4763]: I1205 12:24:04.261536 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-gtkd8" event={"ID":"23d776eb-9b6f-439e-8938-2aea4708e154","Type":"ContainerDied","Data":"c6f0a6902af2944bf938ef4078d6a74f6cab884be4d9abf6338372f5f4a6c1fd"} Dec 05 12:24:04 crc kubenswrapper[4763]: I1205 12:24:04.261578 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6f0a6902af2944bf938ef4078d6a74f6cab884be4d9abf6338372f5f4a6c1fd" Dec 05 12:24:04 crc kubenswrapper[4763]: I1205 12:24:04.261577 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-gtkd8" Dec 05 12:24:04 crc kubenswrapper[4763]: I1205 12:24:04.350727 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7r8pv"] Dec 05 12:24:04 crc kubenswrapper[4763]: E1205 12:24:04.351855 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23d776eb-9b6f-439e-8938-2aea4708e154" containerName="ssh-known-hosts-edpm-deployment" Dec 05 12:24:04 crc kubenswrapper[4763]: I1205 12:24:04.351996 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="23d776eb-9b6f-439e-8938-2aea4708e154" containerName="ssh-known-hosts-edpm-deployment" Dec 05 12:24:04 crc kubenswrapper[4763]: I1205 12:24:04.352653 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="23d776eb-9b6f-439e-8938-2aea4708e154" containerName="ssh-known-hosts-edpm-deployment" Dec 05 12:24:04 crc kubenswrapper[4763]: I1205 12:24:04.354049 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7r8pv" Dec 05 12:24:04 crc kubenswrapper[4763]: I1205 12:24:04.359573 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 12:24:04 crc kubenswrapper[4763]: I1205 12:24:04.360364 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-s7px5" Dec 05 12:24:04 crc kubenswrapper[4763]: I1205 12:24:04.361399 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 12:24:04 crc kubenswrapper[4763]: I1205 12:24:04.362728 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 12:24:04 crc kubenswrapper[4763]: I1205 12:24:04.390510 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7r8pv"] Dec 05 12:24:04 crc kubenswrapper[4763]: I1205 12:24:04.552702 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwvwj\" (UniqueName: \"kubernetes.io/projected/b73d98c2-daca-4632-9c2e-1ab408ec4ac5-kube-api-access-gwvwj\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7r8pv\" (UID: \"b73d98c2-daca-4632-9c2e-1ab408ec4ac5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7r8pv" Dec 05 12:24:04 crc kubenswrapper[4763]: I1205 12:24:04.552862 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b73d98c2-daca-4632-9c2e-1ab408ec4ac5-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7r8pv\" (UID: \"b73d98c2-daca-4632-9c2e-1ab408ec4ac5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7r8pv" Dec 05 12:24:04 crc kubenswrapper[4763]: I1205 12:24:04.552914 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b73d98c2-daca-4632-9c2e-1ab408ec4ac5-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7r8pv\" (UID: \"b73d98c2-daca-4632-9c2e-1ab408ec4ac5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7r8pv" Dec 05 12:24:04 crc kubenswrapper[4763]: I1205 12:24:04.654633 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwvwj\" (UniqueName: \"kubernetes.io/projected/b73d98c2-daca-4632-9c2e-1ab408ec4ac5-kube-api-access-gwvwj\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7r8pv\" (UID: \"b73d98c2-daca-4632-9c2e-1ab408ec4ac5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7r8pv" Dec 05 12:24:04 crc kubenswrapper[4763]: I1205 12:24:04.654724 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b73d98c2-daca-4632-9c2e-1ab408ec4ac5-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7r8pv\" (UID: \"b73d98c2-daca-4632-9c2e-1ab408ec4ac5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7r8pv" Dec 05 12:24:04 crc kubenswrapper[4763]: I1205 12:24:04.654782 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b73d98c2-daca-4632-9c2e-1ab408ec4ac5-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7r8pv\" (UID: \"b73d98c2-daca-4632-9c2e-1ab408ec4ac5\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7r8pv" Dec 05 12:24:04 crc kubenswrapper[4763]: I1205 12:24:04.658556 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b73d98c2-daca-4632-9c2e-1ab408ec4ac5-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7r8pv\" (UID: \"b73d98c2-daca-4632-9c2e-1ab408ec4ac5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7r8pv" Dec 05 12:24:04 crc kubenswrapper[4763]: I1205 12:24:04.663896 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b73d98c2-daca-4632-9c2e-1ab408ec4ac5-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7r8pv\" (UID: \"b73d98c2-daca-4632-9c2e-1ab408ec4ac5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7r8pv" Dec 05 12:24:04 crc kubenswrapper[4763]: I1205 12:24:04.670233 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwvwj\" (UniqueName: \"kubernetes.io/projected/b73d98c2-daca-4632-9c2e-1ab408ec4ac5-kube-api-access-gwvwj\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7r8pv\" (UID: \"b73d98c2-daca-4632-9c2e-1ab408ec4ac5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7r8pv" Dec 05 12:24:04 crc kubenswrapper[4763]: I1205 12:24:04.689196 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7r8pv" Dec 05 12:24:05 crc kubenswrapper[4763]: I1205 12:24:05.224980 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7r8pv"] Dec 05 12:24:05 crc kubenswrapper[4763]: I1205 12:24:05.270909 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7r8pv" event={"ID":"b73d98c2-daca-4632-9c2e-1ab408ec4ac5","Type":"ContainerStarted","Data":"af868272bbd323dff6df627b82dd54a8345ed427ed9049cd4150908bf7b3a83b"} Dec 05 12:24:06 crc kubenswrapper[4763]: I1205 12:24:06.281560 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7r8pv" event={"ID":"b73d98c2-daca-4632-9c2e-1ab408ec4ac5","Type":"ContainerStarted","Data":"eb6abe090fb7a8500ac165b623bf228b7f6cf80787056b770bf9425236828dfe"} Dec 05 12:24:06 crc kubenswrapper[4763]: I1205 12:24:06.306704 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7r8pv" podStartSLOduration=1.91188117 podStartE2EDuration="2.306677479s" podCreationTimestamp="2025-12-05 12:24:04 +0000 UTC" firstStartedPulling="2025-12-05 12:24:05.236102214 +0000 UTC m=+2129.728816937" lastFinishedPulling="2025-12-05 12:24:05.630898523 +0000 UTC m=+2130.123613246" observedRunningTime="2025-12-05 12:24:06.300607655 +0000 UTC m=+2130.793322408" watchObservedRunningTime="2025-12-05 12:24:06.306677479 +0000 UTC m=+2130.799392222" Dec 05 12:24:07 crc kubenswrapper[4763]: I1205 12:24:07.544225 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 12:24:07 crc kubenswrapper[4763]: I1205 12:24:07.544485 4763 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 12:24:07 crc kubenswrapper[4763]: I1205 12:24:07.544525 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" Dec 05 12:24:07 crc kubenswrapper[4763]: I1205 12:24:07.545235 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"76e250e7fae7bad440d61476e103ba45c9e512b4480887904c80c0da7acc1264"} pod="openshift-machine-config-operator/machine-config-daemon-xpgln" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 12:24:07 crc kubenswrapper[4763]: I1205 12:24:07.545282 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" containerID="cri-o://76e250e7fae7bad440d61476e103ba45c9e512b4480887904c80c0da7acc1264" gracePeriod=600 Dec 05 12:24:08 crc kubenswrapper[4763]: I1205 12:24:08.307276 4763 generic.go:334] "Generic (PLEG): container finished" podID="96338136-6831-49d0-9eb9-77d1205c6afb" containerID="76e250e7fae7bad440d61476e103ba45c9e512b4480887904c80c0da7acc1264" exitCode=0 Dec 05 12:24:08 crc kubenswrapper[4763]: I1205 12:24:08.307337 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerDied","Data":"76e250e7fae7bad440d61476e103ba45c9e512b4480887904c80c0da7acc1264"} Dec 05 12:24:08 crc kubenswrapper[4763]: I1205 12:24:08.308043 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerStarted","Data":"ce2d3cf073a51bbeacc4e57eda2a379937fbbfe390bdbb19538e8b18beb183ad"} Dec 05 12:24:08 crc kubenswrapper[4763]: I1205 12:24:08.308085 4763 scope.go:117] "RemoveContainer" containerID="94256af3aa58cdf57218e8ce9dc9bf5828d72b63abcb945a060e96b526de29bf" Dec 05 12:24:16 crc kubenswrapper[4763]: I1205 12:24:16.389464 4763 generic.go:334] "Generic (PLEG): container finished" podID="b73d98c2-daca-4632-9c2e-1ab408ec4ac5" containerID="eb6abe090fb7a8500ac165b623bf228b7f6cf80787056b770bf9425236828dfe" exitCode=0 Dec 05 12:24:16 crc kubenswrapper[4763]: I1205 12:24:16.390022 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7r8pv" event={"ID":"b73d98c2-daca-4632-9c2e-1ab408ec4ac5","Type":"ContainerDied","Data":"eb6abe090fb7a8500ac165b623bf228b7f6cf80787056b770bf9425236828dfe"} Dec 05 12:24:17 crc kubenswrapper[4763]: I1205 12:24:17.844152 4763 util.go:48] "No ready sandbox for pod can be found. 
Dec 05 12:24:18 crc kubenswrapper[4763]: I1205 12:24:18.026729 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b73d98c2-daca-4632-9c2e-1ab408ec4ac5-ssh-key\") pod \"b73d98c2-daca-4632-9c2e-1ab408ec4ac5\" (UID: \"b73d98c2-daca-4632-9c2e-1ab408ec4ac5\") "
Dec 05 12:24:18 crc kubenswrapper[4763]: I1205 12:24:18.026818 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b73d98c2-daca-4632-9c2e-1ab408ec4ac5-inventory\") pod \"b73d98c2-daca-4632-9c2e-1ab408ec4ac5\" (UID: \"b73d98c2-daca-4632-9c2e-1ab408ec4ac5\") "
Dec 05 12:24:18 crc kubenswrapper[4763]: I1205 12:24:18.026964 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwvwj\" (UniqueName: \"kubernetes.io/projected/b73d98c2-daca-4632-9c2e-1ab408ec4ac5-kube-api-access-gwvwj\") pod \"b73d98c2-daca-4632-9c2e-1ab408ec4ac5\" (UID: \"b73d98c2-daca-4632-9c2e-1ab408ec4ac5\") "
Dec 05 12:24:18 crc kubenswrapper[4763]: I1205 12:24:18.036081 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b73d98c2-daca-4632-9c2e-1ab408ec4ac5-kube-api-access-gwvwj" (OuterVolumeSpecName: "kube-api-access-gwvwj") pod "b73d98c2-daca-4632-9c2e-1ab408ec4ac5" (UID: "b73d98c2-daca-4632-9c2e-1ab408ec4ac5"). InnerVolumeSpecName "kube-api-access-gwvwj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 12:24:18 crc kubenswrapper[4763]: I1205 12:24:18.055690 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b73d98c2-daca-4632-9c2e-1ab408ec4ac5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b73d98c2-daca-4632-9c2e-1ab408ec4ac5" (UID: "b73d98c2-daca-4632-9c2e-1ab408ec4ac5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 12:24:18 crc kubenswrapper[4763]: I1205 12:24:18.057128 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b73d98c2-daca-4632-9c2e-1ab408ec4ac5-inventory" (OuterVolumeSpecName: "inventory") pod "b73d98c2-daca-4632-9c2e-1ab408ec4ac5" (UID: "b73d98c2-daca-4632-9c2e-1ab408ec4ac5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 12:24:18 crc kubenswrapper[4763]: I1205 12:24:18.129991 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b73d98c2-daca-4632-9c2e-1ab408ec4ac5-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 05 12:24:18 crc kubenswrapper[4763]: I1205 12:24:18.130031 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b73d98c2-daca-4632-9c2e-1ab408ec4ac5-inventory\") on node \"crc\" DevicePath \"\""
Dec 05 12:24:18 crc kubenswrapper[4763]: I1205 12:24:18.130049 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwvwj\" (UniqueName: \"kubernetes.io/projected/b73d98c2-daca-4632-9c2e-1ab408ec4ac5-kube-api-access-gwvwj\") on node \"crc\" DevicePath \"\""
Dec 05 12:24:18 crc kubenswrapper[4763]: I1205 12:24:18.409021 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7r8pv" event={"ID":"b73d98c2-daca-4632-9c2e-1ab408ec4ac5","Type":"ContainerDied","Data":"af868272bbd323dff6df627b82dd54a8345ed427ed9049cd4150908bf7b3a83b"}
Dec 05 12:24:18 crc kubenswrapper[4763]: I1205 12:24:18.409305 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af868272bbd323dff6df627b82dd54a8345ed427ed9049cd4150908bf7b3a83b"
Dec 05 12:24:18 crc kubenswrapper[4763]: I1205 12:24:18.409255 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7r8pv"
Dec 05 12:24:18 crc kubenswrapper[4763]: I1205 12:24:18.491591 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mfww4"]
Dec 05 12:24:18 crc kubenswrapper[4763]: E1205 12:24:18.492318 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b73d98c2-daca-4632-9c2e-1ab408ec4ac5" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Dec 05 12:24:18 crc kubenswrapper[4763]: I1205 12:24:18.492341 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b73d98c2-daca-4632-9c2e-1ab408ec4ac5" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Dec 05 12:24:18 crc kubenswrapper[4763]: I1205 12:24:18.492824 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b73d98c2-daca-4632-9c2e-1ab408ec4ac5" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Dec 05 12:24:18 crc kubenswrapper[4763]: I1205 12:24:18.494505 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mfww4"
Dec 05 12:24:18 crc kubenswrapper[4763]: I1205 12:24:18.497065 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 05 12:24:18 crc kubenswrapper[4763]: I1205 12:24:18.497194 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-s7px5"
Dec 05 12:24:18 crc kubenswrapper[4763]: I1205 12:24:18.497408 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 05 12:24:18 crc kubenswrapper[4763]: I1205 12:24:18.499860 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mfww4"]
Dec 05 12:24:18 crc kubenswrapper[4763]: I1205 12:24:18.503624 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 05 12:24:18 crc kubenswrapper[4763]: I1205 12:24:18.640337 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/576fa469-1138-4580-b637-66ec5a5e101e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mfww4\" (UID: \"576fa469-1138-4580-b637-66ec5a5e101e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mfww4"
Dec 05 12:24:18 crc kubenswrapper[4763]: I1205 12:24:18.641754 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5l9z\" (UniqueName: \"kubernetes.io/projected/576fa469-1138-4580-b637-66ec5a5e101e-kube-api-access-c5l9z\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mfww4\" (UID: \"576fa469-1138-4580-b637-66ec5a5e101e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mfww4"
Dec 05 12:24:18 crc kubenswrapper[4763]: I1205 12:24:18.641971 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/576fa469-1138-4580-b637-66ec5a5e101e-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mfww4\" (UID: \"576fa469-1138-4580-b637-66ec5a5e101e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mfww4"
Dec 05 12:24:18 crc kubenswrapper[4763]: I1205 12:24:18.743675 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/576fa469-1138-4580-b637-66ec5a5e101e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mfww4\" (UID: \"576fa469-1138-4580-b637-66ec5a5e101e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mfww4"
Dec 05 12:24:18 crc kubenswrapper[4763]: I1205 12:24:18.743813 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5l9z\" (UniqueName: \"kubernetes.io/projected/576fa469-1138-4580-b637-66ec5a5e101e-kube-api-access-c5l9z\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mfww4\" (UID: \"576fa469-1138-4580-b637-66ec5a5e101e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mfww4"
Dec 05 12:24:18 crc kubenswrapper[4763]: I1205 12:24:18.743888 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/576fa469-1138-4580-b637-66ec5a5e101e-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mfww4\" (UID: \"576fa469-1138-4580-b637-66ec5a5e101e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mfww4"
Dec 05 12:24:18 crc kubenswrapper[4763]: I1205 12:24:18.748613 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/576fa469-1138-4580-b637-66ec5a5e101e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mfww4\" (UID: \"576fa469-1138-4580-b637-66ec5a5e101e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mfww4"
Dec 05 12:24:18 crc kubenswrapper[4763]: I1205 12:24:18.749017 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/576fa469-1138-4580-b637-66ec5a5e101e-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mfww4\" (UID: \"576fa469-1138-4580-b637-66ec5a5e101e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mfww4"
Dec 05 12:24:18 crc kubenswrapper[4763]: I1205 12:24:18.761798 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5l9z\" (UniqueName: \"kubernetes.io/projected/576fa469-1138-4580-b637-66ec5a5e101e-kube-api-access-c5l9z\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mfww4\" (UID: \"576fa469-1138-4580-b637-66ec5a5e101e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mfww4"
Dec 05 12:24:18 crc kubenswrapper[4763]: I1205 12:24:18.812484 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mfww4"
Dec 05 12:24:19 crc kubenswrapper[4763]: I1205 12:24:19.446805 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mfww4"]
Dec 05 12:24:20 crc kubenswrapper[4763]: I1205 12:24:20.449551 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mfww4" event={"ID":"576fa469-1138-4580-b637-66ec5a5e101e","Type":"ContainerStarted","Data":"db46b5f078dcb4612756145feaff4817dde5faf2e9e5bbc4a73d2ef1e469918b"}
Dec 05 12:24:20 crc kubenswrapper[4763]: I1205 12:24:20.450975 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mfww4" event={"ID":"576fa469-1138-4580-b637-66ec5a5e101e","Type":"ContainerStarted","Data":"7589c203c07b3d9d95087527e3f5034e42d191a95a17bdf3f702950674191ff2"}
Dec 05 12:24:20 crc kubenswrapper[4763]: I1205 12:24:20.475104 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mfww4" podStartSLOduration=2.074450139 podStartE2EDuration="2.475077646s" podCreationTimestamp="2025-12-05 12:24:18 +0000 UTC" firstStartedPulling="2025-12-05 12:24:19.450088799 +0000 UTC m=+2143.942803532" lastFinishedPulling="2025-12-05 12:24:19.850716316 +0000 UTC m=+2144.343431039" observedRunningTime="2025-12-05 12:24:20.463617825 +0000 UTC m=+2144.956332548" watchObservedRunningTime="2025-12-05 12:24:20.475077646 +0000 UTC m=+2144.967792369"
Dec 05 12:24:29 crc kubenswrapper[4763]: I1205 12:24:29.539239 4763 generic.go:334] "Generic (PLEG): container finished" podID="576fa469-1138-4580-b637-66ec5a5e101e" containerID="db46b5f078dcb4612756145feaff4817dde5faf2e9e5bbc4a73d2ef1e469918b" exitCode=0
Dec 05 12:24:29 crc kubenswrapper[4763]: I1205 12:24:29.539319 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mfww4" event={"ID":"576fa469-1138-4580-b637-66ec5a5e101e","Type":"ContainerDied","Data":"db46b5f078dcb4612756145feaff4817dde5faf2e9e5bbc4a73d2ef1e469918b"}
event={"ID":"576fa469-1138-4580-b637-66ec5a5e101e","Type":"ContainerDied","Data":"db46b5f078dcb4612756145feaff4817dde5faf2e9e5bbc4a73d2ef1e469918b"} Dec 05 12:24:30 crc kubenswrapper[4763]: I1205 12:24:30.991195 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mfww4" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.120876 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/576fa469-1138-4580-b637-66ec5a5e101e-inventory\") pod \"576fa469-1138-4580-b637-66ec5a5e101e\" (UID: \"576fa469-1138-4580-b637-66ec5a5e101e\") " Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.121029 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5l9z\" (UniqueName: \"kubernetes.io/projected/576fa469-1138-4580-b637-66ec5a5e101e-kube-api-access-c5l9z\") pod \"576fa469-1138-4580-b637-66ec5a5e101e\" (UID: \"576fa469-1138-4580-b637-66ec5a5e101e\") " Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.121064 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/576fa469-1138-4580-b637-66ec5a5e101e-ssh-key\") pod \"576fa469-1138-4580-b637-66ec5a5e101e\" (UID: \"576fa469-1138-4580-b637-66ec5a5e101e\") " Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.129065 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/576fa469-1138-4580-b637-66ec5a5e101e-kube-api-access-c5l9z" (OuterVolumeSpecName: "kube-api-access-c5l9z") pod "576fa469-1138-4580-b637-66ec5a5e101e" (UID: "576fa469-1138-4580-b637-66ec5a5e101e"). InnerVolumeSpecName "kube-api-access-c5l9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.150590 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/576fa469-1138-4580-b637-66ec5a5e101e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "576fa469-1138-4580-b637-66ec5a5e101e" (UID: "576fa469-1138-4580-b637-66ec5a5e101e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.156643 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/576fa469-1138-4580-b637-66ec5a5e101e-inventory" (OuterVolumeSpecName: "inventory") pod "576fa469-1138-4580-b637-66ec5a5e101e" (UID: "576fa469-1138-4580-b637-66ec5a5e101e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.224256 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/576fa469-1138-4580-b637-66ec5a5e101e-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.224288 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5l9z\" (UniqueName: \"kubernetes.io/projected/576fa469-1138-4580-b637-66ec5a5e101e-kube-api-access-c5l9z\") on node \"crc\" DevicePath \"\"" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.224299 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/576fa469-1138-4580-b637-66ec5a5e101e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.560204 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mfww4" event={"ID":"576fa469-1138-4580-b637-66ec5a5e101e","Type":"ContainerDied","Data":"7589c203c07b3d9d95087527e3f5034e42d191a95a17bdf3f702950674191ff2"} Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.561034 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7589c203c07b3d9d95087527e3f5034e42d191a95a17bdf3f702950674191ff2" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.560268 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mfww4" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.654948 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9"] Dec 05 12:24:31 crc kubenswrapper[4763]: E1205 12:24:31.655360 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="576fa469-1138-4580-b637-66ec5a5e101e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.655379 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="576fa469-1138-4580-b637-66ec5a5e101e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.655591 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="576fa469-1138-4580-b637-66ec5a5e101e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.656332 4763 util.go:30] "No sandbox for pod can be found. 
Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.659170 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.659170 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0"
Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.659381 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0"
Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.659437 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.659591 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-s7px5"
Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.659623 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.659685 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0"
Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.660106 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0"
Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.676446 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9"]
Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.837421 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9"
Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.837963 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9"
Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.838113 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d234676-63b0-4c1c-804f-93d938e0ed84-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9"
Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.838235 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9"
Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.838354 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw4g6\" (UniqueName: \"kubernetes.io/projected/5d234676-63b0-4c1c-804f-93d938e0ed84-kube-api-access-xw4g6\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9"
Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.838464 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9"
Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.838634 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9"
Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.838848 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d234676-63b0-4c1c-804f-93d938e0ed84-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9"
Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.838967 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9"
Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.839061 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d234676-63b0-4c1c-804f-93d938e0ed84-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9"
Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.839182 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9"
" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.839336 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d234676-63b0-4c1c-804f-93d938e0ed84-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.839378 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.839478 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.941951 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.942285 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d234676-63b0-4c1c-804f-93d938e0ed84-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.942373 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.942486 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.942588 4763 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.942667 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.942753 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d234676-63b0-4c1c-804f-93d938e0ed84-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.942877 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.942960 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw4g6\" (UniqueName: \"kubernetes.io/projected/5d234676-63b0-4c1c-804f-93d938e0ed84-kube-api-access-xw4g6\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.943056 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.943166 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.943298 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d234676-63b0-4c1c-804f-93d938e0ed84-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.943378 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.943461 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d234676-63b0-4c1c-804f-93d938e0ed84-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.947353 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.948355 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.948912 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.949092 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d234676-63b0-4c1c-804f-93d938e0ed84-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.949309 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d234676-63b0-4c1c-804f-93d938e0ed84-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.949376 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.949970 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d234676-63b0-4c1c-804f-93d938e0ed84-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.950032 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.950150 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.950679 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.950846 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.952228 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.952442 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d234676-63b0-4c1c-804f-93d938e0ed84-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" Dec 05 
12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.964321 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw4g6\" (UniqueName: \"kubernetes.io/projected/5d234676-63b0-4c1c-804f-93d938e0ed84-kube-api-access-xw4g6\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" Dec 05 12:24:31 crc kubenswrapper[4763]: I1205 12:24:31.979572 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" Dec 05 12:24:32 crc kubenswrapper[4763]: I1205 12:24:32.514329 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9"] Dec 05 12:24:32 crc kubenswrapper[4763]: I1205 12:24:32.574549 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" event={"ID":"5d234676-63b0-4c1c-804f-93d938e0ed84","Type":"ContainerStarted","Data":"56f05e3b380f4b8a0f2b5a71b40f4080ca86ee3d7f07b205e6e1f5bedce98151"} Dec 05 12:24:33 crc kubenswrapper[4763]: I1205 12:24:33.584499 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" event={"ID":"5d234676-63b0-4c1c-804f-93d938e0ed84","Type":"ContainerStarted","Data":"7cec6166cc1f835e1c4ed8a22d45cc654cbd1340e3fce873e4a089020d1ebbcc"} Dec 05 12:25:03 crc kubenswrapper[4763]: I1205 12:25:03.031239 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" podStartSLOduration=31.551936003 podStartE2EDuration="32.031216654s" podCreationTimestamp="2025-12-05 12:24:31 +0000 UTC" firstStartedPulling="2025-12-05 12:24:32.532085577 +0000 UTC m=+2157.024800300" lastFinishedPulling="2025-12-05 12:24:33.011366228 +0000 UTC m=+2157.504080951" observedRunningTime="2025-12-05 12:24:33.661191357 +0000 UTC m=+2158.153906090" watchObservedRunningTime="2025-12-05 12:25:03.031216654 +0000 UTC m=+2187.523931377" Dec 05 12:25:03 crc kubenswrapper[4763]: I1205 12:25:03.039030 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fjwkg"] Dec 05 12:25:03 crc kubenswrapper[4763]: I1205 12:25:03.045039 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fjwkg" Dec 05 12:25:03 crc kubenswrapper[4763]: I1205 12:25:03.051074 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fjwkg"] Dec 05 12:25:03 crc kubenswrapper[4763]: I1205 12:25:03.153728 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c028b7e-1aac-489d-9492-36d4ad0f9a37-utilities\") pod \"redhat-marketplace-fjwkg\" (UID: \"9c028b7e-1aac-489d-9492-36d4ad0f9a37\") " pod="openshift-marketplace/redhat-marketplace-fjwkg" Dec 05 12:25:03 crc kubenswrapper[4763]: I1205 12:25:03.153846 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6qgw\" (UniqueName: \"kubernetes.io/projected/9c028b7e-1aac-489d-9492-36d4ad0f9a37-kube-api-access-x6qgw\") pod \"redhat-marketplace-fjwkg\" (UID: \"9c028b7e-1aac-489d-9492-36d4ad0f9a37\") " pod="openshift-marketplace/redhat-marketplace-fjwkg" Dec 05 12:25:03 crc kubenswrapper[4763]: I1205 12:25:03.153919 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c028b7e-1aac-489d-9492-36d4ad0f9a37-catalog-content\") pod \"redhat-marketplace-fjwkg\" (UID: \"9c028b7e-1aac-489d-9492-36d4ad0f9a37\") " pod="openshift-marketplace/redhat-marketplace-fjwkg" Dec 05 12:25:03 crc kubenswrapper[4763]: I1205 12:25:03.256661 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6qgw\" (UniqueName: \"kubernetes.io/projected/9c028b7e-1aac-489d-9492-36d4ad0f9a37-kube-api-access-x6qgw\") pod \"redhat-marketplace-fjwkg\" (UID: \"9c028b7e-1aac-489d-9492-36d4ad0f9a37\") " pod="openshift-marketplace/redhat-marketplace-fjwkg" Dec 05 12:25:03 crc kubenswrapper[4763]: I1205 12:25:03.256810 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c028b7e-1aac-489d-9492-36d4ad0f9a37-catalog-content\") pod \"redhat-marketplace-fjwkg\" (UID: \"9c028b7e-1aac-489d-9492-36d4ad0f9a37\") " pod="openshift-marketplace/redhat-marketplace-fjwkg" Dec 05 12:25:03 crc kubenswrapper[4763]: I1205 12:25:03.256987 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c028b7e-1aac-489d-9492-36d4ad0f9a37-utilities\") pod \"redhat-marketplace-fjwkg\" (UID: \"9c028b7e-1aac-489d-9492-36d4ad0f9a37\") " pod="openshift-marketplace/redhat-marketplace-fjwkg" Dec 05 12:25:03 crc kubenswrapper[4763]: I1205 12:25:03.257336 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c028b7e-1aac-489d-9492-36d4ad0f9a37-catalog-content\") pod \"redhat-marketplace-fjwkg\" (UID: \"9c028b7e-1aac-489d-9492-36d4ad0f9a37\") " pod="openshift-marketplace/redhat-marketplace-fjwkg" Dec 05 12:25:03 crc kubenswrapper[4763]: I1205 12:25:03.257828 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c028b7e-1aac-489d-9492-36d4ad0f9a37-utilities\") pod \"redhat-marketplace-fjwkg\" (UID: \"9c028b7e-1aac-489d-9492-36d4ad0f9a37\") " pod="openshift-marketplace/redhat-marketplace-fjwkg" Dec 05 12:25:03 crc kubenswrapper[4763]: I1205 12:25:03.280003 4763 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-x6qgw\" (UniqueName: \"kubernetes.io/projected/9c028b7e-1aac-489d-9492-36d4ad0f9a37-kube-api-access-x6qgw\") pod \"redhat-marketplace-fjwkg\" (UID: \"9c028b7e-1aac-489d-9492-36d4ad0f9a37\") " pod="openshift-marketplace/redhat-marketplace-fjwkg" Dec 05 12:25:03 crc kubenswrapper[4763]: I1205 12:25:03.401074 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fjwkg" Dec 05 12:25:03 crc kubenswrapper[4763]: I1205 12:25:03.923963 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fjwkg"] Dec 05 12:25:04 crc kubenswrapper[4763]: I1205 12:25:04.906593 4763 generic.go:334] "Generic (PLEG): container finished" podID="9c028b7e-1aac-489d-9492-36d4ad0f9a37" containerID="5e572ddb75cee5beb94585722eadfeaedef5c6aebe2e3474983dcfa755642185" exitCode=0 Dec 05 12:25:04 crc kubenswrapper[4763]: I1205 12:25:04.906727 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fjwkg" event={"ID":"9c028b7e-1aac-489d-9492-36d4ad0f9a37","Type":"ContainerDied","Data":"5e572ddb75cee5beb94585722eadfeaedef5c6aebe2e3474983dcfa755642185"} Dec 05 12:25:04 crc kubenswrapper[4763]: I1205 12:25:04.907177 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fjwkg" event={"ID":"9c028b7e-1aac-489d-9492-36d4ad0f9a37","Type":"ContainerStarted","Data":"a36f95305a532569e6081bd2c057536b911773cea39fc706ee04480b2d3fc083"} Dec 05 12:25:06 crc kubenswrapper[4763]: I1205 12:25:06.924703 4763 generic.go:334] "Generic (PLEG): container finished" podID="9c028b7e-1aac-489d-9492-36d4ad0f9a37" containerID="a2fb086622a3acc47f1f3ee0f020b635d9f06d3a036738d60b3eefc47c218500" exitCode=0 Dec 05 12:25:06 crc kubenswrapper[4763]: I1205 12:25:06.924796 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fjwkg" event={"ID":"9c028b7e-1aac-489d-9492-36d4ad0f9a37","Type":"ContainerDied","Data":"a2fb086622a3acc47f1f3ee0f020b635d9f06d3a036738d60b3eefc47c218500"} Dec 05 12:25:07 crc kubenswrapper[4763]: I1205 12:25:07.934866 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fjwkg" event={"ID":"9c028b7e-1aac-489d-9492-36d4ad0f9a37","Type":"ContainerStarted","Data":"d6402ef7b70649d297ee88b3df850e8c2a5a559948b30f69bff585f9c897294b"} Dec 05 12:25:07 crc kubenswrapper[4763]: I1205 12:25:07.953357 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fjwkg" podStartSLOduration=2.491544969 podStartE2EDuration="4.953337563s" podCreationTimestamp="2025-12-05 12:25:03 +0000 UTC" firstStartedPulling="2025-12-05 12:25:04.908870863 +0000 UTC m=+2189.401585586" lastFinishedPulling="2025-12-05 12:25:07.370663457 +0000 UTC m=+2191.863378180" observedRunningTime="2025-12-05 12:25:07.951512205 +0000 UTC m=+2192.444226928" watchObservedRunningTime="2025-12-05 12:25:07.953337563 +0000 UTC m=+2192.446052286" Dec 05 12:25:11 crc kubenswrapper[4763]: I1205 12:25:11.973517 4763 generic.go:334] "Generic (PLEG): container finished" podID="5d234676-63b0-4c1c-804f-93d938e0ed84" containerID="7cec6166cc1f835e1c4ed8a22d45cc654cbd1340e3fce873e4a089020d1ebbcc" exitCode=0 Dec 05 12:25:11 crc kubenswrapper[4763]: I1205 12:25:11.973592 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" 
event={"ID":"5d234676-63b0-4c1c-804f-93d938e0ed84","Type":"ContainerDied","Data":"7cec6166cc1f835e1c4ed8a22d45cc654cbd1340e3fce873e4a089020d1ebbcc"} Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.401423 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fjwkg" Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.401855 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fjwkg" Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.474585 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.479693 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fjwkg" Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.586133 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-nova-combined-ca-bundle\") pod \"5d234676-63b0-4c1c-804f-93d938e0ed84\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.586405 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d234676-63b0-4c1c-804f-93d938e0ed84-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"5d234676-63b0-4c1c-804f-93d938e0ed84\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.586543 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-repo-setup-combined-ca-bundle\") pod \"5d234676-63b0-4c1c-804f-93d938e0ed84\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.586677 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-bootstrap-combined-ca-bundle\") pod \"5d234676-63b0-4c1c-804f-93d938e0ed84\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.586835 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xw4g6\" (UniqueName: \"kubernetes.io/projected/5d234676-63b0-4c1c-804f-93d938e0ed84-kube-api-access-xw4g6\") pod \"5d234676-63b0-4c1c-804f-93d938e0ed84\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.587004 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d234676-63b0-4c1c-804f-93d938e0ed84-openstack-edpm-ipam-ovn-default-certs-0\") pod \"5d234676-63b0-4c1c-804f-93d938e0ed84\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.587185 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-neutron-metadata-combined-ca-bundle\") pod \"5d234676-63b0-4c1c-804f-93d938e0ed84\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.587380 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-ovn-combined-ca-bundle\") pod \"5d234676-63b0-4c1c-804f-93d938e0ed84\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.587988 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-ssh-key\") pod \"5d234676-63b0-4c1c-804f-93d938e0ed84\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.588220 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-libvirt-combined-ca-bundle\") pod \"5d234676-63b0-4c1c-804f-93d938e0ed84\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.588378 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-inventory\") pod \"5d234676-63b0-4c1c-804f-93d938e0ed84\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.588602 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d234676-63b0-4c1c-804f-93d938e0ed84-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"5d234676-63b0-4c1c-804f-93d938e0ed84\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.588943 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d234676-63b0-4c1c-804f-93d938e0ed84-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"5d234676-63b0-4c1c-804f-93d938e0ed84\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.589120 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-telemetry-combined-ca-bundle\") pod \"5d234676-63b0-4c1c-804f-93d938e0ed84\" (UID: \"5d234676-63b0-4c1c-804f-93d938e0ed84\") " Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.594440 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "5d234676-63b0-4c1c-804f-93d938e0ed84" (UID: "5d234676-63b0-4c1c-804f-93d938e0ed84"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.595431 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "5d234676-63b0-4c1c-804f-93d938e0ed84" (UID: "5d234676-63b0-4c1c-804f-93d938e0ed84"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.595737 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d234676-63b0-4c1c-804f-93d938e0ed84-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "5d234676-63b0-4c1c-804f-93d938e0ed84" (UID: "5d234676-63b0-4c1c-804f-93d938e0ed84"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.596531 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "5d234676-63b0-4c1c-804f-93d938e0ed84" (UID: "5d234676-63b0-4c1c-804f-93d938e0ed84"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.598376 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "5d234676-63b0-4c1c-804f-93d938e0ed84" (UID: "5d234676-63b0-4c1c-804f-93d938e0ed84"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.599322 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d234676-63b0-4c1c-804f-93d938e0ed84-kube-api-access-xw4g6" (OuterVolumeSpecName: "kube-api-access-xw4g6") pod "5d234676-63b0-4c1c-804f-93d938e0ed84" (UID: "5d234676-63b0-4c1c-804f-93d938e0ed84"). InnerVolumeSpecName "kube-api-access-xw4g6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.599350 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "5d234676-63b0-4c1c-804f-93d938e0ed84" (UID: "5d234676-63b0-4c1c-804f-93d938e0ed84"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.600176 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d234676-63b0-4c1c-804f-93d938e0ed84-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "5d234676-63b0-4c1c-804f-93d938e0ed84" (UID: "5d234676-63b0-4c1c-804f-93d938e0ed84"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.600783 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d234676-63b0-4c1c-804f-93d938e0ed84-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "5d234676-63b0-4c1c-804f-93d938e0ed84" (UID: "5d234676-63b0-4c1c-804f-93d938e0ed84"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.601274 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d234676-63b0-4c1c-804f-93d938e0ed84-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "5d234676-63b0-4c1c-804f-93d938e0ed84" (UID: "5d234676-63b0-4c1c-804f-93d938e0ed84"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.602844 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "5d234676-63b0-4c1c-804f-93d938e0ed84" (UID: "5d234676-63b0-4c1c-804f-93d938e0ed84"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.604513 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "5d234676-63b0-4c1c-804f-93d938e0ed84" (UID: "5d234676-63b0-4c1c-804f-93d938e0ed84"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.627969 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5d234676-63b0-4c1c-804f-93d938e0ed84" (UID: "5d234676-63b0-4c1c-804f-93d938e0ed84"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.629318 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-inventory" (OuterVolumeSpecName: "inventory") pod "5d234676-63b0-4c1c-804f-93d938e0ed84" (UID: "5d234676-63b0-4c1c-804f-93d938e0ed84"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.695012 4763 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d234676-63b0-4c1c-804f-93d938e0ed84-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.695604 4763 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.695671 4763 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.695726 4763 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d234676-63b0-4c1c-804f-93d938e0ed84-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.695811 4763 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.695874 4763 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.695927 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xw4g6\" (UniqueName: \"kubernetes.io/projected/5d234676-63b0-4c1c-804f-93d938e0ed84-kube-api-access-xw4g6\") on node \"crc\" DevicePath \"\"" Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.695995 4763 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d234676-63b0-4c1c-804f-93d938e0ed84-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.696055 4763 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.696115 4763 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.696173 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.696232 4763 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.696312 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d234676-63b0-4c1c-804f-93d938e0ed84-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 12:25:13 crc kubenswrapper[4763]: I1205 12:25:13.696370 4763 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d234676-63b0-4c1c-804f-93d938e0ed84-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 05 12:25:14 crc kubenswrapper[4763]: I1205 12:25:14.002252 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" Dec 05 12:25:14 crc kubenswrapper[4763]: I1205 12:25:14.002233 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9" event={"ID":"5d234676-63b0-4c1c-804f-93d938e0ed84","Type":"ContainerDied","Data":"56f05e3b380f4b8a0f2b5a71b40f4080ca86ee3d7f07b205e6e1f5bedce98151"} Dec 05 12:25:14 crc kubenswrapper[4763]: I1205 12:25:14.002690 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56f05e3b380f4b8a0f2b5a71b40f4080ca86ee3d7f07b205e6e1f5bedce98151" Dec 05 12:25:14 crc kubenswrapper[4763]: I1205 12:25:14.075631 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fjwkg" Dec 05 12:25:14 crc kubenswrapper[4763]: I1205 12:25:14.147330 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fjwkg"] Dec 05 12:25:14 crc kubenswrapper[4763]: I1205 12:25:14.161418 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-sl4g2"] Dec 05 12:25:14 crc kubenswrapper[4763]: E1205 12:25:14.161931 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d234676-63b0-4c1c-804f-93d938e0ed84" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 05 12:25:14 crc kubenswrapper[4763]: I1205 12:25:14.161955 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d234676-63b0-4c1c-804f-93d938e0ed84" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 05 12:25:14 crc kubenswrapper[4763]: I1205 12:25:14.162233 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d234676-63b0-4c1c-804f-93d938e0ed84" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 05 12:25:14 crc kubenswrapper[4763]: I1205 12:25:14.163164 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sl4g2" Dec 05 12:25:14 crc kubenswrapper[4763]: I1205 12:25:14.169278 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 12:25:14 crc kubenswrapper[4763]: I1205 12:25:14.169474 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 12:25:14 crc kubenswrapper[4763]: I1205 12:25:14.169620 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 05 12:25:14 crc kubenswrapper[4763]: I1205 12:25:14.170049 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 12:25:14 crc kubenswrapper[4763]: I1205 12:25:14.170072 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-s7px5" Dec 05 12:25:14 crc kubenswrapper[4763]: I1205 12:25:14.198526 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-sl4g2"] Dec 05 12:25:14 crc kubenswrapper[4763]: I1205 12:25:14.308174 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688c0399-83be-44e3-adc0-4288525a9f4b-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sl4g2\" (UID: \"688c0399-83be-44e3-adc0-4288525a9f4b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sl4g2" Dec 05 12:25:14 crc kubenswrapper[4763]: I1205 12:25:14.308309 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fscnx\" (UniqueName: \"kubernetes.io/projected/688c0399-83be-44e3-adc0-4288525a9f4b-kube-api-access-fscnx\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sl4g2\" (UID: \"688c0399-83be-44e3-adc0-4288525a9f4b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sl4g2" Dec 05 12:25:14 crc kubenswrapper[4763]: I1205 12:25:14.308489 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/688c0399-83be-44e3-adc0-4288525a9f4b-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sl4g2\" (UID: \"688c0399-83be-44e3-adc0-4288525a9f4b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sl4g2" Dec 05 12:25:14 crc kubenswrapper[4763]: I1205 12:25:14.308528 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/688c0399-83be-44e3-adc0-4288525a9f4b-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sl4g2\" (UID: \"688c0399-83be-44e3-adc0-4288525a9f4b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sl4g2" Dec 05 12:25:14 crc kubenswrapper[4763]: I1205 12:25:14.308561 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/688c0399-83be-44e3-adc0-4288525a9f4b-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sl4g2\" (UID: \"688c0399-83be-44e3-adc0-4288525a9f4b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sl4g2" Dec 05 12:25:14 crc kubenswrapper[4763]: I1205 12:25:14.409817 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" 
(UniqueName: \"kubernetes.io/configmap/688c0399-83be-44e3-adc0-4288525a9f4b-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sl4g2\" (UID: \"688c0399-83be-44e3-adc0-4288525a9f4b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sl4g2" Dec 05 12:25:14 crc kubenswrapper[4763]: I1205 12:25:14.409856 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/688c0399-83be-44e3-adc0-4288525a9f4b-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sl4g2\" (UID: \"688c0399-83be-44e3-adc0-4288525a9f4b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sl4g2" Dec 05 12:25:14 crc kubenswrapper[4763]: I1205 12:25:14.409877 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/688c0399-83be-44e3-adc0-4288525a9f4b-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sl4g2\" (UID: \"688c0399-83be-44e3-adc0-4288525a9f4b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sl4g2" Dec 05 12:25:14 crc kubenswrapper[4763]: I1205 12:25:14.409901 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688c0399-83be-44e3-adc0-4288525a9f4b-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sl4g2\" (UID: \"688c0399-83be-44e3-adc0-4288525a9f4b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sl4g2" Dec 05 12:25:14 crc kubenswrapper[4763]: I1205 12:25:14.409959 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fscnx\" (UniqueName: \"kubernetes.io/projected/688c0399-83be-44e3-adc0-4288525a9f4b-kube-api-access-fscnx\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sl4g2\" (UID: \"688c0399-83be-44e3-adc0-4288525a9f4b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sl4g2" Dec 05 12:25:14 crc kubenswrapper[4763]: I1205 12:25:14.411622 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/688c0399-83be-44e3-adc0-4288525a9f4b-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sl4g2\" (UID: \"688c0399-83be-44e3-adc0-4288525a9f4b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sl4g2" Dec 05 12:25:14 crc kubenswrapper[4763]: I1205 12:25:14.414973 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688c0399-83be-44e3-adc0-4288525a9f4b-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sl4g2\" (UID: \"688c0399-83be-44e3-adc0-4288525a9f4b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sl4g2" Dec 05 12:25:14 crc kubenswrapper[4763]: I1205 12:25:14.415219 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/688c0399-83be-44e3-adc0-4288525a9f4b-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sl4g2\" (UID: \"688c0399-83be-44e3-adc0-4288525a9f4b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sl4g2" Dec 05 12:25:14 crc kubenswrapper[4763]: I1205 12:25:14.417451 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/688c0399-83be-44e3-adc0-4288525a9f4b-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sl4g2\" (UID: \"688c0399-83be-44e3-adc0-4288525a9f4b\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sl4g2" Dec 05 12:25:14 crc kubenswrapper[4763]: I1205 12:25:14.433719 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fscnx\" (UniqueName: \"kubernetes.io/projected/688c0399-83be-44e3-adc0-4288525a9f4b-kube-api-access-fscnx\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sl4g2\" (UID: \"688c0399-83be-44e3-adc0-4288525a9f4b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sl4g2" Dec 05 12:25:14 crc kubenswrapper[4763]: I1205 12:25:14.482885 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sl4g2" Dec 05 12:25:15 crc kubenswrapper[4763]: I1205 12:25:15.095885 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-sl4g2"] Dec 05 12:25:16 crc kubenswrapper[4763]: I1205 12:25:16.025284 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sl4g2" event={"ID":"688c0399-83be-44e3-adc0-4288525a9f4b","Type":"ContainerStarted","Data":"34952514985ae993b1fb4d0048696092b69e35626c8c7db3fdebbc3587ffcb79"} Dec 05 12:25:16 crc kubenswrapper[4763]: I1205 12:25:16.026143 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sl4g2" event={"ID":"688c0399-83be-44e3-adc0-4288525a9f4b","Type":"ContainerStarted","Data":"2e9e0c3c26e602ea16987b79582182e29bd68c2d875c7267aed24f1fa210f7ee"} Dec 05 12:25:16 crc kubenswrapper[4763]: I1205 12:25:16.025378 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fjwkg" podUID="9c028b7e-1aac-489d-9492-36d4ad0f9a37" containerName="registry-server" containerID="cri-o://d6402ef7b70649d297ee88b3df850e8c2a5a559948b30f69bff585f9c897294b" gracePeriod=2 Dec 05 12:25:16 crc kubenswrapper[4763]: I1205 12:25:16.044780 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sl4g2" podStartSLOduration=1.5538395 podStartE2EDuration="2.044743615s" podCreationTimestamp="2025-12-05 12:25:14 +0000 UTC" firstStartedPulling="2025-12-05 12:25:15.113906941 +0000 UTC m=+2199.606621664" lastFinishedPulling="2025-12-05 12:25:15.604811056 +0000 UTC m=+2200.097525779" observedRunningTime="2025-12-05 12:25:16.040153004 +0000 UTC m=+2200.532867717" watchObservedRunningTime="2025-12-05 12:25:16.044743615 +0000 UTC m=+2200.537458338" Dec 05 12:25:16 crc kubenswrapper[4763]: I1205 12:25:16.604490 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fjwkg" Dec 05 12:25:16 crc kubenswrapper[4763]: I1205 12:25:16.657727 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6qgw\" (UniqueName: \"kubernetes.io/projected/9c028b7e-1aac-489d-9492-36d4ad0f9a37-kube-api-access-x6qgw\") pod \"9c028b7e-1aac-489d-9492-36d4ad0f9a37\" (UID: \"9c028b7e-1aac-489d-9492-36d4ad0f9a37\") " Dec 05 12:25:16 crc kubenswrapper[4763]: I1205 12:25:16.657849 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c028b7e-1aac-489d-9492-36d4ad0f9a37-utilities\") pod \"9c028b7e-1aac-489d-9492-36d4ad0f9a37\" (UID: \"9c028b7e-1aac-489d-9492-36d4ad0f9a37\") " Dec 05 12:25:16 crc kubenswrapper[4763]: I1205 12:25:16.658018 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c028b7e-1aac-489d-9492-36d4ad0f9a37-catalog-content\") pod \"9c028b7e-1aac-489d-9492-36d4ad0f9a37\" (UID: \"9c028b7e-1aac-489d-9492-36d4ad0f9a37\") " Dec 05 12:25:16 crc kubenswrapper[4763]: I1205 12:25:16.659353 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c028b7e-1aac-489d-9492-36d4ad0f9a37-utilities" (OuterVolumeSpecName: "utilities") pod "9c028b7e-1aac-489d-9492-36d4ad0f9a37" (UID: "9c028b7e-1aac-489d-9492-36d4ad0f9a37"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:25:16 crc kubenswrapper[4763]: I1205 12:25:16.671255 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c028b7e-1aac-489d-9492-36d4ad0f9a37-kube-api-access-x6qgw" (OuterVolumeSpecName: "kube-api-access-x6qgw") pod "9c028b7e-1aac-489d-9492-36d4ad0f9a37" (UID: "9c028b7e-1aac-489d-9492-36d4ad0f9a37"). InnerVolumeSpecName "kube-api-access-x6qgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:25:16 crc kubenswrapper[4763]: I1205 12:25:16.680124 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c028b7e-1aac-489d-9492-36d4ad0f9a37-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9c028b7e-1aac-489d-9492-36d4ad0f9a37" (UID: "9c028b7e-1aac-489d-9492-36d4ad0f9a37"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:25:16 crc kubenswrapper[4763]: I1205 12:25:16.763130 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6qgw\" (UniqueName: \"kubernetes.io/projected/9c028b7e-1aac-489d-9492-36d4ad0f9a37-kube-api-access-x6qgw\") on node \"crc\" DevicePath \"\"" Dec 05 12:25:16 crc kubenswrapper[4763]: I1205 12:25:16.763168 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c028b7e-1aac-489d-9492-36d4ad0f9a37-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 12:25:16 crc kubenswrapper[4763]: I1205 12:25:16.763179 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c028b7e-1aac-489d-9492-36d4ad0f9a37-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 12:25:17 crc kubenswrapper[4763]: I1205 12:25:17.037019 4763 generic.go:334] "Generic (PLEG): container finished" podID="9c028b7e-1aac-489d-9492-36d4ad0f9a37" containerID="d6402ef7b70649d297ee88b3df850e8c2a5a559948b30f69bff585f9c897294b" exitCode=0 Dec 05 12:25:17 crc kubenswrapper[4763]: I1205 12:25:17.037101 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fjwkg" Dec 05 12:25:17 crc kubenswrapper[4763]: I1205 12:25:17.037109 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fjwkg" event={"ID":"9c028b7e-1aac-489d-9492-36d4ad0f9a37","Type":"ContainerDied","Data":"d6402ef7b70649d297ee88b3df850e8c2a5a559948b30f69bff585f9c897294b"} Dec 05 12:25:17 crc kubenswrapper[4763]: I1205 12:25:17.037178 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fjwkg" event={"ID":"9c028b7e-1aac-489d-9492-36d4ad0f9a37","Type":"ContainerDied","Data":"a36f95305a532569e6081bd2c057536b911773cea39fc706ee04480b2d3fc083"} Dec 05 12:25:17 crc kubenswrapper[4763]: I1205 12:25:17.037203 4763 scope.go:117] "RemoveContainer" containerID="d6402ef7b70649d297ee88b3df850e8c2a5a559948b30f69bff585f9c897294b" Dec 05 12:25:17 crc kubenswrapper[4763]: I1205 12:25:17.062207 4763 scope.go:117] "RemoveContainer" containerID="a2fb086622a3acc47f1f3ee0f020b635d9f06d3a036738d60b3eefc47c218500" Dec 05 12:25:17 crc kubenswrapper[4763]: I1205 12:25:17.072333 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fjwkg"] Dec 05 12:25:17 crc kubenswrapper[4763]: I1205 12:25:17.079932 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fjwkg"] Dec 05 12:25:17 crc kubenswrapper[4763]: I1205 12:25:17.099906 4763 scope.go:117] "RemoveContainer" containerID="5e572ddb75cee5beb94585722eadfeaedef5c6aebe2e3474983dcfa755642185" Dec 05 12:25:17 crc kubenswrapper[4763]: I1205 12:25:17.151947 4763 scope.go:117] "RemoveContainer" containerID="d6402ef7b70649d297ee88b3df850e8c2a5a559948b30f69bff585f9c897294b" Dec 05 12:25:17 crc kubenswrapper[4763]: E1205 12:25:17.152493 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6402ef7b70649d297ee88b3df850e8c2a5a559948b30f69bff585f9c897294b\": container with ID starting with d6402ef7b70649d297ee88b3df850e8c2a5a559948b30f69bff585f9c897294b not found: ID does not exist" containerID="d6402ef7b70649d297ee88b3df850e8c2a5a559948b30f69bff585f9c897294b" Dec 05 12:25:17 crc kubenswrapper[4763]: I1205 12:25:17.152551 4763 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6402ef7b70649d297ee88b3df850e8c2a5a559948b30f69bff585f9c897294b"} err="failed to get container status \"d6402ef7b70649d297ee88b3df850e8c2a5a559948b30f69bff585f9c897294b\": rpc error: code = NotFound desc = could not find container \"d6402ef7b70649d297ee88b3df850e8c2a5a559948b30f69bff585f9c897294b\": container with ID starting with d6402ef7b70649d297ee88b3df850e8c2a5a559948b30f69bff585f9c897294b not found: ID does not exist" Dec 05 12:25:17 crc kubenswrapper[4763]: I1205 12:25:17.152592 4763 scope.go:117] "RemoveContainer" containerID="a2fb086622a3acc47f1f3ee0f020b635d9f06d3a036738d60b3eefc47c218500" Dec 05 12:25:17 crc kubenswrapper[4763]: E1205 12:25:17.152861 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2fb086622a3acc47f1f3ee0f020b635d9f06d3a036738d60b3eefc47c218500\": container with ID starting with a2fb086622a3acc47f1f3ee0f020b635d9f06d3a036738d60b3eefc47c218500 not found: ID does not exist" containerID="a2fb086622a3acc47f1f3ee0f020b635d9f06d3a036738d60b3eefc47c218500" Dec 05 12:25:17 crc kubenswrapper[4763]: I1205 12:25:17.152896 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2fb086622a3acc47f1f3ee0f020b635d9f06d3a036738d60b3eefc47c218500"} err="failed to get container status \"a2fb086622a3acc47f1f3ee0f020b635d9f06d3a036738d60b3eefc47c218500\": rpc error: code = NotFound desc = could not find container \"a2fb086622a3acc47f1f3ee0f020b635d9f06d3a036738d60b3eefc47c218500\": container with ID starting with a2fb086622a3acc47f1f3ee0f020b635d9f06d3a036738d60b3eefc47c218500 not found: ID does not exist" Dec 05 12:25:17 crc kubenswrapper[4763]: I1205 12:25:17.152916 4763 scope.go:117] "RemoveContainer" containerID="5e572ddb75cee5beb94585722eadfeaedef5c6aebe2e3474983dcfa755642185" Dec 05 12:25:17 crc kubenswrapper[4763]: E1205 12:25:17.153403 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e572ddb75cee5beb94585722eadfeaedef5c6aebe2e3474983dcfa755642185\": container with ID starting with 5e572ddb75cee5beb94585722eadfeaedef5c6aebe2e3474983dcfa755642185 not found: ID does not exist" containerID="5e572ddb75cee5beb94585722eadfeaedef5c6aebe2e3474983dcfa755642185" Dec 05 12:25:17 crc kubenswrapper[4763]: I1205 12:25:17.153444 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e572ddb75cee5beb94585722eadfeaedef5c6aebe2e3474983dcfa755642185"} err="failed to get container status \"5e572ddb75cee5beb94585722eadfeaedef5c6aebe2e3474983dcfa755642185\": rpc error: code = NotFound desc = could not find container \"5e572ddb75cee5beb94585722eadfeaedef5c6aebe2e3474983dcfa755642185\": container with ID starting with 5e572ddb75cee5beb94585722eadfeaedef5c6aebe2e3474983dcfa755642185 not found: ID does not exist" Dec 05 12:25:17 crc kubenswrapper[4763]: I1205 12:25:17.811159 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c028b7e-1aac-489d-9492-36d4ad0f9a37" path="/var/lib/kubelet/pods/9c028b7e-1aac-489d-9492-36d4ad0f9a37/volumes" Dec 05 12:25:43 crc kubenswrapper[4763]: I1205 12:25:43.413980 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-64c27"] Dec 05 12:25:43 crc kubenswrapper[4763]: E1205 12:25:43.416021 4763 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9c028b7e-1aac-489d-9492-36d4ad0f9a37" containerName="registry-server" Dec 05 12:25:43 crc kubenswrapper[4763]: I1205 12:25:43.416106 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c028b7e-1aac-489d-9492-36d4ad0f9a37" containerName="registry-server" Dec 05 12:25:43 crc kubenswrapper[4763]: E1205 12:25:43.416180 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c028b7e-1aac-489d-9492-36d4ad0f9a37" containerName="extract-content" Dec 05 12:25:43 crc kubenswrapper[4763]: I1205 12:25:43.416237 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c028b7e-1aac-489d-9492-36d4ad0f9a37" containerName="extract-content" Dec 05 12:25:43 crc kubenswrapper[4763]: E1205 12:25:43.416317 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c028b7e-1aac-489d-9492-36d4ad0f9a37" containerName="extract-utilities" Dec 05 12:25:43 crc kubenswrapper[4763]: I1205 12:25:43.416372 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c028b7e-1aac-489d-9492-36d4ad0f9a37" containerName="extract-utilities" Dec 05 12:25:43 crc kubenswrapper[4763]: I1205 12:25:43.416624 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c028b7e-1aac-489d-9492-36d4ad0f9a37" containerName="registry-server" Dec 05 12:25:43 crc kubenswrapper[4763]: I1205 12:25:43.418154 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-64c27" Dec 05 12:25:43 crc kubenswrapper[4763]: I1205 12:25:43.445388 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-64c27"] Dec 05 12:25:43 crc kubenswrapper[4763]: I1205 12:25:43.593534 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/082db68d-4aea-446e-a393-dc9760e4c8da-utilities\") pod \"community-operators-64c27\" (UID: \"082db68d-4aea-446e-a393-dc9760e4c8da\") " pod="openshift-marketplace/community-operators-64c27" Dec 05 12:25:43 crc kubenswrapper[4763]: I1205 12:25:43.593974 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w5gv\" (UniqueName: \"kubernetes.io/projected/082db68d-4aea-446e-a393-dc9760e4c8da-kube-api-access-8w5gv\") pod \"community-operators-64c27\" (UID: \"082db68d-4aea-446e-a393-dc9760e4c8da\") " pod="openshift-marketplace/community-operators-64c27" Dec 05 12:25:43 crc kubenswrapper[4763]: I1205 12:25:43.594171 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/082db68d-4aea-446e-a393-dc9760e4c8da-catalog-content\") pod \"community-operators-64c27\" (UID: \"082db68d-4aea-446e-a393-dc9760e4c8da\") " pod="openshift-marketplace/community-operators-64c27" Dec 05 12:25:43 crc kubenswrapper[4763]: I1205 12:25:43.695452 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w5gv\" (UniqueName: \"kubernetes.io/projected/082db68d-4aea-446e-a393-dc9760e4c8da-kube-api-access-8w5gv\") pod \"community-operators-64c27\" (UID: \"082db68d-4aea-446e-a393-dc9760e4c8da\") " pod="openshift-marketplace/community-operators-64c27" Dec 05 12:25:43 crc kubenswrapper[4763]: I1205 12:25:43.695537 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/082db68d-4aea-446e-a393-dc9760e4c8da-catalog-content\") 
pod \"community-operators-64c27\" (UID: \"082db68d-4aea-446e-a393-dc9760e4c8da\") " pod="openshift-marketplace/community-operators-64c27" Dec 05 12:25:43 crc kubenswrapper[4763]: I1205 12:25:43.695597 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/082db68d-4aea-446e-a393-dc9760e4c8da-utilities\") pod \"community-operators-64c27\" (UID: \"082db68d-4aea-446e-a393-dc9760e4c8da\") " pod="openshift-marketplace/community-operators-64c27" Dec 05 12:25:43 crc kubenswrapper[4763]: I1205 12:25:43.696202 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/082db68d-4aea-446e-a393-dc9760e4c8da-utilities\") pod \"community-operators-64c27\" (UID: \"082db68d-4aea-446e-a393-dc9760e4c8da\") " pod="openshift-marketplace/community-operators-64c27" Dec 05 12:25:43 crc kubenswrapper[4763]: I1205 12:25:43.696210 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/082db68d-4aea-446e-a393-dc9760e4c8da-catalog-content\") pod \"community-operators-64c27\" (UID: \"082db68d-4aea-446e-a393-dc9760e4c8da\") " pod="openshift-marketplace/community-operators-64c27" Dec 05 12:25:43 crc kubenswrapper[4763]: I1205 12:25:43.718548 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w5gv\" (UniqueName: \"kubernetes.io/projected/082db68d-4aea-446e-a393-dc9760e4c8da-kube-api-access-8w5gv\") pod \"community-operators-64c27\" (UID: \"082db68d-4aea-446e-a393-dc9760e4c8da\") " pod="openshift-marketplace/community-operators-64c27" Dec 05 12:25:43 crc kubenswrapper[4763]: I1205 12:25:43.744430 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-64c27" Dec 05 12:25:44 crc kubenswrapper[4763]: I1205 12:25:44.055605 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-phzv6"] Dec 05 12:25:44 crc kubenswrapper[4763]: I1205 12:25:44.059913 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-phzv6" Dec 05 12:25:44 crc kubenswrapper[4763]: I1205 12:25:44.087926 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-phzv6"] Dec 05 12:25:44 crc kubenswrapper[4763]: I1205 12:25:44.218011 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fdbbd02-16e6-41a3-be27-3843646c5a65-utilities\") pod \"certified-operators-phzv6\" (UID: \"0fdbbd02-16e6-41a3-be27-3843646c5a65\") " pod="openshift-marketplace/certified-operators-phzv6" Dec 05 12:25:44 crc kubenswrapper[4763]: I1205 12:25:44.218086 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j9dc\" (UniqueName: \"kubernetes.io/projected/0fdbbd02-16e6-41a3-be27-3843646c5a65-kube-api-access-9j9dc\") pod \"certified-operators-phzv6\" (UID: \"0fdbbd02-16e6-41a3-be27-3843646c5a65\") " pod="openshift-marketplace/certified-operators-phzv6" Dec 05 12:25:44 crc kubenswrapper[4763]: I1205 12:25:44.218199 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fdbbd02-16e6-41a3-be27-3843646c5a65-catalog-content\") pod \"certified-operators-phzv6\" (UID: \"0fdbbd02-16e6-41a3-be27-3843646c5a65\") " pod="openshift-marketplace/certified-operators-phzv6" Dec 05 12:25:44 crc kubenswrapper[4763]: I1205 12:25:44.273502 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-64c27"] Dec 05 12:25:44 crc kubenswrapper[4763]: I1205 12:25:44.321083 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fdbbd02-16e6-41a3-be27-3843646c5a65-utilities\") pod \"certified-operators-phzv6\" (UID: \"0fdbbd02-16e6-41a3-be27-3843646c5a65\") " pod="openshift-marketplace/certified-operators-phzv6" Dec 05 12:25:44 crc kubenswrapper[4763]: I1205 12:25:44.321161 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j9dc\" (UniqueName: \"kubernetes.io/projected/0fdbbd02-16e6-41a3-be27-3843646c5a65-kube-api-access-9j9dc\") pod \"certified-operators-phzv6\" (UID: \"0fdbbd02-16e6-41a3-be27-3843646c5a65\") " pod="openshift-marketplace/certified-operators-phzv6" Dec 05 12:25:44 crc kubenswrapper[4763]: I1205 12:25:44.321233 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fdbbd02-16e6-41a3-be27-3843646c5a65-catalog-content\") pod \"certified-operators-phzv6\" (UID: \"0fdbbd02-16e6-41a3-be27-3843646c5a65\") " pod="openshift-marketplace/certified-operators-phzv6" Dec 05 12:25:44 crc kubenswrapper[4763]: I1205 12:25:44.321747 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fdbbd02-16e6-41a3-be27-3843646c5a65-catalog-content\") pod \"certified-operators-phzv6\" (UID: \"0fdbbd02-16e6-41a3-be27-3843646c5a65\") " pod="openshift-marketplace/certified-operators-phzv6" Dec 05 12:25:44 crc kubenswrapper[4763]: I1205 12:25:44.322045 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fdbbd02-16e6-41a3-be27-3843646c5a65-utilities\") pod \"certified-operators-phzv6\" (UID: 
\"0fdbbd02-16e6-41a3-be27-3843646c5a65\") " pod="openshift-marketplace/certified-operators-phzv6" Dec 05 12:25:44 crc kubenswrapper[4763]: I1205 12:25:44.346229 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-64c27" event={"ID":"082db68d-4aea-446e-a393-dc9760e4c8da","Type":"ContainerStarted","Data":"c54cbe3717957857e9924452c25777bc55aeb407c6e206c9f16d33000d276347"} Dec 05 12:25:44 crc kubenswrapper[4763]: I1205 12:25:44.352261 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j9dc\" (UniqueName: \"kubernetes.io/projected/0fdbbd02-16e6-41a3-be27-3843646c5a65-kube-api-access-9j9dc\") pod \"certified-operators-phzv6\" (UID: \"0fdbbd02-16e6-41a3-be27-3843646c5a65\") " pod="openshift-marketplace/certified-operators-phzv6" Dec 05 12:25:44 crc kubenswrapper[4763]: I1205 12:25:44.405828 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-phzv6" Dec 05 12:25:45 crc kubenswrapper[4763]: I1205 12:25:45.040041 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-phzv6"] Dec 05 12:25:45 crc kubenswrapper[4763]: I1205 12:25:45.360971 4763 generic.go:334] "Generic (PLEG): container finished" podID="0fdbbd02-16e6-41a3-be27-3843646c5a65" containerID="c67c1f67775f83824f8a5c6026c69d0dc2b11e8be389128f2faa405be7a9d957" exitCode=0 Dec 05 12:25:45 crc kubenswrapper[4763]: I1205 12:25:45.361067 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-phzv6" event={"ID":"0fdbbd02-16e6-41a3-be27-3843646c5a65","Type":"ContainerDied","Data":"c67c1f67775f83824f8a5c6026c69d0dc2b11e8be389128f2faa405be7a9d957"} Dec 05 12:25:45 crc kubenswrapper[4763]: I1205 12:25:45.361570 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-phzv6" event={"ID":"0fdbbd02-16e6-41a3-be27-3843646c5a65","Type":"ContainerStarted","Data":"f0990d4db466284c436184e69f7ec0b914fc62c4c9c5424056fe6e890f372736"} Dec 05 12:25:45 crc kubenswrapper[4763]: I1205 12:25:45.365041 4763 generic.go:334] "Generic (PLEG): container finished" podID="082db68d-4aea-446e-a393-dc9760e4c8da" containerID="3f569e5dffe7105b741ad2bdf178dd963070534e979f9d5bf5a732d50f149c2f" exitCode=0 Dec 05 12:25:45 crc kubenswrapper[4763]: I1205 12:25:45.365097 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-64c27" event={"ID":"082db68d-4aea-446e-a393-dc9760e4c8da","Type":"ContainerDied","Data":"3f569e5dffe7105b741ad2bdf178dd963070534e979f9d5bf5a732d50f149c2f"} Dec 05 12:25:46 crc kubenswrapper[4763]: I1205 12:25:46.381041 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-64c27" event={"ID":"082db68d-4aea-446e-a393-dc9760e4c8da","Type":"ContainerStarted","Data":"ed646aab5ac2de1faf914647b7a34dde7fa4a05a1dbb2fecd5acdbf5829680fb"} Dec 05 12:25:47 crc kubenswrapper[4763]: I1205 12:25:47.398952 4763 generic.go:334] "Generic (PLEG): container finished" podID="082db68d-4aea-446e-a393-dc9760e4c8da" containerID="ed646aab5ac2de1faf914647b7a34dde7fa4a05a1dbb2fecd5acdbf5829680fb" exitCode=0 Dec 05 12:25:47 crc kubenswrapper[4763]: I1205 12:25:47.399070 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-64c27" 
event={"ID":"082db68d-4aea-446e-a393-dc9760e4c8da","Type":"ContainerDied","Data":"ed646aab5ac2de1faf914647b7a34dde7fa4a05a1dbb2fecd5acdbf5829680fb"} Dec 05 12:25:47 crc kubenswrapper[4763]: I1205 12:25:47.403718 4763 generic.go:334] "Generic (PLEG): container finished" podID="0fdbbd02-16e6-41a3-be27-3843646c5a65" containerID="e4fbc51cd54d9f22f21f3e3a49efc3beb62e6af776a1b186bc888060f5ba25b0" exitCode=0 Dec 05 12:25:47 crc kubenswrapper[4763]: I1205 12:25:47.403824 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-phzv6" event={"ID":"0fdbbd02-16e6-41a3-be27-3843646c5a65","Type":"ContainerDied","Data":"e4fbc51cd54d9f22f21f3e3a49efc3beb62e6af776a1b186bc888060f5ba25b0"} Dec 05 12:25:50 crc kubenswrapper[4763]: I1205 12:25:50.447084 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-phzv6" event={"ID":"0fdbbd02-16e6-41a3-be27-3843646c5a65","Type":"ContainerStarted","Data":"1d4c96f85975b3cca2e601609724275e3ef7a12d738bee3a50b15d5d10f97649"} Dec 05 12:25:50 crc kubenswrapper[4763]: I1205 12:25:50.450126 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-64c27" event={"ID":"082db68d-4aea-446e-a393-dc9760e4c8da","Type":"ContainerStarted","Data":"c856c888f4642f3fa6b8f5d212304edee5544f4aed15ae248d69a91d90235b74"} Dec 05 12:25:50 crc kubenswrapper[4763]: I1205 12:25:50.469074 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-phzv6" podStartSLOduration=3.7125857 podStartE2EDuration="7.469052279s" podCreationTimestamp="2025-12-05 12:25:43 +0000 UTC" firstStartedPulling="2025-12-05 12:25:45.364969594 +0000 UTC m=+2229.857684317" lastFinishedPulling="2025-12-05 12:25:49.121436173 +0000 UTC m=+2233.614150896" observedRunningTime="2025-12-05 12:25:50.467697214 +0000 UTC m=+2234.960411937" watchObservedRunningTime="2025-12-05 12:25:50.469052279 +0000 UTC m=+2234.961767002" Dec 05 12:25:53 crc kubenswrapper[4763]: I1205 12:25:53.744784 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-64c27" Dec 05 12:25:53 crc kubenswrapper[4763]: I1205 12:25:53.745345 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-64c27" Dec 05 12:25:53 crc kubenswrapper[4763]: I1205 12:25:53.842150 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-64c27" Dec 05 12:25:53 crc kubenswrapper[4763]: I1205 12:25:53.878132 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-64c27" podStartSLOduration=7.131716244 podStartE2EDuration="10.878107368s" podCreationTimestamp="2025-12-05 12:25:43 +0000 UTC" firstStartedPulling="2025-12-05 12:25:45.369215516 +0000 UTC m=+2229.861930239" lastFinishedPulling="2025-12-05 12:25:49.11560663 +0000 UTC m=+2233.608321363" observedRunningTime="2025-12-05 12:25:50.493405768 +0000 UTC m=+2234.986120501" watchObservedRunningTime="2025-12-05 12:25:53.878107368 +0000 UTC m=+2238.370822091" Dec 05 12:25:54 crc kubenswrapper[4763]: I1205 12:25:54.406942 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-phzv6" Dec 05 12:25:54 crc kubenswrapper[4763]: I1205 12:25:54.407006 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-phzv6" Dec 05 12:25:54 crc kubenswrapper[4763]: I1205 12:25:54.475166 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-phzv6" Dec 05 12:25:54 crc kubenswrapper[4763]: I1205 12:25:54.573121 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-phzv6" Dec 05 12:25:54 crc kubenswrapper[4763]: I1205 12:25:54.574970 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-64c27" Dec 05 12:25:56 crc kubenswrapper[4763]: I1205 12:25:56.000792 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-phzv6"] Dec 05 12:25:56 crc kubenswrapper[4763]: I1205 12:25:56.515059 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-phzv6" podUID="0fdbbd02-16e6-41a3-be27-3843646c5a65" containerName="registry-server" containerID="cri-o://1d4c96f85975b3cca2e601609724275e3ef7a12d738bee3a50b15d5d10f97649" gracePeriod=2 Dec 05 12:25:56 crc kubenswrapper[4763]: I1205 12:25:56.996597 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-64c27"] Dec 05 12:25:56 crc kubenswrapper[4763]: I1205 12:25:56.997547 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-64c27" podUID="082db68d-4aea-446e-a393-dc9760e4c8da" containerName="registry-server" containerID="cri-o://c856c888f4642f3fa6b8f5d212304edee5544f4aed15ae248d69a91d90235b74" gracePeriod=2 Dec 05 12:25:57 crc kubenswrapper[4763]: I1205 12:25:57.225987 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-phzv6" Dec 05 12:25:57 crc kubenswrapper[4763]: I1205 12:25:57.372661 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fdbbd02-16e6-41a3-be27-3843646c5a65-utilities\") pod \"0fdbbd02-16e6-41a3-be27-3843646c5a65\" (UID: \"0fdbbd02-16e6-41a3-be27-3843646c5a65\") " Dec 05 12:25:57 crc kubenswrapper[4763]: I1205 12:25:57.372874 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9j9dc\" (UniqueName: \"kubernetes.io/projected/0fdbbd02-16e6-41a3-be27-3843646c5a65-kube-api-access-9j9dc\") pod \"0fdbbd02-16e6-41a3-be27-3843646c5a65\" (UID: \"0fdbbd02-16e6-41a3-be27-3843646c5a65\") " Dec 05 12:25:57 crc kubenswrapper[4763]: I1205 12:25:57.372997 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fdbbd02-16e6-41a3-be27-3843646c5a65-catalog-content\") pod \"0fdbbd02-16e6-41a3-be27-3843646c5a65\" (UID: \"0fdbbd02-16e6-41a3-be27-3843646c5a65\") " Dec 05 12:25:57 crc kubenswrapper[4763]: I1205 12:25:57.393616 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fdbbd02-16e6-41a3-be27-3843646c5a65-utilities" (OuterVolumeSpecName: "utilities") pod "0fdbbd02-16e6-41a3-be27-3843646c5a65" (UID: "0fdbbd02-16e6-41a3-be27-3843646c5a65"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:25:57 crc kubenswrapper[4763]: I1205 12:25:57.414161 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fdbbd02-16e6-41a3-be27-3843646c5a65-kube-api-access-9j9dc" (OuterVolumeSpecName: "kube-api-access-9j9dc") pod "0fdbbd02-16e6-41a3-be27-3843646c5a65" (UID: "0fdbbd02-16e6-41a3-be27-3843646c5a65"). InnerVolumeSpecName "kube-api-access-9j9dc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:25:57 crc kubenswrapper[4763]: I1205 12:25:57.475889 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9j9dc\" (UniqueName: \"kubernetes.io/projected/0fdbbd02-16e6-41a3-be27-3843646c5a65-kube-api-access-9j9dc\") on node \"crc\" DevicePath \"\"" Dec 05 12:25:57 crc kubenswrapper[4763]: I1205 12:25:57.475918 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fdbbd02-16e6-41a3-be27-3843646c5a65-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 12:25:57 crc kubenswrapper[4763]: I1205 12:25:57.479933 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fdbbd02-16e6-41a3-be27-3843646c5a65-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0fdbbd02-16e6-41a3-be27-3843646c5a65" (UID: "0fdbbd02-16e6-41a3-be27-3843646c5a65"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:25:57 crc kubenswrapper[4763]: I1205 12:25:57.551240 4763 generic.go:334] "Generic (PLEG): container finished" podID="0fdbbd02-16e6-41a3-be27-3843646c5a65" containerID="1d4c96f85975b3cca2e601609724275e3ef7a12d738bee3a50b15d5d10f97649" exitCode=0 Dec 05 12:25:57 crc kubenswrapper[4763]: I1205 12:25:57.551418 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-phzv6" Dec 05 12:25:57 crc kubenswrapper[4763]: I1205 12:25:57.551881 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-phzv6" event={"ID":"0fdbbd02-16e6-41a3-be27-3843646c5a65","Type":"ContainerDied","Data":"1d4c96f85975b3cca2e601609724275e3ef7a12d738bee3a50b15d5d10f97649"} Dec 05 12:25:57 crc kubenswrapper[4763]: I1205 12:25:57.551981 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-phzv6" event={"ID":"0fdbbd02-16e6-41a3-be27-3843646c5a65","Type":"ContainerDied","Data":"f0990d4db466284c436184e69f7ec0b914fc62c4c9c5424056fe6e890f372736"} Dec 05 12:25:57 crc kubenswrapper[4763]: I1205 12:25:57.552006 4763 scope.go:117] "RemoveContainer" containerID="1d4c96f85975b3cca2e601609724275e3ef7a12d738bee3a50b15d5d10f97649" Dec 05 12:25:57 crc kubenswrapper[4763]: I1205 12:25:57.567625 4763 generic.go:334] "Generic (PLEG): container finished" podID="082db68d-4aea-446e-a393-dc9760e4c8da" containerID="c856c888f4642f3fa6b8f5d212304edee5544f4aed15ae248d69a91d90235b74" exitCode=0 Dec 05 12:25:57 crc kubenswrapper[4763]: I1205 12:25:57.567683 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-64c27" event={"ID":"082db68d-4aea-446e-a393-dc9760e4c8da","Type":"ContainerDied","Data":"c856c888f4642f3fa6b8f5d212304edee5544f4aed15ae248d69a91d90235b74"} Dec 05 12:25:57 crc kubenswrapper[4763]: I1205 12:25:57.581218 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fdbbd02-16e6-41a3-be27-3843646c5a65-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 12:25:57 crc kubenswrapper[4763]: I1205 12:25:57.601430 4763 scope.go:117] "RemoveContainer" containerID="e4fbc51cd54d9f22f21f3e3a49efc3beb62e6af776a1b186bc888060f5ba25b0" Dec 05 12:25:57 crc kubenswrapper[4763]: I1205 12:25:57.610420 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-phzv6"] Dec 05 12:25:57 crc kubenswrapper[4763]: I1205 12:25:57.621785 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-64c27" Dec 05 12:25:57 crc kubenswrapper[4763]: I1205 12:25:57.623181 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-phzv6"] Dec 05 12:25:57 crc kubenswrapper[4763]: I1205 12:25:57.629205 4763 scope.go:117] "RemoveContainer" containerID="c67c1f67775f83824f8a5c6026c69d0dc2b11e8be389128f2faa405be7a9d957" Dec 05 12:25:57 crc kubenswrapper[4763]: I1205 12:25:57.680712 4763 scope.go:117] "RemoveContainer" containerID="1d4c96f85975b3cca2e601609724275e3ef7a12d738bee3a50b15d5d10f97649" Dec 05 12:25:57 crc kubenswrapper[4763]: E1205 12:25:57.682923 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d4c96f85975b3cca2e601609724275e3ef7a12d738bee3a50b15d5d10f97649\": container with ID starting with 1d4c96f85975b3cca2e601609724275e3ef7a12d738bee3a50b15d5d10f97649 not found: ID does not exist" containerID="1d4c96f85975b3cca2e601609724275e3ef7a12d738bee3a50b15d5d10f97649" Dec 05 12:25:57 crc kubenswrapper[4763]: I1205 12:25:57.682975 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d4c96f85975b3cca2e601609724275e3ef7a12d738bee3a50b15d5d10f97649"} err="failed to get container status \"1d4c96f85975b3cca2e601609724275e3ef7a12d738bee3a50b15d5d10f97649\": rpc error: code = NotFound desc = could not find container \"1d4c96f85975b3cca2e601609724275e3ef7a12d738bee3a50b15d5d10f97649\": container with ID starting with 1d4c96f85975b3cca2e601609724275e3ef7a12d738bee3a50b15d5d10f97649 not found: ID does not exist" Dec 05 12:25:57 crc kubenswrapper[4763]: I1205 12:25:57.683006 4763 scope.go:117] "RemoveContainer" containerID="e4fbc51cd54d9f22f21f3e3a49efc3beb62e6af776a1b186bc888060f5ba25b0" Dec 05 12:25:57 crc kubenswrapper[4763]: E1205 12:25:57.683674 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4fbc51cd54d9f22f21f3e3a49efc3beb62e6af776a1b186bc888060f5ba25b0\": container with ID starting with e4fbc51cd54d9f22f21f3e3a49efc3beb62e6af776a1b186bc888060f5ba25b0 not found: ID does not exist" containerID="e4fbc51cd54d9f22f21f3e3a49efc3beb62e6af776a1b186bc888060f5ba25b0" Dec 05 12:25:57 crc kubenswrapper[4763]: I1205 12:25:57.683702 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4fbc51cd54d9f22f21f3e3a49efc3beb62e6af776a1b186bc888060f5ba25b0"} err="failed to get container status \"e4fbc51cd54d9f22f21f3e3a49efc3beb62e6af776a1b186bc888060f5ba25b0\": rpc error: code = NotFound desc = could not find container \"e4fbc51cd54d9f22f21f3e3a49efc3beb62e6af776a1b186bc888060f5ba25b0\": container with ID starting with e4fbc51cd54d9f22f21f3e3a49efc3beb62e6af776a1b186bc888060f5ba25b0 not found: ID does not exist" Dec 05 12:25:57 crc kubenswrapper[4763]: I1205 12:25:57.683720 4763 scope.go:117] "RemoveContainer" containerID="c67c1f67775f83824f8a5c6026c69d0dc2b11e8be389128f2faa405be7a9d957" Dec 05 12:25:57 crc kubenswrapper[4763]: E1205 12:25:57.684007 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c67c1f67775f83824f8a5c6026c69d0dc2b11e8be389128f2faa405be7a9d957\": container with ID starting with c67c1f67775f83824f8a5c6026c69d0dc2b11e8be389128f2faa405be7a9d957 not found: ID does not exist" containerID="c67c1f67775f83824f8a5c6026c69d0dc2b11e8be389128f2faa405be7a9d957" Dec 
05 12:25:57 crc kubenswrapper[4763]: I1205 12:25:57.684029 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c67c1f67775f83824f8a5c6026c69d0dc2b11e8be389128f2faa405be7a9d957"} err="failed to get container status \"c67c1f67775f83824f8a5c6026c69d0dc2b11e8be389128f2faa405be7a9d957\": rpc error: code = NotFound desc = could not find container \"c67c1f67775f83824f8a5c6026c69d0dc2b11e8be389128f2faa405be7a9d957\": container with ID starting with c67c1f67775f83824f8a5c6026c69d0dc2b11e8be389128f2faa405be7a9d957 not found: ID does not exist" Dec 05 12:25:57 crc kubenswrapper[4763]: I1205 12:25:57.784407 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/082db68d-4aea-446e-a393-dc9760e4c8da-utilities\") pod \"082db68d-4aea-446e-a393-dc9760e4c8da\" (UID: \"082db68d-4aea-446e-a393-dc9760e4c8da\") " Dec 05 12:25:57 crc kubenswrapper[4763]: I1205 12:25:57.785786 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/082db68d-4aea-446e-a393-dc9760e4c8da-utilities" (OuterVolumeSpecName: "utilities") pod "082db68d-4aea-446e-a393-dc9760e4c8da" (UID: "082db68d-4aea-446e-a393-dc9760e4c8da"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:25:57 crc kubenswrapper[4763]: I1205 12:25:57.789184 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8w5gv\" (UniqueName: \"kubernetes.io/projected/082db68d-4aea-446e-a393-dc9760e4c8da-kube-api-access-8w5gv\") pod \"082db68d-4aea-446e-a393-dc9760e4c8da\" (UID: \"082db68d-4aea-446e-a393-dc9760e4c8da\") " Dec 05 12:25:57 crc kubenswrapper[4763]: I1205 12:25:57.789288 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/082db68d-4aea-446e-a393-dc9760e4c8da-catalog-content\") pod \"082db68d-4aea-446e-a393-dc9760e4c8da\" (UID: \"082db68d-4aea-446e-a393-dc9760e4c8da\") " Dec 05 12:25:57 crc kubenswrapper[4763]: I1205 12:25:57.790939 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/082db68d-4aea-446e-a393-dc9760e4c8da-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 12:25:57 crc kubenswrapper[4763]: I1205 12:25:57.792724 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/082db68d-4aea-446e-a393-dc9760e4c8da-kube-api-access-8w5gv" (OuterVolumeSpecName: "kube-api-access-8w5gv") pod "082db68d-4aea-446e-a393-dc9760e4c8da" (UID: "082db68d-4aea-446e-a393-dc9760e4c8da"). InnerVolumeSpecName "kube-api-access-8w5gv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:25:57 crc kubenswrapper[4763]: I1205 12:25:57.806281 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fdbbd02-16e6-41a3-be27-3843646c5a65" path="/var/lib/kubelet/pods/0fdbbd02-16e6-41a3-be27-3843646c5a65/volumes" Dec 05 12:25:57 crc kubenswrapper[4763]: I1205 12:25:57.856034 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/082db68d-4aea-446e-a393-dc9760e4c8da-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "082db68d-4aea-446e-a393-dc9760e4c8da" (UID: "082db68d-4aea-446e-a393-dc9760e4c8da"). InnerVolumeSpecName "catalog-content". 
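The E1205 "ContainerStatus from runtime service failed ... NotFound" entries above look alarming but record a benign race: the kubelet asks CRI-O for the status of a container it has just removed, and the runtime reports it already gone. Callers driving container deletion generally want the same idempotency. A sketch in Go using the gRPC status package (the wrapper function is hypothetical, not kubelet code):

package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// deleteContainer is a hypothetical wrapper around a CRI RemoveContainer
// call: it treats NotFound as success, mirroring how the kubelet shrugs
// off the "could not find container" errors logged above.
func deleteContainer(remove func(id string) error, id string) error {
	if err := remove(id); err != nil {
		if status.Code(err) == codes.NotFound {
			return nil // already gone: deletion is idempotent
		}
		return fmt.Errorf("removing container %s: %w", id, err)
	}
	return nil
}

func main() {
	// Simulate the runtime answering NotFound, as CRI-O does above.
	gone := func(id string) error {
		return status.Error(codes.NotFound, "could not find container "+id)
	}
	fmt.Println(deleteContainer(gone, "1d4c96f85975b3cca2e601609724275e3ef7a12d738bee3a50b15d5d10f97649")) // <nil>
}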
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:25:57 crc kubenswrapper[4763]: I1205 12:25:57.892940 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8w5gv\" (UniqueName: \"kubernetes.io/projected/082db68d-4aea-446e-a393-dc9760e4c8da-kube-api-access-8w5gv\") on node \"crc\" DevicePath \"\"" Dec 05 12:25:57 crc kubenswrapper[4763]: I1205 12:25:57.892999 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/082db68d-4aea-446e-a393-dc9760e4c8da-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 12:25:58 crc kubenswrapper[4763]: I1205 12:25:58.584793 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-64c27" Dec 05 12:25:58 crc kubenswrapper[4763]: I1205 12:25:58.584750 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-64c27" event={"ID":"082db68d-4aea-446e-a393-dc9760e4c8da","Type":"ContainerDied","Data":"c54cbe3717957857e9924452c25777bc55aeb407c6e206c9f16d33000d276347"} Dec 05 12:25:58 crc kubenswrapper[4763]: I1205 12:25:58.585669 4763 scope.go:117] "RemoveContainer" containerID="c856c888f4642f3fa6b8f5d212304edee5544f4aed15ae248d69a91d90235b74" Dec 05 12:25:58 crc kubenswrapper[4763]: I1205 12:25:58.615081 4763 scope.go:117] "RemoveContainer" containerID="ed646aab5ac2de1faf914647b7a34dde7fa4a05a1dbb2fecd5acdbf5829680fb" Dec 05 12:25:58 crc kubenswrapper[4763]: I1205 12:25:58.678634 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-64c27"] Dec 05 12:25:58 crc kubenswrapper[4763]: I1205 12:25:58.688544 4763 scope.go:117] "RemoveContainer" containerID="3f569e5dffe7105b741ad2bdf178dd963070534e979f9d5bf5a732d50f149c2f" Dec 05 12:25:58 crc kubenswrapper[4763]: I1205 12:25:58.690968 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-64c27"] Dec 05 12:25:59 crc kubenswrapper[4763]: I1205 12:25:59.795585 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="082db68d-4aea-446e-a393-dc9760e4c8da" path="/var/lib/kubelet/pods/082db68d-4aea-446e-a393-dc9760e4c8da/volumes" Dec 05 12:26:07 crc kubenswrapper[4763]: I1205 12:26:07.544030 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 12:26:07 crc kubenswrapper[4763]: I1205 12:26:07.544592 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 12:26:22 crc kubenswrapper[4763]: I1205 12:26:22.833347 4763 generic.go:334] "Generic (PLEG): container finished" podID="688c0399-83be-44e3-adc0-4288525a9f4b" containerID="34952514985ae993b1fb4d0048696092b69e35626c8c7db3fdebbc3587ffcb79" exitCode=0 Dec 05 12:26:22 crc kubenswrapper[4763]: I1205 12:26:22.833394 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sl4g2" 
event={"ID":"688c0399-83be-44e3-adc0-4288525a9f4b","Type":"ContainerDied","Data":"34952514985ae993b1fb4d0048696092b69e35626c8c7db3fdebbc3587ffcb79"} Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.281011 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sl4g2" Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.377280 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/688c0399-83be-44e3-adc0-4288525a9f4b-inventory\") pod \"688c0399-83be-44e3-adc0-4288525a9f4b\" (UID: \"688c0399-83be-44e3-adc0-4288525a9f4b\") " Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.377345 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fscnx\" (UniqueName: \"kubernetes.io/projected/688c0399-83be-44e3-adc0-4288525a9f4b-kube-api-access-fscnx\") pod \"688c0399-83be-44e3-adc0-4288525a9f4b\" (UID: \"688c0399-83be-44e3-adc0-4288525a9f4b\") " Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.377364 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/688c0399-83be-44e3-adc0-4288525a9f4b-ssh-key\") pod \"688c0399-83be-44e3-adc0-4288525a9f4b\" (UID: \"688c0399-83be-44e3-adc0-4288525a9f4b\") " Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.377608 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688c0399-83be-44e3-adc0-4288525a9f4b-ovn-combined-ca-bundle\") pod \"688c0399-83be-44e3-adc0-4288525a9f4b\" (UID: \"688c0399-83be-44e3-adc0-4288525a9f4b\") " Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.377650 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/688c0399-83be-44e3-adc0-4288525a9f4b-ovncontroller-config-0\") pod \"688c0399-83be-44e3-adc0-4288525a9f4b\" (UID: \"688c0399-83be-44e3-adc0-4288525a9f4b\") " Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.383897 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/688c0399-83be-44e3-adc0-4288525a9f4b-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "688c0399-83be-44e3-adc0-4288525a9f4b" (UID: "688c0399-83be-44e3-adc0-4288525a9f4b"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.384149 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/688c0399-83be-44e3-adc0-4288525a9f4b-kube-api-access-fscnx" (OuterVolumeSpecName: "kube-api-access-fscnx") pod "688c0399-83be-44e3-adc0-4288525a9f4b" (UID: "688c0399-83be-44e3-adc0-4288525a9f4b"). InnerVolumeSpecName "kube-api-access-fscnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.405467 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/688c0399-83be-44e3-adc0-4288525a9f4b-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "688c0399-83be-44e3-adc0-4288525a9f4b" (UID: "688c0399-83be-44e3-adc0-4288525a9f4b"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.407564 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/688c0399-83be-44e3-adc0-4288525a9f4b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "688c0399-83be-44e3-adc0-4288525a9f4b" (UID: "688c0399-83be-44e3-adc0-4288525a9f4b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.411677 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/688c0399-83be-44e3-adc0-4288525a9f4b-inventory" (OuterVolumeSpecName: "inventory") pod "688c0399-83be-44e3-adc0-4288525a9f4b" (UID: "688c0399-83be-44e3-adc0-4288525a9f4b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.480075 4763 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688c0399-83be-44e3-adc0-4288525a9f4b-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.480134 4763 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/688c0399-83be-44e3-adc0-4288525a9f4b-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.480148 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/688c0399-83be-44e3-adc0-4288525a9f4b-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.480161 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fscnx\" (UniqueName: \"kubernetes.io/projected/688c0399-83be-44e3-adc0-4288525a9f4b-kube-api-access-fscnx\") on node \"crc\" DevicePath \"\"" Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.480174 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/688c0399-83be-44e3-adc0-4288525a9f4b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.854940 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sl4g2" event={"ID":"688c0399-83be-44e3-adc0-4288525a9f4b","Type":"ContainerDied","Data":"2e9e0c3c26e602ea16987b79582182e29bd68c2d875c7267aed24f1fa210f7ee"} Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.855221 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e9e0c3c26e602ea16987b79582182e29bd68c2d875c7267aed24f1fa210f7ee" Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.855331 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sl4g2" Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.960694 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5"] Dec 05 12:26:24 crc kubenswrapper[4763]: E1205 12:26:24.961315 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="082db68d-4aea-446e-a393-dc9760e4c8da" containerName="extract-content" Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.961341 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="082db68d-4aea-446e-a393-dc9760e4c8da" containerName="extract-content" Dec 05 12:26:24 crc kubenswrapper[4763]: E1205 12:26:24.961365 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="688c0399-83be-44e3-adc0-4288525a9f4b" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.961373 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="688c0399-83be-44e3-adc0-4288525a9f4b" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 05 12:26:24 crc kubenswrapper[4763]: E1205 12:26:24.961381 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="082db68d-4aea-446e-a393-dc9760e4c8da" containerName="extract-utilities" Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.961388 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="082db68d-4aea-446e-a393-dc9760e4c8da" containerName="extract-utilities" Dec 05 12:26:24 crc kubenswrapper[4763]: E1205 12:26:24.961411 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fdbbd02-16e6-41a3-be27-3843646c5a65" containerName="extract-utilities" Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.961418 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fdbbd02-16e6-41a3-be27-3843646c5a65" containerName="extract-utilities" Dec 05 12:26:24 crc kubenswrapper[4763]: E1205 12:26:24.961436 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="082db68d-4aea-446e-a393-dc9760e4c8da" containerName="registry-server" Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.961444 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="082db68d-4aea-446e-a393-dc9760e4c8da" containerName="registry-server" Dec 05 12:26:24 crc kubenswrapper[4763]: E1205 12:26:24.961462 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fdbbd02-16e6-41a3-be27-3843646c5a65" containerName="registry-server" Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.961468 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fdbbd02-16e6-41a3-be27-3843646c5a65" containerName="registry-server" Dec 05 12:26:24 crc kubenswrapper[4763]: E1205 12:26:24.961482 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fdbbd02-16e6-41a3-be27-3843646c5a65" containerName="extract-content" Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.961487 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fdbbd02-16e6-41a3-be27-3843646c5a65" containerName="extract-content" Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.961685 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="688c0399-83be-44e3-adc0-4288525a9f4b" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.961702 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fdbbd02-16e6-41a3-be27-3843646c5a65" containerName="registry-server" Dec 05 
Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.962481 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5"
Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.965951 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.967155 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config"
Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.967449 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config"
Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.967747 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.968021 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.971901 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5"]
Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.973512 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-s7px5"
Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.989665 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d790dbae-6bb4-4b37-b9bd-0ba454c8fa83-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5\" (UID: \"d790dbae-6bb4-4b37-b9bd-0ba454c8fa83\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5"
Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.989901 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d790dbae-6bb4-4b37-b9bd-0ba454c8fa83-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5\" (UID: \"d790dbae-6bb4-4b37-b9bd-0ba454c8fa83\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5"
Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.989929 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d790dbae-6bb4-4b37-b9bd-0ba454c8fa83-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5\" (UID: \"d790dbae-6bb4-4b37-b9bd-0ba454c8fa83\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5"
Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.990273 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djfvh\" (UniqueName: \"kubernetes.io/projected/d790dbae-6bb4-4b37-b9bd-0ba454c8fa83-kube-api-access-djfvh\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5\" (UID: \"d790dbae-6bb4-4b37-b9bd-0ba454c8fa83\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5"
Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.990489 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d790dbae-6bb4-4b37-b9bd-0ba454c8fa83-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5\" (UID: \"d790dbae-6bb4-4b37-b9bd-0ba454c8fa83\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5"
Dec 05 12:26:24 crc kubenswrapper[4763]: I1205 12:26:24.990586 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d790dbae-6bb4-4b37-b9bd-0ba454c8fa83-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5\" (UID: \"d790dbae-6bb4-4b37-b9bd-0ba454c8fa83\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5"
Dec 05 12:26:25 crc kubenswrapper[4763]: I1205 12:26:25.093217 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d790dbae-6bb4-4b37-b9bd-0ba454c8fa83-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5\" (UID: \"d790dbae-6bb4-4b37-b9bd-0ba454c8fa83\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5"
Dec 05 12:26:25 crc kubenswrapper[4763]: I1205 12:26:25.093328 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d790dbae-6bb4-4b37-b9bd-0ba454c8fa83-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5\" (UID: \"d790dbae-6bb4-4b37-b9bd-0ba454c8fa83\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5"
Dec 05 12:26:25 crc kubenswrapper[4763]: I1205 12:26:25.093357 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d790dbae-6bb4-4b37-b9bd-0ba454c8fa83-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5\" (UID: \"d790dbae-6bb4-4b37-b9bd-0ba454c8fa83\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5"
Dec 05 12:26:25 crc kubenswrapper[4763]: I1205 12:26:25.093461 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djfvh\" (UniqueName: \"kubernetes.io/projected/d790dbae-6bb4-4b37-b9bd-0ba454c8fa83-kube-api-access-djfvh\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5\" (UID: \"d790dbae-6bb4-4b37-b9bd-0ba454c8fa83\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5"
Dec 05 12:26:25 crc kubenswrapper[4763]: I1205 12:26:25.093517 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d790dbae-6bb4-4b37-b9bd-0ba454c8fa83-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5\" (UID: \"d790dbae-6bb4-4b37-b9bd-0ba454c8fa83\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5"
Dec 05 12:26:25 crc kubenswrapper[4763]: I1205 12:26:25.093554 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d790dbae-6bb4-4b37-b9bd-0ba454c8fa83-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5\" (UID: \"d790dbae-6bb4-4b37-b9bd-0ba454c8fa83\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5"
Dec 05 12:26:25 crc kubenswrapper[4763]: I1205 12:26:25.098360 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d790dbae-6bb4-4b37-b9bd-0ba454c8fa83-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5\" (UID: \"d790dbae-6bb4-4b37-b9bd-0ba454c8fa83\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5"
Dec 05 12:26:25 crc kubenswrapper[4763]: I1205 12:26:25.098591 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d790dbae-6bb4-4b37-b9bd-0ba454c8fa83-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5\" (UID: \"d790dbae-6bb4-4b37-b9bd-0ba454c8fa83\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5"
Dec 05 12:26:25 crc kubenswrapper[4763]: I1205 12:26:25.098824 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d790dbae-6bb4-4b37-b9bd-0ba454c8fa83-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5\" (UID: \"d790dbae-6bb4-4b37-b9bd-0ba454c8fa83\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5"
Dec 05 12:26:25 crc kubenswrapper[4763]: I1205 12:26:25.099240 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d790dbae-6bb4-4b37-b9bd-0ba454c8fa83-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5\" (UID: \"d790dbae-6bb4-4b37-b9bd-0ba454c8fa83\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5"
Dec 05 12:26:25 crc kubenswrapper[4763]: I1205 12:26:25.101210 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d790dbae-6bb4-4b37-b9bd-0ba454c8fa83-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5\" (UID: \"d790dbae-6bb4-4b37-b9bd-0ba454c8fa83\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5"
Dec 05 12:26:25 crc kubenswrapper[4763]: I1205 12:26:25.114074 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djfvh\" (UniqueName: \"kubernetes.io/projected/d790dbae-6bb4-4b37-b9bd-0ba454c8fa83-kube-api-access-djfvh\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5\" (UID: \"d790dbae-6bb4-4b37-b9bd-0ba454c8fa83\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5"
Dec 05 12:26:25 crc kubenswrapper[4763]: I1205 12:26:25.289507 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5"
Dec 05 12:26:25 crc kubenswrapper[4763]: I1205 12:26:25.639588 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5"]
Dec 05 12:26:25 crc kubenswrapper[4763]: I1205 12:26:25.865429 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5" event={"ID":"d790dbae-6bb4-4b37-b9bd-0ba454c8fa83","Type":"ContainerStarted","Data":"828acb2ee2ed21e9c0efde12cfca0c6d59c53b5c2153943a0778623122de1f91"}
Dec 05 12:26:26 crc kubenswrapper[4763]: I1205 12:26:26.878547 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5" event={"ID":"d790dbae-6bb4-4b37-b9bd-0ba454c8fa83","Type":"ContainerStarted","Data":"411c78a61337daa578ae2c4a2b181875d585a9aad66e116d722887bdc2facf40"}
Dec 05 12:26:26 crc kubenswrapper[4763]: I1205 12:26:26.901018 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5" podStartSLOduration=2.471461798 podStartE2EDuration="2.901000604s" podCreationTimestamp="2025-12-05 12:26:24 +0000 UTC" firstStartedPulling="2025-12-05 12:26:25.643022772 +0000 UTC m=+2270.135737495" lastFinishedPulling="2025-12-05 12:26:26.072561578 +0000 UTC m=+2270.565276301" observedRunningTime="2025-12-05 12:26:26.899504485 +0000 UTC m=+2271.392219208" watchObservedRunningTime="2025-12-05 12:26:26.901000604 +0000 UTC m=+2271.393715327"
Dec 05 12:26:37 crc kubenswrapper[4763]: I1205 12:26:37.544320 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 12:26:37 crc kubenswrapper[4763]: I1205 12:26:37.544932 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 12:27:07 crc kubenswrapper[4763]: I1205 12:27:07.543844 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 12:27:07 crc kubenswrapper[4763]: I1205 12:27:07.544485 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 12:27:07 crc kubenswrapper[4763]: I1205 12:27:07.544543 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpgln"
Dec 05 12:27:07 crc kubenswrapper[4763]: I1205 12:27:07.545388 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ce2d3cf073a51bbeacc4e57eda2a379937fbbfe390bdbb19538e8b18beb183ad"} pod="openshift-machine-config-operator/machine-config-daemon-xpgln" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 05 12:27:07 crc kubenswrapper[4763]: I1205 12:27:07.545435 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" containerID="cri-o://ce2d3cf073a51bbeacc4e57eda2a379937fbbfe390bdbb19538e8b18beb183ad" gracePeriod=600
Dec 05 12:27:07 crc kubenswrapper[4763]: E1205 12:27:07.670483 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb"
Dec 05 12:27:08 crc kubenswrapper[4763]: I1205 12:27:08.319505 4763 generic.go:334] "Generic (PLEG): container finished" podID="96338136-6831-49d0-9eb9-77d1205c6afb" containerID="ce2d3cf073a51bbeacc4e57eda2a379937fbbfe390bdbb19538e8b18beb183ad" exitCode=0
Dec 05 12:27:08 crc kubenswrapper[4763]: I1205 12:27:08.319588 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerDied","Data":"ce2d3cf073a51bbeacc4e57eda2a379937fbbfe390bdbb19538e8b18beb183ad"}
Dec 05 12:27:08 crc kubenswrapper[4763]: I1205 12:27:08.319922 4763 scope.go:117] "RemoveContainer" containerID="76e250e7fae7bad440d61476e103ba45c9e512b4480887904c80c0da7acc1264"
Dec 05 12:27:08 crc kubenswrapper[4763]: I1205 12:27:08.320594 4763 scope.go:117] "RemoveContainer" containerID="ce2d3cf073a51bbeacc4e57eda2a379937fbbfe390bdbb19538e8b18beb183ad"
Dec 05 12:27:08 crc kubenswrapper[4763]: E1205 12:27:08.320903 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb"
Dec 05 12:27:16 crc kubenswrapper[4763]: I1205 12:27:16.407381 4763 generic.go:334] "Generic (PLEG): container finished" podID="d790dbae-6bb4-4b37-b9bd-0ba454c8fa83" containerID="411c78a61337daa578ae2c4a2b181875d585a9aad66e116d722887bdc2facf40" exitCode=0
Dec 05 12:27:16 crc kubenswrapper[4763]: I1205 12:27:16.407571 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5" event={"ID":"d790dbae-6bb4-4b37-b9bd-0ba454c8fa83","Type":"ContainerDied","Data":"411c78a61337daa578ae2c4a2b181875d585a9aad66e116d722887bdc2facf40"}
Dec 05 12:27:17 crc kubenswrapper[4763]: I1205 12:27:17.894883 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5"
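The machine-config-daemon sequence above is the standard liveness-failure path: repeated probe failures (12:26:07, 12:26:37, 12:27:07), a kill with gracePeriod=600, then "Error syncing pod" at the CrashLoopBackOff "back-off 5m0s" ceiling. The kubelet's restart back-off doubles from an initial delay up to that cap, so a persistently failing container settles into roughly one restart attempt every five minutes. A sketch of the progression, assuming the commonly cited kubelet defaults of a 10s initial delay and 5m cap:

package main

import (
	"fmt"
	"time"
)

// Back-off progression for a crash-looping container, assuming a 10s
// initial delay doubling to a 5m cap (the "back-off 5m0s" in the log
// above is that cap).
func main() {
	const initial, maxDelay = 10 * time.Second, 5 * time.Minute
	delay := initial
	for i := 1; i <= 8; i++ {
		fmt.Printf("restart %d: wait %v\n", i, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
	// prints: 10s, 20s, 40s, 1m20s, 2m40s, then 5m0s from the sixth attempt on
}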
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5" Dec 05 12:27:17 crc kubenswrapper[4763]: I1205 12:27:17.935618 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djfvh\" (UniqueName: \"kubernetes.io/projected/d790dbae-6bb4-4b37-b9bd-0ba454c8fa83-kube-api-access-djfvh\") pod \"d790dbae-6bb4-4b37-b9bd-0ba454c8fa83\" (UID: \"d790dbae-6bb4-4b37-b9bd-0ba454c8fa83\") " Dec 05 12:27:17 crc kubenswrapper[4763]: I1205 12:27:17.935675 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d790dbae-6bb4-4b37-b9bd-0ba454c8fa83-neutron-metadata-combined-ca-bundle\") pod \"d790dbae-6bb4-4b37-b9bd-0ba454c8fa83\" (UID: \"d790dbae-6bb4-4b37-b9bd-0ba454c8fa83\") " Dec 05 12:27:17 crc kubenswrapper[4763]: I1205 12:27:17.935734 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d790dbae-6bb4-4b37-b9bd-0ba454c8fa83-neutron-ovn-metadata-agent-neutron-config-0\") pod \"d790dbae-6bb4-4b37-b9bd-0ba454c8fa83\" (UID: \"d790dbae-6bb4-4b37-b9bd-0ba454c8fa83\") " Dec 05 12:27:17 crc kubenswrapper[4763]: I1205 12:27:17.935939 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d790dbae-6bb4-4b37-b9bd-0ba454c8fa83-ssh-key\") pod \"d790dbae-6bb4-4b37-b9bd-0ba454c8fa83\" (UID: \"d790dbae-6bb4-4b37-b9bd-0ba454c8fa83\") " Dec 05 12:27:17 crc kubenswrapper[4763]: I1205 12:27:17.935997 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d790dbae-6bb4-4b37-b9bd-0ba454c8fa83-nova-metadata-neutron-config-0\") pod \"d790dbae-6bb4-4b37-b9bd-0ba454c8fa83\" (UID: \"d790dbae-6bb4-4b37-b9bd-0ba454c8fa83\") " Dec 05 12:27:17 crc kubenswrapper[4763]: I1205 12:27:17.936123 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d790dbae-6bb4-4b37-b9bd-0ba454c8fa83-inventory\") pod \"d790dbae-6bb4-4b37-b9bd-0ba454c8fa83\" (UID: \"d790dbae-6bb4-4b37-b9bd-0ba454c8fa83\") " Dec 05 12:27:17 crc kubenswrapper[4763]: I1205 12:27:17.944359 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d790dbae-6bb4-4b37-b9bd-0ba454c8fa83-kube-api-access-djfvh" (OuterVolumeSpecName: "kube-api-access-djfvh") pod "d790dbae-6bb4-4b37-b9bd-0ba454c8fa83" (UID: "d790dbae-6bb4-4b37-b9bd-0ba454c8fa83"). InnerVolumeSpecName "kube-api-access-djfvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:27:17 crc kubenswrapper[4763]: I1205 12:27:17.944795 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d790dbae-6bb4-4b37-b9bd-0ba454c8fa83-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "d790dbae-6bb4-4b37-b9bd-0ba454c8fa83" (UID: "d790dbae-6bb4-4b37-b9bd-0ba454c8fa83"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:27:17 crc kubenswrapper[4763]: I1205 12:27:17.965378 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d790dbae-6bb4-4b37-b9bd-0ba454c8fa83-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "d790dbae-6bb4-4b37-b9bd-0ba454c8fa83" (UID: "d790dbae-6bb4-4b37-b9bd-0ba454c8fa83"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:27:17 crc kubenswrapper[4763]: I1205 12:27:17.973454 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d790dbae-6bb4-4b37-b9bd-0ba454c8fa83-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "d790dbae-6bb4-4b37-b9bd-0ba454c8fa83" (UID: "d790dbae-6bb4-4b37-b9bd-0ba454c8fa83"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:27:17 crc kubenswrapper[4763]: I1205 12:27:17.975834 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d790dbae-6bb4-4b37-b9bd-0ba454c8fa83-inventory" (OuterVolumeSpecName: "inventory") pod "d790dbae-6bb4-4b37-b9bd-0ba454c8fa83" (UID: "d790dbae-6bb4-4b37-b9bd-0ba454c8fa83"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:27:17 crc kubenswrapper[4763]: I1205 12:27:17.987281 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d790dbae-6bb4-4b37-b9bd-0ba454c8fa83-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d790dbae-6bb4-4b37-b9bd-0ba454c8fa83" (UID: "d790dbae-6bb4-4b37-b9bd-0ba454c8fa83"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:27:18 crc kubenswrapper[4763]: I1205 12:27:18.038180 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d790dbae-6bb4-4b37-b9bd-0ba454c8fa83-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 12:27:18 crc kubenswrapper[4763]: I1205 12:27:18.038267 4763 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d790dbae-6bb4-4b37-b9bd-0ba454c8fa83-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 12:27:18 crc kubenswrapper[4763]: I1205 12:27:18.038301 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d790dbae-6bb4-4b37-b9bd-0ba454c8fa83-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 12:27:18 crc kubenswrapper[4763]: I1205 12:27:18.038313 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djfvh\" (UniqueName: \"kubernetes.io/projected/d790dbae-6bb4-4b37-b9bd-0ba454c8fa83-kube-api-access-djfvh\") on node \"crc\" DevicePath \"\"" Dec 05 12:27:18 crc kubenswrapper[4763]: I1205 12:27:18.038322 4763 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d790dbae-6bb4-4b37-b9bd-0ba454c8fa83-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:27:18 crc kubenswrapper[4763]: I1205 12:27:18.038336 4763 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d790dbae-6bb4-4b37-b9bd-0ba454c8fa83-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 12:27:18 crc kubenswrapper[4763]: I1205 12:27:18.429153 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5" event={"ID":"d790dbae-6bb4-4b37-b9bd-0ba454c8fa83","Type":"ContainerDied","Data":"828acb2ee2ed21e9c0efde12cfca0c6d59c53b5c2153943a0778623122de1f91"} Dec 05 12:27:18 crc kubenswrapper[4763]: I1205 12:27:18.429216 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="828acb2ee2ed21e9c0efde12cfca0c6d59c53b5c2153943a0778623122de1f91" Dec 05 12:27:18 crc kubenswrapper[4763]: I1205 12:27:18.429295 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5" Dec 05 12:27:18 crc kubenswrapper[4763]: I1205 12:27:18.620689 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4"] Dec 05 12:27:18 crc kubenswrapper[4763]: E1205 12:27:18.621523 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d790dbae-6bb4-4b37-b9bd-0ba454c8fa83" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 05 12:27:18 crc kubenswrapper[4763]: I1205 12:27:18.621538 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d790dbae-6bb4-4b37-b9bd-0ba454c8fa83" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 05 12:27:18 crc kubenswrapper[4763]: I1205 12:27:18.621752 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d790dbae-6bb4-4b37-b9bd-0ba454c8fa83" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 05 12:27:18 crc kubenswrapper[4763]: I1205 12:27:18.622543 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4" Dec 05 12:27:18 crc kubenswrapper[4763]: I1205 12:27:18.624575 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 05 12:27:18 crc kubenswrapper[4763]: I1205 12:27:18.625332 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 12:27:18 crc kubenswrapper[4763]: I1205 12:27:18.625745 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-s7px5" Dec 05 12:27:18 crc kubenswrapper[4763]: I1205 12:27:18.625812 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 12:27:18 crc kubenswrapper[4763]: I1205 12:27:18.626478 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 12:27:18 crc kubenswrapper[4763]: I1205 12:27:18.633613 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4"] Dec 05 12:27:18 crc kubenswrapper[4763]: I1205 12:27:18.754166 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4355ed47-63c1-47e1-81e6-33d33f89b5a7-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4\" (UID: \"4355ed47-63c1-47e1-81e6-33d33f89b5a7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4" Dec 05 12:27:18 crc kubenswrapper[4763]: I1205 12:27:18.754448 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/4355ed47-63c1-47e1-81e6-33d33f89b5a7-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4\" (UID: \"4355ed47-63c1-47e1-81e6-33d33f89b5a7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4" Dec 05 12:27:18 crc kubenswrapper[4763]: I1205 12:27:18.754586 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4355ed47-63c1-47e1-81e6-33d33f89b5a7-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4\" (UID: \"4355ed47-63c1-47e1-81e6-33d33f89b5a7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4" Dec 05 12:27:18 crc kubenswrapper[4763]: I1205 12:27:18.754827 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4355ed47-63c1-47e1-81e6-33d33f89b5a7-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4\" (UID: \"4355ed47-63c1-47e1-81e6-33d33f89b5a7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4" Dec 05 12:27:18 crc kubenswrapper[4763]: I1205 12:27:18.754875 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtkbj\" (UniqueName: \"kubernetes.io/projected/4355ed47-63c1-47e1-81e6-33d33f89b5a7-kube-api-access-mtkbj\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4\" (UID: \"4355ed47-63c1-47e1-81e6-33d33f89b5a7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4" Dec 05 12:27:18 crc kubenswrapper[4763]: I1205 12:27:18.860038 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/4355ed47-63c1-47e1-81e6-33d33f89b5a7-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4\" (UID: \"4355ed47-63c1-47e1-81e6-33d33f89b5a7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4" Dec 05 12:27:18 crc kubenswrapper[4763]: I1205 12:27:18.861829 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtkbj\" (UniqueName: \"kubernetes.io/projected/4355ed47-63c1-47e1-81e6-33d33f89b5a7-kube-api-access-mtkbj\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4\" (UID: \"4355ed47-63c1-47e1-81e6-33d33f89b5a7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4" Dec 05 12:27:18 crc kubenswrapper[4763]: I1205 12:27:18.862219 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4355ed47-63c1-47e1-81e6-33d33f89b5a7-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4\" (UID: \"4355ed47-63c1-47e1-81e6-33d33f89b5a7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4" Dec 05 12:27:18 crc kubenswrapper[4763]: I1205 12:27:18.862608 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/4355ed47-63c1-47e1-81e6-33d33f89b5a7-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4\" (UID: \"4355ed47-63c1-47e1-81e6-33d33f89b5a7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4" Dec 05 12:27:18 crc kubenswrapper[4763]: I1205 12:27:18.863080 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4355ed47-63c1-47e1-81e6-33d33f89b5a7-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4\" (UID: \"4355ed47-63c1-47e1-81e6-33d33f89b5a7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4" Dec 05 12:27:18 crc kubenswrapper[4763]: I1205 12:27:18.865519 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4355ed47-63c1-47e1-81e6-33d33f89b5a7-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4\" (UID: \"4355ed47-63c1-47e1-81e6-33d33f89b5a7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4" Dec 05 12:27:18 crc kubenswrapper[4763]: I1205 12:27:18.865519 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4355ed47-63c1-47e1-81e6-33d33f89b5a7-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4\" (UID: \"4355ed47-63c1-47e1-81e6-33d33f89b5a7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4" Dec 05 12:27:18 crc kubenswrapper[4763]: I1205 12:27:18.866706 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/4355ed47-63c1-47e1-81e6-33d33f89b5a7-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4\" (UID: \"4355ed47-63c1-47e1-81e6-33d33f89b5a7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4" Dec 05 12:27:18 crc kubenswrapper[4763]: I1205 12:27:18.868058 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4355ed47-63c1-47e1-81e6-33d33f89b5a7-libvirt-combined-ca-bundle\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4\" (UID: \"4355ed47-63c1-47e1-81e6-33d33f89b5a7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4" Dec 05 12:27:18 crc kubenswrapper[4763]: I1205 12:27:18.885977 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtkbj\" (UniqueName: \"kubernetes.io/projected/4355ed47-63c1-47e1-81e6-33d33f89b5a7-kube-api-access-mtkbj\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4\" (UID: \"4355ed47-63c1-47e1-81e6-33d33f89b5a7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4" Dec 05 12:27:18 crc kubenswrapper[4763]: I1205 12:27:18.994399 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4" Dec 05 12:27:19 crc kubenswrapper[4763]: I1205 12:27:19.502924 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4"] Dec 05 12:27:20 crc kubenswrapper[4763]: I1205 12:27:20.452610 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4" event={"ID":"4355ed47-63c1-47e1-81e6-33d33f89b5a7","Type":"ContainerStarted","Data":"7ccced8bc8e70040de080686ebba67d4ac03be853d15b3a922320d508bacbac5"} Dec 05 12:27:20 crc kubenswrapper[4763]: I1205 12:27:20.454008 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4" event={"ID":"4355ed47-63c1-47e1-81e6-33d33f89b5a7","Type":"ContainerStarted","Data":"f435abee47e857ff28b8abb19f639330416d64026f6f179521da2945faf6c246"} Dec 05 12:27:20 crc kubenswrapper[4763]: I1205 12:27:20.488873 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4" podStartSLOduration=1.964741245 podStartE2EDuration="2.488849323s" podCreationTimestamp="2025-12-05 12:27:18 +0000 UTC" firstStartedPulling="2025-12-05 12:27:19.512784801 +0000 UTC m=+2324.005499524" lastFinishedPulling="2025-12-05 12:27:20.036892839 +0000 UTC m=+2324.529607602" observedRunningTime="2025-12-05 12:27:20.479098657 +0000 UTC m=+2324.971813460" watchObservedRunningTime="2025-12-05 12:27:20.488849323 +0000 UTC m=+2324.981564046" Dec 05 12:27:21 crc kubenswrapper[4763]: I1205 12:27:21.785224 4763 scope.go:117] "RemoveContainer" containerID="ce2d3cf073a51bbeacc4e57eda2a379937fbbfe390bdbb19538e8b18beb183ad" Dec 05 12:27:21 crc kubenswrapper[4763]: E1205 12:27:21.785512 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:27:32 crc kubenswrapper[4763]: I1205 12:27:32.785458 4763 scope.go:117] "RemoveContainer" containerID="ce2d3cf073a51bbeacc4e57eda2a379937fbbfe390bdbb19538e8b18beb183ad" Dec 05 12:27:32 crc kubenswrapper[4763]: E1205 12:27:32.786919 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:27:44 crc kubenswrapper[4763]: I1205 12:27:44.784329 4763 scope.go:117] "RemoveContainer" containerID="ce2d3cf073a51bbeacc4e57eda2a379937fbbfe390bdbb19538e8b18beb183ad" Dec 05 12:27:44 crc kubenswrapper[4763]: E1205 12:27:44.785339 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:27:56 crc kubenswrapper[4763]: I1205 12:27:56.784176 4763 scope.go:117] "RemoveContainer" containerID="ce2d3cf073a51bbeacc4e57eda2a379937fbbfe390bdbb19538e8b18beb183ad" Dec 05 12:27:56 crc kubenswrapper[4763]: E1205 12:27:56.785447 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:28:09 crc kubenswrapper[4763]: I1205 12:28:09.783897 4763 scope.go:117] "RemoveContainer" containerID="ce2d3cf073a51bbeacc4e57eda2a379937fbbfe390bdbb19538e8b18beb183ad" Dec 05 12:28:09 crc kubenswrapper[4763]: E1205 12:28:09.784857 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:28:23 crc kubenswrapper[4763]: I1205 12:28:23.784027 4763 scope.go:117] "RemoveContainer" containerID="ce2d3cf073a51bbeacc4e57eda2a379937fbbfe390bdbb19538e8b18beb183ad" Dec 05 12:28:23 crc kubenswrapper[4763]: E1205 12:28:23.785440 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:28:37 crc kubenswrapper[4763]: I1205 12:28:37.784045 4763 scope.go:117] "RemoveContainer" containerID="ce2d3cf073a51bbeacc4e57eda2a379937fbbfe390bdbb19538e8b18beb183ad" Dec 05 12:28:37 crc kubenswrapper[4763]: E1205 12:28:37.786894 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:28:50 crc kubenswrapper[4763]: I1205 12:28:50.784500 4763 
scope.go:117] "RemoveContainer" containerID="ce2d3cf073a51bbeacc4e57eda2a379937fbbfe390bdbb19538e8b18beb183ad" Dec 05 12:28:50 crc kubenswrapper[4763]: E1205 12:28:50.785244 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:29:04 crc kubenswrapper[4763]: I1205 12:29:04.784210 4763 scope.go:117] "RemoveContainer" containerID="ce2d3cf073a51bbeacc4e57eda2a379937fbbfe390bdbb19538e8b18beb183ad" Dec 05 12:29:04 crc kubenswrapper[4763]: E1205 12:29:04.784960 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:29:19 crc kubenswrapper[4763]: I1205 12:29:19.785326 4763 scope.go:117] "RemoveContainer" containerID="ce2d3cf073a51bbeacc4e57eda2a379937fbbfe390bdbb19538e8b18beb183ad" Dec 05 12:29:19 crc kubenswrapper[4763]: E1205 12:29:19.786645 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:29:33 crc kubenswrapper[4763]: I1205 12:29:33.784308 4763 scope.go:117] "RemoveContainer" containerID="ce2d3cf073a51bbeacc4e57eda2a379937fbbfe390bdbb19538e8b18beb183ad" Dec 05 12:29:33 crc kubenswrapper[4763]: E1205 12:29:33.785718 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:29:44 crc kubenswrapper[4763]: I1205 12:29:44.784233 4763 scope.go:117] "RemoveContainer" containerID="ce2d3cf073a51bbeacc4e57eda2a379937fbbfe390bdbb19538e8b18beb183ad" Dec 05 12:29:44 crc kubenswrapper[4763]: E1205 12:29:44.785239 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:29:55 crc kubenswrapper[4763]: I1205 12:29:55.796908 4763 scope.go:117] "RemoveContainer" containerID="ce2d3cf073a51bbeacc4e57eda2a379937fbbfe390bdbb19538e8b18beb183ad" Dec 05 12:29:55 crc kubenswrapper[4763]: E1205 12:29:55.797788 4763 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:30:00 crc kubenswrapper[4763]: I1205 12:30:00.161698 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415630-6mnpc"] Dec 05 12:30:00 crc kubenswrapper[4763]: I1205 12:30:00.163815 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415630-6mnpc" Dec 05 12:30:00 crc kubenswrapper[4763]: I1205 12:30:00.166115 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 12:30:00 crc kubenswrapper[4763]: I1205 12:30:00.166581 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 12:30:00 crc kubenswrapper[4763]: I1205 12:30:00.175809 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415630-6mnpc"] Dec 05 12:30:00 crc kubenswrapper[4763]: I1205 12:30:00.198573 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5xnv\" (UniqueName: \"kubernetes.io/projected/15cd560d-1934-43d9-b3dd-5a1f16b0d880-kube-api-access-d5xnv\") pod \"collect-profiles-29415630-6mnpc\" (UID: \"15cd560d-1934-43d9-b3dd-5a1f16b0d880\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415630-6mnpc" Dec 05 12:30:00 crc kubenswrapper[4763]: I1205 12:30:00.198660 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/15cd560d-1934-43d9-b3dd-5a1f16b0d880-secret-volume\") pod \"collect-profiles-29415630-6mnpc\" (UID: \"15cd560d-1934-43d9-b3dd-5a1f16b0d880\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415630-6mnpc" Dec 05 12:30:00 crc kubenswrapper[4763]: I1205 12:30:00.198720 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/15cd560d-1934-43d9-b3dd-5a1f16b0d880-config-volume\") pod \"collect-profiles-29415630-6mnpc\" (UID: \"15cd560d-1934-43d9-b3dd-5a1f16b0d880\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415630-6mnpc" Dec 05 12:30:00 crc kubenswrapper[4763]: I1205 12:30:00.299791 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5xnv\" (UniqueName: \"kubernetes.io/projected/15cd560d-1934-43d9-b3dd-5a1f16b0d880-kube-api-access-d5xnv\") pod \"collect-profiles-29415630-6mnpc\" (UID: \"15cd560d-1934-43d9-b3dd-5a1f16b0d880\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415630-6mnpc" Dec 05 12:30:00 crc kubenswrapper[4763]: I1205 12:30:00.299853 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/15cd560d-1934-43d9-b3dd-5a1f16b0d880-secret-volume\") pod \"collect-profiles-29415630-6mnpc\" (UID: \"15cd560d-1934-43d9-b3dd-5a1f16b0d880\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29415630-6mnpc" Dec 05 12:30:00 crc kubenswrapper[4763]: I1205 12:30:00.299901 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/15cd560d-1934-43d9-b3dd-5a1f16b0d880-config-volume\") pod \"collect-profiles-29415630-6mnpc\" (UID: \"15cd560d-1934-43d9-b3dd-5a1f16b0d880\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415630-6mnpc" Dec 05 12:30:00 crc kubenswrapper[4763]: I1205 12:30:00.300821 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/15cd560d-1934-43d9-b3dd-5a1f16b0d880-config-volume\") pod \"collect-profiles-29415630-6mnpc\" (UID: \"15cd560d-1934-43d9-b3dd-5a1f16b0d880\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415630-6mnpc" Dec 05 12:30:00 crc kubenswrapper[4763]: I1205 12:30:00.306495 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/15cd560d-1934-43d9-b3dd-5a1f16b0d880-secret-volume\") pod \"collect-profiles-29415630-6mnpc\" (UID: \"15cd560d-1934-43d9-b3dd-5a1f16b0d880\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415630-6mnpc" Dec 05 12:30:00 crc kubenswrapper[4763]: I1205 12:30:00.323411 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5xnv\" (UniqueName: \"kubernetes.io/projected/15cd560d-1934-43d9-b3dd-5a1f16b0d880-kube-api-access-d5xnv\") pod \"collect-profiles-29415630-6mnpc\" (UID: \"15cd560d-1934-43d9-b3dd-5a1f16b0d880\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415630-6mnpc" Dec 05 12:30:00 crc kubenswrapper[4763]: I1205 12:30:00.505412 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415630-6mnpc" Dec 05 12:30:00 crc kubenswrapper[4763]: I1205 12:30:00.960546 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415630-6mnpc"] Dec 05 12:30:01 crc kubenswrapper[4763]: I1205 12:30:01.343319 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415630-6mnpc" event={"ID":"15cd560d-1934-43d9-b3dd-5a1f16b0d880","Type":"ContainerStarted","Data":"a1b3aa28bba6b3c5a8a9ad9e11811b7855d8414725dd2a2653fd824cc534e2d4"} Dec 05 12:30:01 crc kubenswrapper[4763]: I1205 12:30:01.343708 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415630-6mnpc" event={"ID":"15cd560d-1934-43d9-b3dd-5a1f16b0d880","Type":"ContainerStarted","Data":"9a4d85d8557028c95ffdc72697c05c326e3909d45537ba78e28f306ce519e724"} Dec 05 12:30:01 crc kubenswrapper[4763]: I1205 12:30:01.366304 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29415630-6mnpc" podStartSLOduration=1.366274631 podStartE2EDuration="1.366274631s" podCreationTimestamp="2025-12-05 12:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:30:01.361193552 +0000 UTC m=+2485.853908275" watchObservedRunningTime="2025-12-05 12:30:01.366274631 +0000 UTC m=+2485.858989354" Dec 05 12:30:02 crc kubenswrapper[4763]: I1205 12:30:02.352925 4763 generic.go:334] "Generic (PLEG): container finished" podID="15cd560d-1934-43d9-b3dd-5a1f16b0d880" containerID="a1b3aa28bba6b3c5a8a9ad9e11811b7855d8414725dd2a2653fd824cc534e2d4" exitCode=0 Dec 05 12:30:02 crc kubenswrapper[4763]: I1205 12:30:02.353002 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415630-6mnpc" event={"ID":"15cd560d-1934-43d9-b3dd-5a1f16b0d880","Type":"ContainerDied","Data":"a1b3aa28bba6b3c5a8a9ad9e11811b7855d8414725dd2a2653fd824cc534e2d4"} Dec 05 12:30:03 crc kubenswrapper[4763]: I1205 12:30:03.739182 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415630-6mnpc" Dec 05 12:30:03 crc kubenswrapper[4763]: I1205 12:30:03.868874 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5xnv\" (UniqueName: \"kubernetes.io/projected/15cd560d-1934-43d9-b3dd-5a1f16b0d880-kube-api-access-d5xnv\") pod \"15cd560d-1934-43d9-b3dd-5a1f16b0d880\" (UID: \"15cd560d-1934-43d9-b3dd-5a1f16b0d880\") " Dec 05 12:30:03 crc kubenswrapper[4763]: I1205 12:30:03.868995 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/15cd560d-1934-43d9-b3dd-5a1f16b0d880-config-volume\") pod \"15cd560d-1934-43d9-b3dd-5a1f16b0d880\" (UID: \"15cd560d-1934-43d9-b3dd-5a1f16b0d880\") " Dec 05 12:30:03 crc kubenswrapper[4763]: I1205 12:30:03.869024 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/15cd560d-1934-43d9-b3dd-5a1f16b0d880-secret-volume\") pod \"15cd560d-1934-43d9-b3dd-5a1f16b0d880\" (UID: \"15cd560d-1934-43d9-b3dd-5a1f16b0d880\") " Dec 05 12:30:03 crc kubenswrapper[4763]: I1205 12:30:03.869990 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15cd560d-1934-43d9-b3dd-5a1f16b0d880-config-volume" (OuterVolumeSpecName: "config-volume") pod "15cd560d-1934-43d9-b3dd-5a1f16b0d880" (UID: "15cd560d-1934-43d9-b3dd-5a1f16b0d880"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:30:03 crc kubenswrapper[4763]: I1205 12:30:03.885595 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15cd560d-1934-43d9-b3dd-5a1f16b0d880-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "15cd560d-1934-43d9-b3dd-5a1f16b0d880" (UID: "15cd560d-1934-43d9-b3dd-5a1f16b0d880"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:30:03 crc kubenswrapper[4763]: I1205 12:30:03.885967 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15cd560d-1934-43d9-b3dd-5a1f16b0d880-kube-api-access-d5xnv" (OuterVolumeSpecName: "kube-api-access-d5xnv") pod "15cd560d-1934-43d9-b3dd-5a1f16b0d880" (UID: "15cd560d-1934-43d9-b3dd-5a1f16b0d880"). InnerVolumeSpecName "kube-api-access-d5xnv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:30:03 crc kubenswrapper[4763]: I1205 12:30:03.972501 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5xnv\" (UniqueName: \"kubernetes.io/projected/15cd560d-1934-43d9-b3dd-5a1f16b0d880-kube-api-access-d5xnv\") on node \"crc\" DevicePath \"\"" Dec 05 12:30:03 crc kubenswrapper[4763]: I1205 12:30:03.972741 4763 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/15cd560d-1934-43d9-b3dd-5a1f16b0d880-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 12:30:03 crc kubenswrapper[4763]: I1205 12:30:03.972825 4763 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/15cd560d-1934-43d9-b3dd-5a1f16b0d880-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 12:30:04 crc kubenswrapper[4763]: I1205 12:30:04.373807 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415630-6mnpc" event={"ID":"15cd560d-1934-43d9-b3dd-5a1f16b0d880","Type":"ContainerDied","Data":"9a4d85d8557028c95ffdc72697c05c326e3909d45537ba78e28f306ce519e724"} Dec 05 12:30:04 crc kubenswrapper[4763]: I1205 12:30:04.374247 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a4d85d8557028c95ffdc72697c05c326e3909d45537ba78e28f306ce519e724" Dec 05 12:30:04 crc kubenswrapper[4763]: I1205 12:30:04.373858 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415630-6mnpc" Dec 05 12:30:04 crc kubenswrapper[4763]: I1205 12:30:04.458286 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415585-hp78p"] Dec 05 12:30:04 crc kubenswrapper[4763]: I1205 12:30:04.466789 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415585-hp78p"] Dec 05 12:30:05 crc kubenswrapper[4763]: I1205 12:30:05.800468 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda8546a-e13c-4450-9faa-a0e0fcacbfa1" path="/var/lib/kubelet/pods/fda8546a-e13c-4450-9faa-a0e0fcacbfa1/volumes" Dec 05 12:30:06 crc kubenswrapper[4763]: I1205 12:30:06.784461 4763 scope.go:117] "RemoveContainer" containerID="ce2d3cf073a51bbeacc4e57eda2a379937fbbfe390bdbb19538e8b18beb183ad" Dec 05 12:30:06 crc kubenswrapper[4763]: E1205 12:30:06.785195 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:30:21 crc kubenswrapper[4763]: I1205 12:30:21.785339 4763 scope.go:117] "RemoveContainer" containerID="ce2d3cf073a51bbeacc4e57eda2a379937fbbfe390bdbb19538e8b18beb183ad" Dec 05 12:30:21 crc kubenswrapper[4763]: E1205 12:30:21.786722 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:30:34 crc kubenswrapper[4763]: I1205 12:30:34.784567 4763 scope.go:117] "RemoveContainer" containerID="ce2d3cf073a51bbeacc4e57eda2a379937fbbfe390bdbb19538e8b18beb183ad" Dec 05 12:30:34 crc kubenswrapper[4763]: E1205 12:30:34.785319 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:30:46 crc kubenswrapper[4763]: I1205 12:30:46.245594 4763 scope.go:117] "RemoveContainer" containerID="a118ed0666cfac5865b8cf13e2353fb2f762383d409f06f2199efc6c24bae8bf" Dec 05 12:30:47 crc kubenswrapper[4763]: I1205 12:30:47.785103 4763 scope.go:117] "RemoveContainer" containerID="ce2d3cf073a51bbeacc4e57eda2a379937fbbfe390bdbb19538e8b18beb183ad" Dec 05 12:30:47 crc kubenswrapper[4763]: E1205 12:30:47.786003 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:31:02 crc kubenswrapper[4763]: I1205 12:31:02.786197 4763 scope.go:117] "RemoveContainer" containerID="ce2d3cf073a51bbeacc4e57eda2a379937fbbfe390bdbb19538e8b18beb183ad" Dec 05 12:31:02 crc kubenswrapper[4763]: E1205 12:31:02.787253 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:31:16 crc kubenswrapper[4763]: I1205 12:31:16.784665 4763 scope.go:117] "RemoveContainer" containerID="ce2d3cf073a51bbeacc4e57eda2a379937fbbfe390bdbb19538e8b18beb183ad" Dec 05 12:31:16 crc kubenswrapper[4763]: E1205 12:31:16.785744 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:31:30 crc kubenswrapper[4763]: I1205 12:31:30.785120 4763 scope.go:117] "RemoveContainer" containerID="ce2d3cf073a51bbeacc4e57eda2a379937fbbfe390bdbb19538e8b18beb183ad" Dec 05 12:31:30 crc kubenswrapper[4763]: E1205 12:31:30.786317 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:31:41 crc kubenswrapper[4763]: I1205 12:31:41.784207 4763 scope.go:117] "RemoveContainer" containerID="ce2d3cf073a51bbeacc4e57eda2a379937fbbfe390bdbb19538e8b18beb183ad" Dec 05 12:31:41 crc kubenswrapper[4763]: E1205 12:31:41.785145 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:31:45 crc kubenswrapper[4763]: I1205 12:31:45.392084 4763 generic.go:334] "Generic (PLEG): container finished" podID="4355ed47-63c1-47e1-81e6-33d33f89b5a7" containerID="7ccced8bc8e70040de080686ebba67d4ac03be853d15b3a922320d508bacbac5" exitCode=0 Dec 05 12:31:45 crc kubenswrapper[4763]: I1205 12:31:45.392205 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4" event={"ID":"4355ed47-63c1-47e1-81e6-33d33f89b5a7","Type":"ContainerDied","Data":"7ccced8bc8e70040de080686ebba67d4ac03be853d15b3a922320d508bacbac5"} Dec 05 12:31:46 crc kubenswrapper[4763]: I1205 12:31:46.844516 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4" Dec 05 12:31:46 crc kubenswrapper[4763]: I1205 12:31:46.946886 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4355ed47-63c1-47e1-81e6-33d33f89b5a7-inventory\") pod \"4355ed47-63c1-47e1-81e6-33d33f89b5a7\" (UID: \"4355ed47-63c1-47e1-81e6-33d33f89b5a7\") " Dec 05 12:31:46 crc kubenswrapper[4763]: I1205 12:31:46.947133 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4355ed47-63c1-47e1-81e6-33d33f89b5a7-libvirt-combined-ca-bundle\") pod \"4355ed47-63c1-47e1-81e6-33d33f89b5a7\" (UID: \"4355ed47-63c1-47e1-81e6-33d33f89b5a7\") " Dec 05 12:31:46 crc kubenswrapper[4763]: I1205 12:31:46.947200 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/4355ed47-63c1-47e1-81e6-33d33f89b5a7-libvirt-secret-0\") pod \"4355ed47-63c1-47e1-81e6-33d33f89b5a7\" (UID: \"4355ed47-63c1-47e1-81e6-33d33f89b5a7\") " Dec 05 12:31:46 crc kubenswrapper[4763]: I1205 12:31:46.947272 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtkbj\" (UniqueName: \"kubernetes.io/projected/4355ed47-63c1-47e1-81e6-33d33f89b5a7-kube-api-access-mtkbj\") pod \"4355ed47-63c1-47e1-81e6-33d33f89b5a7\" (UID: \"4355ed47-63c1-47e1-81e6-33d33f89b5a7\") " Dec 05 12:31:46 crc kubenswrapper[4763]: I1205 12:31:46.947305 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4355ed47-63c1-47e1-81e6-33d33f89b5a7-ssh-key\") pod \"4355ed47-63c1-47e1-81e6-33d33f89b5a7\" (UID: \"4355ed47-63c1-47e1-81e6-33d33f89b5a7\") " Dec 05 12:31:46 crc kubenswrapper[4763]: I1205 12:31:46.955192 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/4355ed47-63c1-47e1-81e6-33d33f89b5a7-kube-api-access-mtkbj" (OuterVolumeSpecName: "kube-api-access-mtkbj") pod "4355ed47-63c1-47e1-81e6-33d33f89b5a7" (UID: "4355ed47-63c1-47e1-81e6-33d33f89b5a7"). InnerVolumeSpecName "kube-api-access-mtkbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:31:46 crc kubenswrapper[4763]: I1205 12:31:46.955787 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4355ed47-63c1-47e1-81e6-33d33f89b5a7-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "4355ed47-63c1-47e1-81e6-33d33f89b5a7" (UID: "4355ed47-63c1-47e1-81e6-33d33f89b5a7"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:31:46 crc kubenswrapper[4763]: I1205 12:31:46.979253 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4355ed47-63c1-47e1-81e6-33d33f89b5a7-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "4355ed47-63c1-47e1-81e6-33d33f89b5a7" (UID: "4355ed47-63c1-47e1-81e6-33d33f89b5a7"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:31:46 crc kubenswrapper[4763]: I1205 12:31:46.979688 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4355ed47-63c1-47e1-81e6-33d33f89b5a7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4355ed47-63c1-47e1-81e6-33d33f89b5a7" (UID: "4355ed47-63c1-47e1-81e6-33d33f89b5a7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:31:46 crc kubenswrapper[4763]: I1205 12:31:46.980834 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4355ed47-63c1-47e1-81e6-33d33f89b5a7-inventory" (OuterVolumeSpecName: "inventory") pod "4355ed47-63c1-47e1-81e6-33d33f89b5a7" (UID: "4355ed47-63c1-47e1-81e6-33d33f89b5a7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.049854 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4355ed47-63c1-47e1-81e6-33d33f89b5a7-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.049899 4763 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4355ed47-63c1-47e1-81e6-33d33f89b5a7-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.049914 4763 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/4355ed47-63c1-47e1-81e6-33d33f89b5a7-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.049930 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtkbj\" (UniqueName: \"kubernetes.io/projected/4355ed47-63c1-47e1-81e6-33d33f89b5a7-kube-api-access-mtkbj\") on node \"crc\" DevicePath \"\"" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.049941 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4355ed47-63c1-47e1-81e6-33d33f89b5a7-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.418045 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4" event={"ID":"4355ed47-63c1-47e1-81e6-33d33f89b5a7","Type":"ContainerDied","Data":"f435abee47e857ff28b8abb19f639330416d64026f6f179521da2945faf6c246"} Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.418089 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f435abee47e857ff28b8abb19f639330416d64026f6f179521da2945faf6c246" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.418156 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.510837 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-mrrcd"] Dec 05 12:31:47 crc kubenswrapper[4763]: E1205 12:31:47.513957 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4355ed47-63c1-47e1-81e6-33d33f89b5a7" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.513989 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4355ed47-63c1-47e1-81e6-33d33f89b5a7" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 05 12:31:47 crc kubenswrapper[4763]: E1205 12:31:47.514028 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15cd560d-1934-43d9-b3dd-5a1f16b0d880" containerName="collect-profiles" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.514035 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="15cd560d-1934-43d9-b3dd-5a1f16b0d880" containerName="collect-profiles" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.514318 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="4355ed47-63c1-47e1-81e6-33d33f89b5a7" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.514340 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="15cd560d-1934-43d9-b3dd-5a1f16b0d880" containerName="collect-profiles" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.515103 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mrrcd" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.517628 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.517890 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.518024 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.518154 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.518228 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.518339 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-s7px5" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.518396 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.529272 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-mrrcd"] Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.559718 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mrrcd\" (UID: 
\"e3335529-4636-46d2-b949-1d02a4c43ee0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mrrcd" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.559811 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mrrcd\" (UID: \"e3335529-4636-46d2-b949-1d02a4c43ee0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mrrcd" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.559886 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt799\" (UniqueName: \"kubernetes.io/projected/e3335529-4636-46d2-b949-1d02a4c43ee0-kube-api-access-mt799\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mrrcd\" (UID: \"e3335529-4636-46d2-b949-1d02a4c43ee0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mrrcd" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.560007 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mrrcd\" (UID: \"e3335529-4636-46d2-b949-1d02a4c43ee0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mrrcd" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.560043 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mrrcd\" (UID: \"e3335529-4636-46d2-b949-1d02a4c43ee0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mrrcd" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.560119 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mrrcd\" (UID: \"e3335529-4636-46d2-b949-1d02a4c43ee0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mrrcd" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.560157 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mrrcd\" (UID: \"e3335529-4636-46d2-b949-1d02a4c43ee0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mrrcd" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.560243 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e3335529-4636-46d2-b949-1d02a4c43ee0-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mrrcd\" (UID: \"e3335529-4636-46d2-b949-1d02a4c43ee0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mrrcd" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.560283 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mrrcd\" (UID: \"e3335529-4636-46d2-b949-1d02a4c43ee0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mrrcd" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.661691 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e3335529-4636-46d2-b949-1d02a4c43ee0-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mrrcd\" (UID: \"e3335529-4636-46d2-b949-1d02a4c43ee0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mrrcd" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.661743 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mrrcd\" (UID: \"e3335529-4636-46d2-b949-1d02a4c43ee0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mrrcd" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.661818 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mrrcd\" (UID: \"e3335529-4636-46d2-b949-1d02a4c43ee0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mrrcd" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.661853 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mrrcd\" (UID: \"e3335529-4636-46d2-b949-1d02a4c43ee0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mrrcd" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.661915 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt799\" (UniqueName: \"kubernetes.io/projected/e3335529-4636-46d2-b949-1d02a4c43ee0-kube-api-access-mt799\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mrrcd\" (UID: \"e3335529-4636-46d2-b949-1d02a4c43ee0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mrrcd" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.661969 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mrrcd\" (UID: \"e3335529-4636-46d2-b949-1d02a4c43ee0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mrrcd" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.661998 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mrrcd\" (UID: \"e3335529-4636-46d2-b949-1d02a4c43ee0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mrrcd" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.662040 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-nova-cell1-compute-config-1\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-mrrcd\" (UID: \"e3335529-4636-46d2-b949-1d02a4c43ee0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mrrcd" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.662064 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mrrcd\" (UID: \"e3335529-4636-46d2-b949-1d02a4c43ee0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mrrcd" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.662935 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e3335529-4636-46d2-b949-1d02a4c43ee0-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mrrcd\" (UID: \"e3335529-4636-46d2-b949-1d02a4c43ee0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mrrcd" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.667845 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mrrcd\" (UID: \"e3335529-4636-46d2-b949-1d02a4c43ee0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mrrcd" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.668019 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mrrcd\" (UID: \"e3335529-4636-46d2-b949-1d02a4c43ee0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mrrcd" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.668377 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mrrcd\" (UID: \"e3335529-4636-46d2-b949-1d02a4c43ee0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mrrcd" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.668516 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mrrcd\" (UID: \"e3335529-4636-46d2-b949-1d02a4c43ee0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mrrcd" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.669112 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mrrcd\" (UID: \"e3335529-4636-46d2-b949-1d02a4c43ee0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mrrcd" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.669427 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mrrcd\" (UID: \"e3335529-4636-46d2-b949-1d02a4c43ee0\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mrrcd" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.670220 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mrrcd\" (UID: \"e3335529-4636-46d2-b949-1d02a4c43ee0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mrrcd" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.679178 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt799\" (UniqueName: \"kubernetes.io/projected/e3335529-4636-46d2-b949-1d02a4c43ee0-kube-api-access-mt799\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mrrcd\" (UID: \"e3335529-4636-46d2-b949-1d02a4c43ee0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mrrcd" Dec 05 12:31:47 crc kubenswrapper[4763]: I1205 12:31:47.832941 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mrrcd" Dec 05 12:31:48 crc kubenswrapper[4763]: I1205 12:31:48.426306 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-mrrcd"] Dec 05 12:31:48 crc kubenswrapper[4763]: I1205 12:31:48.439352 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 12:31:49 crc kubenswrapper[4763]: I1205 12:31:49.452503 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mrrcd" event={"ID":"e3335529-4636-46d2-b949-1d02a4c43ee0","Type":"ContainerStarted","Data":"d99cf424f2d8977c01a1866c64d7098fdf8748b1f8d722add3af2eb525484be9"} Dec 05 12:31:49 crc kubenswrapper[4763]: I1205 12:31:49.453011 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mrrcd" event={"ID":"e3335529-4636-46d2-b949-1d02a4c43ee0","Type":"ContainerStarted","Data":"151d526404e168dbb025cdfd3d7c1a344c6bc42eff2723ffbb4c7d6c572be47c"} Dec 05 12:31:49 crc kubenswrapper[4763]: I1205 12:31:49.487658 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mrrcd" podStartSLOduration=2.024678454 podStartE2EDuration="2.487635898s" podCreationTimestamp="2025-12-05 12:31:47 +0000 UTC" firstStartedPulling="2025-12-05 12:31:48.439045504 +0000 UTC m=+2592.931760247" lastFinishedPulling="2025-12-05 12:31:48.902002968 +0000 UTC m=+2593.394717691" observedRunningTime="2025-12-05 12:31:49.476286159 +0000 UTC m=+2593.969000902" watchObservedRunningTime="2025-12-05 12:31:49.487635898 +0000 UTC m=+2593.980350631" Dec 05 12:31:56 crc kubenswrapper[4763]: I1205 12:31:56.783967 4763 scope.go:117] "RemoveContainer" containerID="ce2d3cf073a51bbeacc4e57eda2a379937fbbfe390bdbb19538e8b18beb183ad" Dec 05 12:31:56 crc kubenswrapper[4763]: E1205 12:31:56.785284 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:32:07 crc kubenswrapper[4763]: I1205 12:32:07.784298 4763 scope.go:117] 
"RemoveContainer" containerID="ce2d3cf073a51bbeacc4e57eda2a379937fbbfe390bdbb19538e8b18beb183ad" Dec 05 12:32:08 crc kubenswrapper[4763]: I1205 12:32:08.641829 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerStarted","Data":"9235b49df516b88937632a754bf0fe47bb84aad68ded096bb5e12cefccd93a73"} Dec 05 12:32:16 crc kubenswrapper[4763]: I1205 12:32:16.517396 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m6nwd"] Dec 05 12:32:16 crc kubenswrapper[4763]: I1205 12:32:16.532159 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m6nwd"] Dec 05 12:32:16 crc kubenswrapper[4763]: I1205 12:32:16.532262 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m6nwd" Dec 05 12:32:16 crc kubenswrapper[4763]: I1205 12:32:16.608560 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6821cc3a-be0a-4973-98ae-3a8df44e6c26-catalog-content\") pod \"redhat-operators-m6nwd\" (UID: \"6821cc3a-be0a-4973-98ae-3a8df44e6c26\") " pod="openshift-marketplace/redhat-operators-m6nwd" Dec 05 12:32:16 crc kubenswrapper[4763]: I1205 12:32:16.608673 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6821cc3a-be0a-4973-98ae-3a8df44e6c26-utilities\") pod \"redhat-operators-m6nwd\" (UID: \"6821cc3a-be0a-4973-98ae-3a8df44e6c26\") " pod="openshift-marketplace/redhat-operators-m6nwd" Dec 05 12:32:16 crc kubenswrapper[4763]: I1205 12:32:16.608793 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8wm9\" (UniqueName: \"kubernetes.io/projected/6821cc3a-be0a-4973-98ae-3a8df44e6c26-kube-api-access-g8wm9\") pod \"redhat-operators-m6nwd\" (UID: \"6821cc3a-be0a-4973-98ae-3a8df44e6c26\") " pod="openshift-marketplace/redhat-operators-m6nwd" Dec 05 12:32:16 crc kubenswrapper[4763]: I1205 12:32:16.710590 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8wm9\" (UniqueName: \"kubernetes.io/projected/6821cc3a-be0a-4973-98ae-3a8df44e6c26-kube-api-access-g8wm9\") pod \"redhat-operators-m6nwd\" (UID: \"6821cc3a-be0a-4973-98ae-3a8df44e6c26\") " pod="openshift-marketplace/redhat-operators-m6nwd" Dec 05 12:32:16 crc kubenswrapper[4763]: I1205 12:32:16.711312 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6821cc3a-be0a-4973-98ae-3a8df44e6c26-catalog-content\") pod \"redhat-operators-m6nwd\" (UID: \"6821cc3a-be0a-4973-98ae-3a8df44e6c26\") " pod="openshift-marketplace/redhat-operators-m6nwd" Dec 05 12:32:16 crc kubenswrapper[4763]: I1205 12:32:16.711356 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6821cc3a-be0a-4973-98ae-3a8df44e6c26-utilities\") pod \"redhat-operators-m6nwd\" (UID: \"6821cc3a-be0a-4973-98ae-3a8df44e6c26\") " pod="openshift-marketplace/redhat-operators-m6nwd" Dec 05 12:32:16 crc kubenswrapper[4763]: I1205 12:32:16.711951 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/6821cc3a-be0a-4973-98ae-3a8df44e6c26-catalog-content\") pod \"redhat-operators-m6nwd\" (UID: \"6821cc3a-be0a-4973-98ae-3a8df44e6c26\") " pod="openshift-marketplace/redhat-operators-m6nwd" Dec 05 12:32:16 crc kubenswrapper[4763]: I1205 12:32:16.712102 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6821cc3a-be0a-4973-98ae-3a8df44e6c26-utilities\") pod \"redhat-operators-m6nwd\" (UID: \"6821cc3a-be0a-4973-98ae-3a8df44e6c26\") " pod="openshift-marketplace/redhat-operators-m6nwd" Dec 05 12:32:16 crc kubenswrapper[4763]: I1205 12:32:16.732050 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8wm9\" (UniqueName: \"kubernetes.io/projected/6821cc3a-be0a-4973-98ae-3a8df44e6c26-kube-api-access-g8wm9\") pod \"redhat-operators-m6nwd\" (UID: \"6821cc3a-be0a-4973-98ae-3a8df44e6c26\") " pod="openshift-marketplace/redhat-operators-m6nwd" Dec 05 12:32:16 crc kubenswrapper[4763]: I1205 12:32:16.866336 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m6nwd" Dec 05 12:32:17 crc kubenswrapper[4763]: I1205 12:32:17.364084 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m6nwd"] Dec 05 12:32:17 crc kubenswrapper[4763]: I1205 12:32:17.727294 4763 generic.go:334] "Generic (PLEG): container finished" podID="6821cc3a-be0a-4973-98ae-3a8df44e6c26" containerID="89b7ae2bd26da60da9788861e1faeebe9bce92091ebc8547a6d3bd80d04b3354" exitCode=0 Dec 05 12:32:17 crc kubenswrapper[4763]: I1205 12:32:17.727355 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m6nwd" event={"ID":"6821cc3a-be0a-4973-98ae-3a8df44e6c26","Type":"ContainerDied","Data":"89b7ae2bd26da60da9788861e1faeebe9bce92091ebc8547a6d3bd80d04b3354"} Dec 05 12:32:17 crc kubenswrapper[4763]: I1205 12:32:17.727585 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m6nwd" event={"ID":"6821cc3a-be0a-4973-98ae-3a8df44e6c26","Type":"ContainerStarted","Data":"b23325b952a595bf1a6ee86b16ecf6d0e2ffb0e150a5e8d87f449914cd47390d"} Dec 05 12:32:19 crc kubenswrapper[4763]: I1205 12:32:19.756945 4763 generic.go:334] "Generic (PLEG): container finished" podID="6821cc3a-be0a-4973-98ae-3a8df44e6c26" containerID="deea709002b0e9b6cf5e7f7459565bed130b1c7caf6a89cebad7ebeb4b27120f" exitCode=0 Dec 05 12:32:19 crc kubenswrapper[4763]: I1205 12:32:19.757074 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m6nwd" event={"ID":"6821cc3a-be0a-4973-98ae-3a8df44e6c26","Type":"ContainerDied","Data":"deea709002b0e9b6cf5e7f7459565bed130b1c7caf6a89cebad7ebeb4b27120f"} Dec 05 12:32:20 crc kubenswrapper[4763]: I1205 12:32:20.768402 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m6nwd" event={"ID":"6821cc3a-be0a-4973-98ae-3a8df44e6c26","Type":"ContainerStarted","Data":"5f41dbef07318086fbbc767b8d4c865160a2e7b64a09a1cea2caded14c0c3854"} Dec 05 12:32:20 crc kubenswrapper[4763]: I1205 12:32:20.790831 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m6nwd" podStartSLOduration=2.382587829 podStartE2EDuration="4.790807024s" podCreationTimestamp="2025-12-05 12:32:16 +0000 UTC" firstStartedPulling="2025-12-05 12:32:17.730899344 +0000 UTC m=+2622.223614077" lastFinishedPulling="2025-12-05 
12:32:20.139118549 +0000 UTC m=+2624.631833272" observedRunningTime="2025-12-05 12:32:20.789114508 +0000 UTC m=+2625.281829231" watchObservedRunningTime="2025-12-05 12:32:20.790807024 +0000 UTC m=+2625.283521757" Dec 05 12:32:26 crc kubenswrapper[4763]: I1205 12:32:26.866902 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m6nwd" Dec 05 12:32:26 crc kubenswrapper[4763]: I1205 12:32:26.867163 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m6nwd" Dec 05 12:32:26 crc kubenswrapper[4763]: I1205 12:32:26.953490 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m6nwd" Dec 05 12:32:27 crc kubenswrapper[4763]: I1205 12:32:27.913063 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m6nwd" Dec 05 12:32:27 crc kubenswrapper[4763]: I1205 12:32:27.967059 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m6nwd"] Dec 05 12:32:29 crc kubenswrapper[4763]: I1205 12:32:29.881706 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-m6nwd" podUID="6821cc3a-be0a-4973-98ae-3a8df44e6c26" containerName="registry-server" containerID="cri-o://5f41dbef07318086fbbc767b8d4c865160a2e7b64a09a1cea2caded14c0c3854" gracePeriod=2 Dec 05 12:32:31 crc kubenswrapper[4763]: I1205 12:32:31.904539 4763 generic.go:334] "Generic (PLEG): container finished" podID="6821cc3a-be0a-4973-98ae-3a8df44e6c26" containerID="5f41dbef07318086fbbc767b8d4c865160a2e7b64a09a1cea2caded14c0c3854" exitCode=0 Dec 05 12:32:31 crc kubenswrapper[4763]: I1205 12:32:31.904644 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m6nwd" event={"ID":"6821cc3a-be0a-4973-98ae-3a8df44e6c26","Type":"ContainerDied","Data":"5f41dbef07318086fbbc767b8d4c865160a2e7b64a09a1cea2caded14c0c3854"} Dec 05 12:32:32 crc kubenswrapper[4763]: I1205 12:32:32.314173 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m6nwd" Dec 05 12:32:32 crc kubenswrapper[4763]: I1205 12:32:32.448219 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6821cc3a-be0a-4973-98ae-3a8df44e6c26-utilities\") pod \"6821cc3a-be0a-4973-98ae-3a8df44e6c26\" (UID: \"6821cc3a-be0a-4973-98ae-3a8df44e6c26\") " Dec 05 12:32:32 crc kubenswrapper[4763]: I1205 12:32:32.448301 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8wm9\" (UniqueName: \"kubernetes.io/projected/6821cc3a-be0a-4973-98ae-3a8df44e6c26-kube-api-access-g8wm9\") pod \"6821cc3a-be0a-4973-98ae-3a8df44e6c26\" (UID: \"6821cc3a-be0a-4973-98ae-3a8df44e6c26\") " Dec 05 12:32:32 crc kubenswrapper[4763]: I1205 12:32:32.448407 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6821cc3a-be0a-4973-98ae-3a8df44e6c26-catalog-content\") pod \"6821cc3a-be0a-4973-98ae-3a8df44e6c26\" (UID: \"6821cc3a-be0a-4973-98ae-3a8df44e6c26\") " Dec 05 12:32:32 crc kubenswrapper[4763]: I1205 12:32:32.450648 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6821cc3a-be0a-4973-98ae-3a8df44e6c26-utilities" (OuterVolumeSpecName: "utilities") pod "6821cc3a-be0a-4973-98ae-3a8df44e6c26" (UID: "6821cc3a-be0a-4973-98ae-3a8df44e6c26"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:32:32 crc kubenswrapper[4763]: I1205 12:32:32.455836 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6821cc3a-be0a-4973-98ae-3a8df44e6c26-kube-api-access-g8wm9" (OuterVolumeSpecName: "kube-api-access-g8wm9") pod "6821cc3a-be0a-4973-98ae-3a8df44e6c26" (UID: "6821cc3a-be0a-4973-98ae-3a8df44e6c26"). InnerVolumeSpecName "kube-api-access-g8wm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:32:32 crc kubenswrapper[4763]: I1205 12:32:32.551279 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6821cc3a-be0a-4973-98ae-3a8df44e6c26-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 12:32:32 crc kubenswrapper[4763]: I1205 12:32:32.551327 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8wm9\" (UniqueName: \"kubernetes.io/projected/6821cc3a-be0a-4973-98ae-3a8df44e6c26-kube-api-access-g8wm9\") on node \"crc\" DevicePath \"\"" Dec 05 12:32:32 crc kubenswrapper[4763]: I1205 12:32:32.569311 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6821cc3a-be0a-4973-98ae-3a8df44e6c26-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6821cc3a-be0a-4973-98ae-3a8df44e6c26" (UID: "6821cc3a-be0a-4973-98ae-3a8df44e6c26"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:32:32 crc kubenswrapper[4763]: I1205 12:32:32.654542 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6821cc3a-be0a-4973-98ae-3a8df44e6c26-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 12:32:32 crc kubenswrapper[4763]: I1205 12:32:32.918230 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m6nwd" event={"ID":"6821cc3a-be0a-4973-98ae-3a8df44e6c26","Type":"ContainerDied","Data":"b23325b952a595bf1a6ee86b16ecf6d0e2ffb0e150a5e8d87f449914cd47390d"} Dec 05 12:32:32 crc kubenswrapper[4763]: I1205 12:32:32.918323 4763 scope.go:117] "RemoveContainer" containerID="5f41dbef07318086fbbc767b8d4c865160a2e7b64a09a1cea2caded14c0c3854" Dec 05 12:32:32 crc kubenswrapper[4763]: I1205 12:32:32.920208 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m6nwd" Dec 05 12:32:32 crc kubenswrapper[4763]: I1205 12:32:32.946479 4763 scope.go:117] "RemoveContainer" containerID="deea709002b0e9b6cf5e7f7459565bed130b1c7caf6a89cebad7ebeb4b27120f" Dec 05 12:32:32 crc kubenswrapper[4763]: I1205 12:32:32.962571 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m6nwd"] Dec 05 12:32:32 crc kubenswrapper[4763]: I1205 12:32:32.970181 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-m6nwd"] Dec 05 12:32:32 crc kubenswrapper[4763]: I1205 12:32:32.979893 4763 scope.go:117] "RemoveContainer" containerID="89b7ae2bd26da60da9788861e1faeebe9bce92091ebc8547a6d3bd80d04b3354" Dec 05 12:32:33 crc kubenswrapper[4763]: I1205 12:32:33.800029 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6821cc3a-be0a-4973-98ae-3a8df44e6c26" path="/var/lib/kubelet/pods/6821cc3a-be0a-4973-98ae-3a8df44e6c26/volumes" Dec 05 12:34:17 crc kubenswrapper[4763]: I1205 12:34:17.957377 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/neutron-b55c974d9-brgnw" podUID="6d6c980e-688d-41b3-a7ad-0061b07b9494" containerName="neutron-api" probeResult="failure" output="HTTP probe failed with statuscode: 502" Dec 05 12:34:37 crc kubenswrapper[4763]: I1205 12:34:37.544332 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 12:34:37 crc kubenswrapper[4763]: I1205 12:34:37.545021 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 12:34:46 crc kubenswrapper[4763]: I1205 12:34:46.349570 4763 generic.go:334] "Generic (PLEG): container finished" podID="e3335529-4636-46d2-b949-1d02a4c43ee0" containerID="d99cf424f2d8977c01a1866c64d7098fdf8748b1f8d722add3af2eb525484be9" exitCode=0 Dec 05 12:34:46 crc kubenswrapper[4763]: I1205 12:34:46.349645 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mrrcd" 
event={"ID":"e3335529-4636-46d2-b949-1d02a4c43ee0","Type":"ContainerDied","Data":"d99cf424f2d8977c01a1866c64d7098fdf8748b1f8d722add3af2eb525484be9"} Dec 05 12:34:47 crc kubenswrapper[4763]: I1205 12:34:47.866734 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mrrcd" Dec 05 12:34:47 crc kubenswrapper[4763]: I1205 12:34:47.959244 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-nova-combined-ca-bundle\") pod \"e3335529-4636-46d2-b949-1d02a4c43ee0\" (UID: \"e3335529-4636-46d2-b949-1d02a4c43ee0\") " Dec 05 12:34:47 crc kubenswrapper[4763]: I1205 12:34:47.959807 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-nova-cell1-compute-config-0\") pod \"e3335529-4636-46d2-b949-1d02a4c43ee0\" (UID: \"e3335529-4636-46d2-b949-1d02a4c43ee0\") " Dec 05 12:34:47 crc kubenswrapper[4763]: I1205 12:34:47.959921 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-nova-cell1-compute-config-1\") pod \"e3335529-4636-46d2-b949-1d02a4c43ee0\" (UID: \"e3335529-4636-46d2-b949-1d02a4c43ee0\") " Dec 05 12:34:47 crc kubenswrapper[4763]: I1205 12:34:47.960003 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-nova-migration-ssh-key-0\") pod \"e3335529-4636-46d2-b949-1d02a4c43ee0\" (UID: \"e3335529-4636-46d2-b949-1d02a4c43ee0\") " Dec 05 12:34:47 crc kubenswrapper[4763]: I1205 12:34:47.960080 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mt799\" (UniqueName: \"kubernetes.io/projected/e3335529-4636-46d2-b949-1d02a4c43ee0-kube-api-access-mt799\") pod \"e3335529-4636-46d2-b949-1d02a4c43ee0\" (UID: \"e3335529-4636-46d2-b949-1d02a4c43ee0\") " Dec 05 12:34:47 crc kubenswrapper[4763]: I1205 12:34:47.960328 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-inventory\") pod \"e3335529-4636-46d2-b949-1d02a4c43ee0\" (UID: \"e3335529-4636-46d2-b949-1d02a4c43ee0\") " Dec 05 12:34:47 crc kubenswrapper[4763]: I1205 12:34:47.960381 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e3335529-4636-46d2-b949-1d02a4c43ee0-nova-extra-config-0\") pod \"e3335529-4636-46d2-b949-1d02a4c43ee0\" (UID: \"e3335529-4636-46d2-b949-1d02a4c43ee0\") " Dec 05 12:34:47 crc kubenswrapper[4763]: I1205 12:34:47.960439 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-nova-migration-ssh-key-1\") pod \"e3335529-4636-46d2-b949-1d02a4c43ee0\" (UID: \"e3335529-4636-46d2-b949-1d02a4c43ee0\") " Dec 05 12:34:47 crc kubenswrapper[4763]: I1205 12:34:47.960484 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-ssh-key\") pod 
\"e3335529-4636-46d2-b949-1d02a4c43ee0\" (UID: \"e3335529-4636-46d2-b949-1d02a4c43ee0\") " Dec 05 12:34:47 crc kubenswrapper[4763]: I1205 12:34:47.967935 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "e3335529-4636-46d2-b949-1d02a4c43ee0" (UID: "e3335529-4636-46d2-b949-1d02a4c43ee0"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:34:47 crc kubenswrapper[4763]: I1205 12:34:47.969784 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3335529-4636-46d2-b949-1d02a4c43ee0-kube-api-access-mt799" (OuterVolumeSpecName: "kube-api-access-mt799") pod "e3335529-4636-46d2-b949-1d02a4c43ee0" (UID: "e3335529-4636-46d2-b949-1d02a4c43ee0"). InnerVolumeSpecName "kube-api-access-mt799". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:34:47 crc kubenswrapper[4763]: I1205 12:34:47.995913 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3335529-4636-46d2-b949-1d02a4c43ee0-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "e3335529-4636-46d2-b949-1d02a4c43ee0" (UID: "e3335529-4636-46d2-b949-1d02a4c43ee0"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.001620 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "e3335529-4636-46d2-b949-1d02a4c43ee0" (UID: "e3335529-4636-46d2-b949-1d02a4c43ee0"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.006538 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e3335529-4636-46d2-b949-1d02a4c43ee0" (UID: "e3335529-4636-46d2-b949-1d02a4c43ee0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.006726 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "e3335529-4636-46d2-b949-1d02a4c43ee0" (UID: "e3335529-4636-46d2-b949-1d02a4c43ee0"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.009752 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "e3335529-4636-46d2-b949-1d02a4c43ee0" (UID: "e3335529-4636-46d2-b949-1d02a4c43ee0"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.010999 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-inventory" (OuterVolumeSpecName: "inventory") pod "e3335529-4636-46d2-b949-1d02a4c43ee0" (UID: "e3335529-4636-46d2-b949-1d02a4c43ee0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.044895 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "e3335529-4636-46d2-b949-1d02a4c43ee0" (UID: "e3335529-4636-46d2-b949-1d02a4c43ee0"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.063132 4763 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.063195 4763 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.063209 4763 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.063220 4763 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.063231 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mt799\" (UniqueName: \"kubernetes.io/projected/e3335529-4636-46d2-b949-1d02a4c43ee0-kube-api-access-mt799\") on node \"crc\" DevicePath \"\"" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.063242 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.063256 4763 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e3335529-4636-46d2-b949-1d02a4c43ee0-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.063270 4763 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.063281 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3335529-4636-46d2-b949-1d02a4c43ee0-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.368916 4763 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mrrcd" event={"ID":"e3335529-4636-46d2-b949-1d02a4c43ee0","Type":"ContainerDied","Data":"151d526404e168dbb025cdfd3d7c1a344c6bc42eff2723ffbb4c7d6c572be47c"} Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.368969 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="151d526404e168dbb025cdfd3d7c1a344c6bc42eff2723ffbb4c7d6c572be47c" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.368981 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mrrcd" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.526131 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm"] Dec 05 12:34:48 crc kubenswrapper[4763]: E1205 12:34:48.526538 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6821cc3a-be0a-4973-98ae-3a8df44e6c26" containerName="extract-utilities" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.526557 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6821cc3a-be0a-4973-98ae-3a8df44e6c26" containerName="extract-utilities" Dec 05 12:34:48 crc kubenswrapper[4763]: E1205 12:34:48.526570 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6821cc3a-be0a-4973-98ae-3a8df44e6c26" containerName="registry-server" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.526577 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6821cc3a-be0a-4973-98ae-3a8df44e6c26" containerName="registry-server" Dec 05 12:34:48 crc kubenswrapper[4763]: E1205 12:34:48.526614 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3335529-4636-46d2-b949-1d02a4c43ee0" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.526622 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3335529-4636-46d2-b949-1d02a4c43ee0" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 05 12:34:48 crc kubenswrapper[4763]: E1205 12:34:48.526638 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6821cc3a-be0a-4973-98ae-3a8df44e6c26" containerName="extract-content" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.526644 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6821cc3a-be0a-4973-98ae-3a8df44e6c26" containerName="extract-content" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.526854 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="6821cc3a-be0a-4973-98ae-3a8df44e6c26" containerName="registry-server" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.526889 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3335529-4636-46d2-b949-1d02a4c43ee0" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.527559 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.530704 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-s7px5" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.531591 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.531605 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.532042 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.534325 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.546252 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm"] Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.574118 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm\" (UID: \"f5d27328-7e5a-4664-9c0a-ae5c063ec8b9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.574263 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm\" (UID: \"f5d27328-7e5a-4664-9c0a-ae5c063ec8b9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.574440 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm\" (UID: \"f5d27328-7e5a-4664-9c0a-ae5c063ec8b9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.574858 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm\" (UID: \"f5d27328-7e5a-4664-9c0a-ae5c063ec8b9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.574913 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmwtb\" (UniqueName: \"kubernetes.io/projected/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-kube-api-access-rmwtb\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm\" (UID: \"f5d27328-7e5a-4664-9c0a-ae5c063ec8b9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm" Dec 05 12:34:48 
crc kubenswrapper[4763]: I1205 12:34:48.575093 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm\" (UID: \"f5d27328-7e5a-4664-9c0a-ae5c063ec8b9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.575183 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm\" (UID: \"f5d27328-7e5a-4664-9c0a-ae5c063ec8b9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.676716 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm\" (UID: \"f5d27328-7e5a-4664-9c0a-ae5c063ec8b9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.676775 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmwtb\" (UniqueName: \"kubernetes.io/projected/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-kube-api-access-rmwtb\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm\" (UID: \"f5d27328-7e5a-4664-9c0a-ae5c063ec8b9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.676837 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm\" (UID: \"f5d27328-7e5a-4664-9c0a-ae5c063ec8b9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.676871 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm\" (UID: \"f5d27328-7e5a-4664-9c0a-ae5c063ec8b9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.676901 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm\" (UID: \"f5d27328-7e5a-4664-9c0a-ae5c063ec8b9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.676943 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm\" (UID: \"f5d27328-7e5a-4664-9c0a-ae5c063ec8b9\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.676977 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm\" (UID: \"f5d27328-7e5a-4664-9c0a-ae5c063ec8b9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.682085 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm\" (UID: \"f5d27328-7e5a-4664-9c0a-ae5c063ec8b9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.685839 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm\" (UID: \"f5d27328-7e5a-4664-9c0a-ae5c063ec8b9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.689404 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm\" (UID: \"f5d27328-7e5a-4664-9c0a-ae5c063ec8b9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.692190 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm\" (UID: \"f5d27328-7e5a-4664-9c0a-ae5c063ec8b9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.695121 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm\" (UID: \"f5d27328-7e5a-4664-9c0a-ae5c063ec8b9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.699276 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm\" (UID: \"f5d27328-7e5a-4664-9c0a-ae5c063ec8b9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm" Dec 05 12:34:48 crc kubenswrapper[4763]: I1205 12:34:48.702916 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmwtb\" (UniqueName: \"kubernetes.io/projected/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-kube-api-access-rmwtb\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm\" (UID: 
\"f5d27328-7e5a-4664-9c0a-ae5c063ec8b9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm" Dec 05 12:34:49 crc kubenswrapper[4763]: I1205 12:34:49.072571 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm" Dec 05 12:34:49 crc kubenswrapper[4763]: I1205 12:34:49.619621 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm"] Dec 05 12:34:50 crc kubenswrapper[4763]: I1205 12:34:50.389635 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm" event={"ID":"f5d27328-7e5a-4664-9c0a-ae5c063ec8b9","Type":"ContainerStarted","Data":"4f0112ba882282be8ec2f318df25f8124928d8d7e0533cadfdce491f18079f6e"} Dec 05 12:34:51 crc kubenswrapper[4763]: I1205 12:34:51.402002 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm" event={"ID":"f5d27328-7e5a-4664-9c0a-ae5c063ec8b9","Type":"ContainerStarted","Data":"2cc13987f41329d0a06183bf9120483f2880d42ef2568b1217deb0099ae757e1"} Dec 05 12:34:51 crc kubenswrapper[4763]: I1205 12:34:51.426736 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm" podStartSLOduration=2.940446983 podStartE2EDuration="3.426715412s" podCreationTimestamp="2025-12-05 12:34:48 +0000 UTC" firstStartedPulling="2025-12-05 12:34:49.626693682 +0000 UTC m=+2774.119408405" lastFinishedPulling="2025-12-05 12:34:50.112962111 +0000 UTC m=+2774.605676834" observedRunningTime="2025-12-05 12:34:51.420563945 +0000 UTC m=+2775.913278668" watchObservedRunningTime="2025-12-05 12:34:51.426715412 +0000 UTC m=+2775.919430125" Dec 05 12:35:07 crc kubenswrapper[4763]: I1205 12:35:07.544572 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 12:35:07 crc kubenswrapper[4763]: I1205 12:35:07.545572 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 12:35:36 crc kubenswrapper[4763]: I1205 12:35:36.416716 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cxkph"] Dec 05 12:35:36 crc kubenswrapper[4763]: I1205 12:35:36.423232 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cxkph" Dec 05 12:35:36 crc kubenswrapper[4763]: I1205 12:35:36.442590 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cxkph"] Dec 05 12:35:36 crc kubenswrapper[4763]: I1205 12:35:36.586409 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nclp\" (UniqueName: \"kubernetes.io/projected/fd76742e-bdf7-40da-bad8-98a6b4436d94-kube-api-access-2nclp\") pod \"redhat-marketplace-cxkph\" (UID: \"fd76742e-bdf7-40da-bad8-98a6b4436d94\") " pod="openshift-marketplace/redhat-marketplace-cxkph" Dec 05 12:35:36 crc kubenswrapper[4763]: I1205 12:35:36.586466 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd76742e-bdf7-40da-bad8-98a6b4436d94-utilities\") pod \"redhat-marketplace-cxkph\" (UID: \"fd76742e-bdf7-40da-bad8-98a6b4436d94\") " pod="openshift-marketplace/redhat-marketplace-cxkph" Dec 05 12:35:36 crc kubenswrapper[4763]: I1205 12:35:36.586578 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd76742e-bdf7-40da-bad8-98a6b4436d94-catalog-content\") pod \"redhat-marketplace-cxkph\" (UID: \"fd76742e-bdf7-40da-bad8-98a6b4436d94\") " pod="openshift-marketplace/redhat-marketplace-cxkph" Dec 05 12:35:36 crc kubenswrapper[4763]: I1205 12:35:36.688809 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd76742e-bdf7-40da-bad8-98a6b4436d94-catalog-content\") pod \"redhat-marketplace-cxkph\" (UID: \"fd76742e-bdf7-40da-bad8-98a6b4436d94\") " pod="openshift-marketplace/redhat-marketplace-cxkph" Dec 05 12:35:36 crc kubenswrapper[4763]: I1205 12:35:36.688966 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nclp\" (UniqueName: \"kubernetes.io/projected/fd76742e-bdf7-40da-bad8-98a6b4436d94-kube-api-access-2nclp\") pod \"redhat-marketplace-cxkph\" (UID: \"fd76742e-bdf7-40da-bad8-98a6b4436d94\") " pod="openshift-marketplace/redhat-marketplace-cxkph" Dec 05 12:35:36 crc kubenswrapper[4763]: I1205 12:35:36.688993 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd76742e-bdf7-40da-bad8-98a6b4436d94-utilities\") pod \"redhat-marketplace-cxkph\" (UID: \"fd76742e-bdf7-40da-bad8-98a6b4436d94\") " pod="openshift-marketplace/redhat-marketplace-cxkph" Dec 05 12:35:36 crc kubenswrapper[4763]: I1205 12:35:36.689954 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd76742e-bdf7-40da-bad8-98a6b4436d94-utilities\") pod \"redhat-marketplace-cxkph\" (UID: \"fd76742e-bdf7-40da-bad8-98a6b4436d94\") " pod="openshift-marketplace/redhat-marketplace-cxkph" Dec 05 12:35:36 crc kubenswrapper[4763]: I1205 12:35:36.689960 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd76742e-bdf7-40da-bad8-98a6b4436d94-catalog-content\") pod \"redhat-marketplace-cxkph\" (UID: \"fd76742e-bdf7-40da-bad8-98a6b4436d94\") " pod="openshift-marketplace/redhat-marketplace-cxkph" Dec 05 12:35:36 crc kubenswrapper[4763]: I1205 12:35:36.716474 4763 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-2nclp\" (UniqueName: \"kubernetes.io/projected/fd76742e-bdf7-40da-bad8-98a6b4436d94-kube-api-access-2nclp\") pod \"redhat-marketplace-cxkph\" (UID: \"fd76742e-bdf7-40da-bad8-98a6b4436d94\") " pod="openshift-marketplace/redhat-marketplace-cxkph" Dec 05 12:35:36 crc kubenswrapper[4763]: I1205 12:35:36.751274 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cxkph" Dec 05 12:35:37 crc kubenswrapper[4763]: I1205 12:35:37.353236 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cxkph"] Dec 05 12:35:37 crc kubenswrapper[4763]: I1205 12:35:37.544910 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 12:35:37 crc kubenswrapper[4763]: I1205 12:35:37.545327 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 12:35:37 crc kubenswrapper[4763]: I1205 12:35:37.545380 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" Dec 05 12:35:37 crc kubenswrapper[4763]: I1205 12:35:37.546218 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9235b49df516b88937632a754bf0fe47bb84aad68ded096bb5e12cefccd93a73"} pod="openshift-machine-config-operator/machine-config-daemon-xpgln" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 12:35:37 crc kubenswrapper[4763]: I1205 12:35:37.546267 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" containerID="cri-o://9235b49df516b88937632a754bf0fe47bb84aad68ded096bb5e12cefccd93a73" gracePeriod=600 Dec 05 12:35:37 crc kubenswrapper[4763]: I1205 12:35:37.861490 4763 generic.go:334] "Generic (PLEG): container finished" podID="fd76742e-bdf7-40da-bad8-98a6b4436d94" containerID="a972283599fef7881eb23fba7bc76fd7ab15acba84becf6ac8fd453727f133b0" exitCode=0 Dec 05 12:35:37 crc kubenswrapper[4763]: I1205 12:35:37.861552 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxkph" event={"ID":"fd76742e-bdf7-40da-bad8-98a6b4436d94","Type":"ContainerDied","Data":"a972283599fef7881eb23fba7bc76fd7ab15acba84becf6ac8fd453727f133b0"} Dec 05 12:35:37 crc kubenswrapper[4763]: I1205 12:35:37.861910 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxkph" event={"ID":"fd76742e-bdf7-40da-bad8-98a6b4436d94","Type":"ContainerStarted","Data":"b5ba64670a947d95bf424aad61ecb493cdb3348a88ec760aaefd1e79747dce5b"} Dec 05 12:35:37 crc kubenswrapper[4763]: I1205 12:35:37.866288 4763 generic.go:334] "Generic (PLEG): container finished" podID="96338136-6831-49d0-9eb9-77d1205c6afb" 
containerID="9235b49df516b88937632a754bf0fe47bb84aad68ded096bb5e12cefccd93a73" exitCode=0 Dec 05 12:35:37 crc kubenswrapper[4763]: I1205 12:35:37.866335 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerDied","Data":"9235b49df516b88937632a754bf0fe47bb84aad68ded096bb5e12cefccd93a73"} Dec 05 12:35:37 crc kubenswrapper[4763]: I1205 12:35:37.866381 4763 scope.go:117] "RemoveContainer" containerID="ce2d3cf073a51bbeacc4e57eda2a379937fbbfe390bdbb19538e8b18beb183ad" Dec 05 12:35:38 crc kubenswrapper[4763]: I1205 12:35:38.884880 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerStarted","Data":"076a320d83ff63935b6e470f29dca4488cd7427d5318bd6be5b9f80791b1e029"} Dec 05 12:35:39 crc kubenswrapper[4763]: I1205 12:35:39.896238 4763 generic.go:334] "Generic (PLEG): container finished" podID="fd76742e-bdf7-40da-bad8-98a6b4436d94" containerID="b6597856ba2ccbed348676a6bdcbd5dd2feb1c3cae276d14d0f359ad471c9998" exitCode=0 Dec 05 12:35:39 crc kubenswrapper[4763]: I1205 12:35:39.896338 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxkph" event={"ID":"fd76742e-bdf7-40da-bad8-98a6b4436d94","Type":"ContainerDied","Data":"b6597856ba2ccbed348676a6bdcbd5dd2feb1c3cae276d14d0f359ad471c9998"} Dec 05 12:35:40 crc kubenswrapper[4763]: I1205 12:35:40.907732 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxkph" event={"ID":"fd76742e-bdf7-40da-bad8-98a6b4436d94","Type":"ContainerStarted","Data":"f521d85602710d4678f956720600eb6dc21f705581e0b1d12efa68f39a2ddeb3"} Dec 05 12:35:40 crc kubenswrapper[4763]: I1205 12:35:40.930922 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cxkph" podStartSLOduration=2.522085389 podStartE2EDuration="4.930895399s" podCreationTimestamp="2025-12-05 12:35:36 +0000 UTC" firstStartedPulling="2025-12-05 12:35:37.863845936 +0000 UTC m=+2822.356560679" lastFinishedPulling="2025-12-05 12:35:40.272655966 +0000 UTC m=+2824.765370689" observedRunningTime="2025-12-05 12:35:40.928103663 +0000 UTC m=+2825.420818396" watchObservedRunningTime="2025-12-05 12:35:40.930895399 +0000 UTC m=+2825.423610122" Dec 05 12:35:46 crc kubenswrapper[4763]: I1205 12:35:46.752969 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cxkph" Dec 05 12:35:46 crc kubenswrapper[4763]: I1205 12:35:46.754090 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cxkph" Dec 05 12:35:46 crc kubenswrapper[4763]: I1205 12:35:46.832095 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cxkph" Dec 05 12:35:47 crc kubenswrapper[4763]: I1205 12:35:47.036724 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cxkph" Dec 05 12:35:47 crc kubenswrapper[4763]: I1205 12:35:47.092560 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cxkph"] Dec 05 12:35:49 crc kubenswrapper[4763]: I1205 12:35:49.000369 4763 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-cxkph" podUID="fd76742e-bdf7-40da-bad8-98a6b4436d94" containerName="registry-server" containerID="cri-o://f521d85602710d4678f956720600eb6dc21f705581e0b1d12efa68f39a2ddeb3" gracePeriod=2 Dec 05 12:35:49 crc kubenswrapper[4763]: I1205 12:35:49.797435 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cxkph" Dec 05 12:35:49 crc kubenswrapper[4763]: I1205 12:35:49.984574 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd76742e-bdf7-40da-bad8-98a6b4436d94-utilities\") pod \"fd76742e-bdf7-40da-bad8-98a6b4436d94\" (UID: \"fd76742e-bdf7-40da-bad8-98a6b4436d94\") " Dec 05 12:35:49 crc kubenswrapper[4763]: I1205 12:35:49.984726 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nclp\" (UniqueName: \"kubernetes.io/projected/fd76742e-bdf7-40da-bad8-98a6b4436d94-kube-api-access-2nclp\") pod \"fd76742e-bdf7-40da-bad8-98a6b4436d94\" (UID: \"fd76742e-bdf7-40da-bad8-98a6b4436d94\") " Dec 05 12:35:49 crc kubenswrapper[4763]: I1205 12:35:49.984878 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd76742e-bdf7-40da-bad8-98a6b4436d94-catalog-content\") pod \"fd76742e-bdf7-40da-bad8-98a6b4436d94\" (UID: \"fd76742e-bdf7-40da-bad8-98a6b4436d94\") " Dec 05 12:35:49 crc kubenswrapper[4763]: I1205 12:35:49.985673 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd76742e-bdf7-40da-bad8-98a6b4436d94-utilities" (OuterVolumeSpecName: "utilities") pod "fd76742e-bdf7-40da-bad8-98a6b4436d94" (UID: "fd76742e-bdf7-40da-bad8-98a6b4436d94"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:35:49 crc kubenswrapper[4763]: I1205 12:35:49.992491 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd76742e-bdf7-40da-bad8-98a6b4436d94-kube-api-access-2nclp" (OuterVolumeSpecName: "kube-api-access-2nclp") pod "fd76742e-bdf7-40da-bad8-98a6b4436d94" (UID: "fd76742e-bdf7-40da-bad8-98a6b4436d94"). InnerVolumeSpecName "kube-api-access-2nclp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:35:50 crc kubenswrapper[4763]: I1205 12:35:50.009445 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd76742e-bdf7-40da-bad8-98a6b4436d94-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fd76742e-bdf7-40da-bad8-98a6b4436d94" (UID: "fd76742e-bdf7-40da-bad8-98a6b4436d94"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:35:50 crc kubenswrapper[4763]: I1205 12:35:50.015629 4763 generic.go:334] "Generic (PLEG): container finished" podID="fd76742e-bdf7-40da-bad8-98a6b4436d94" containerID="f521d85602710d4678f956720600eb6dc21f705581e0b1d12efa68f39a2ddeb3" exitCode=0 Dec 05 12:35:50 crc kubenswrapper[4763]: I1205 12:35:50.015696 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxkph" event={"ID":"fd76742e-bdf7-40da-bad8-98a6b4436d94","Type":"ContainerDied","Data":"f521d85602710d4678f956720600eb6dc21f705581e0b1d12efa68f39a2ddeb3"} Dec 05 12:35:50 crc kubenswrapper[4763]: I1205 12:35:50.015709 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cxkph" Dec 05 12:35:50 crc kubenswrapper[4763]: I1205 12:35:50.015736 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxkph" event={"ID":"fd76742e-bdf7-40da-bad8-98a6b4436d94","Type":"ContainerDied","Data":"b5ba64670a947d95bf424aad61ecb493cdb3348a88ec760aaefd1e79747dce5b"} Dec 05 12:35:50 crc kubenswrapper[4763]: I1205 12:35:50.015791 4763 scope.go:117] "RemoveContainer" containerID="f521d85602710d4678f956720600eb6dc21f705581e0b1d12efa68f39a2ddeb3" Dec 05 12:35:50 crc kubenswrapper[4763]: I1205 12:35:50.057552 4763 scope.go:117] "RemoveContainer" containerID="b6597856ba2ccbed348676a6bdcbd5dd2feb1c3cae276d14d0f359ad471c9998" Dec 05 12:35:50 crc kubenswrapper[4763]: I1205 12:35:50.059968 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cxkph"] Dec 05 12:35:50 crc kubenswrapper[4763]: I1205 12:35:50.070264 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cxkph"] Dec 05 12:35:50 crc kubenswrapper[4763]: I1205 12:35:50.087205 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2nclp\" (UniqueName: \"kubernetes.io/projected/fd76742e-bdf7-40da-bad8-98a6b4436d94-kube-api-access-2nclp\") on node \"crc\" DevicePath \"\"" Dec 05 12:35:50 crc kubenswrapper[4763]: I1205 12:35:50.087242 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd76742e-bdf7-40da-bad8-98a6b4436d94-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 12:35:50 crc kubenswrapper[4763]: I1205 12:35:50.087251 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd76742e-bdf7-40da-bad8-98a6b4436d94-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 12:35:50 crc kubenswrapper[4763]: I1205 12:35:50.091436 4763 scope.go:117] "RemoveContainer" containerID="a972283599fef7881eb23fba7bc76fd7ab15acba84becf6ac8fd453727f133b0" Dec 05 12:35:50 crc kubenswrapper[4763]: I1205 12:35:50.137400 4763 scope.go:117] "RemoveContainer" containerID="f521d85602710d4678f956720600eb6dc21f705581e0b1d12efa68f39a2ddeb3" Dec 05 12:35:50 crc kubenswrapper[4763]: E1205 12:35:50.140936 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f521d85602710d4678f956720600eb6dc21f705581e0b1d12efa68f39a2ddeb3\": container with ID starting with f521d85602710d4678f956720600eb6dc21f705581e0b1d12efa68f39a2ddeb3 not found: ID does not exist" containerID="f521d85602710d4678f956720600eb6dc21f705581e0b1d12efa68f39a2ddeb3" Dec 05 12:35:50 crc kubenswrapper[4763]: I1205 12:35:50.140978 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f521d85602710d4678f956720600eb6dc21f705581e0b1d12efa68f39a2ddeb3"} err="failed to get container status \"f521d85602710d4678f956720600eb6dc21f705581e0b1d12efa68f39a2ddeb3\": rpc error: code = NotFound desc = could not find container \"f521d85602710d4678f956720600eb6dc21f705581e0b1d12efa68f39a2ddeb3\": container with ID starting with f521d85602710d4678f956720600eb6dc21f705581e0b1d12efa68f39a2ddeb3 not found: ID does not exist" Dec 05 12:35:50 crc kubenswrapper[4763]: I1205 12:35:50.141003 4763 scope.go:117] "RemoveContainer" containerID="b6597856ba2ccbed348676a6bdcbd5dd2feb1c3cae276d14d0f359ad471c9998" Dec 05 12:35:50 crc kubenswrapper[4763]: E1205 
12:35:50.141270 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6597856ba2ccbed348676a6bdcbd5dd2feb1c3cae276d14d0f359ad471c9998\": container with ID starting with b6597856ba2ccbed348676a6bdcbd5dd2feb1c3cae276d14d0f359ad471c9998 not found: ID does not exist" containerID="b6597856ba2ccbed348676a6bdcbd5dd2feb1c3cae276d14d0f359ad471c9998" Dec 05 12:35:50 crc kubenswrapper[4763]: I1205 12:35:50.141294 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6597856ba2ccbed348676a6bdcbd5dd2feb1c3cae276d14d0f359ad471c9998"} err="failed to get container status \"b6597856ba2ccbed348676a6bdcbd5dd2feb1c3cae276d14d0f359ad471c9998\": rpc error: code = NotFound desc = could not find container \"b6597856ba2ccbed348676a6bdcbd5dd2feb1c3cae276d14d0f359ad471c9998\": container with ID starting with b6597856ba2ccbed348676a6bdcbd5dd2feb1c3cae276d14d0f359ad471c9998 not found: ID does not exist" Dec 05 12:35:50 crc kubenswrapper[4763]: I1205 12:35:50.141310 4763 scope.go:117] "RemoveContainer" containerID="a972283599fef7881eb23fba7bc76fd7ab15acba84becf6ac8fd453727f133b0" Dec 05 12:35:50 crc kubenswrapper[4763]: E1205 12:35:50.141587 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a972283599fef7881eb23fba7bc76fd7ab15acba84becf6ac8fd453727f133b0\": container with ID starting with a972283599fef7881eb23fba7bc76fd7ab15acba84becf6ac8fd453727f133b0 not found: ID does not exist" containerID="a972283599fef7881eb23fba7bc76fd7ab15acba84becf6ac8fd453727f133b0" Dec 05 12:35:50 crc kubenswrapper[4763]: I1205 12:35:50.141613 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a972283599fef7881eb23fba7bc76fd7ab15acba84becf6ac8fd453727f133b0"} err="failed to get container status \"a972283599fef7881eb23fba7bc76fd7ab15acba84becf6ac8fd453727f133b0\": rpc error: code = NotFound desc = could not find container \"a972283599fef7881eb23fba7bc76fd7ab15acba84becf6ac8fd453727f133b0\": container with ID starting with a972283599fef7881eb23fba7bc76fd7ab15acba84becf6ac8fd453727f133b0 not found: ID does not exist" Dec 05 12:35:51 crc kubenswrapper[4763]: I1205 12:35:51.795652 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd76742e-bdf7-40da-bad8-98a6b4436d94" path="/var/lib/kubelet/pods/fd76742e-bdf7-40da-bad8-98a6b4436d94/volumes" Dec 05 12:36:10 crc kubenswrapper[4763]: I1205 12:36:10.963837 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x9npb"] Dec 05 12:36:10 crc kubenswrapper[4763]: E1205 12:36:10.964912 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd76742e-bdf7-40da-bad8-98a6b4436d94" containerName="extract-content" Dec 05 12:36:10 crc kubenswrapper[4763]: I1205 12:36:10.964931 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd76742e-bdf7-40da-bad8-98a6b4436d94" containerName="extract-content" Dec 05 12:36:10 crc kubenswrapper[4763]: E1205 12:36:10.964994 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd76742e-bdf7-40da-bad8-98a6b4436d94" containerName="registry-server" Dec 05 12:36:10 crc kubenswrapper[4763]: I1205 12:36:10.965004 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd76742e-bdf7-40da-bad8-98a6b4436d94" containerName="registry-server" Dec 05 12:36:10 crc kubenswrapper[4763]: E1205 12:36:10.965032 4763 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="fd76742e-bdf7-40da-bad8-98a6b4436d94" containerName="extract-utilities" Dec 05 12:36:10 crc kubenswrapper[4763]: I1205 12:36:10.965040 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd76742e-bdf7-40da-bad8-98a6b4436d94" containerName="extract-utilities" Dec 05 12:36:10 crc kubenswrapper[4763]: I1205 12:36:10.965263 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd76742e-bdf7-40da-bad8-98a6b4436d94" containerName="registry-server" Dec 05 12:36:10 crc kubenswrapper[4763]: I1205 12:36:10.966998 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x9npb" Dec 05 12:36:11 crc kubenswrapper[4763]: I1205 12:36:10.986926 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x9npb"] Dec 05 12:36:11 crc kubenswrapper[4763]: I1205 12:36:11.132360 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1b31587-9975-4a79-9b24-a2b2d90ab934-catalog-content\") pod \"certified-operators-x9npb\" (UID: \"b1b31587-9975-4a79-9b24-a2b2d90ab934\") " pod="openshift-marketplace/certified-operators-x9npb" Dec 05 12:36:11 crc kubenswrapper[4763]: I1205 12:36:11.132714 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1b31587-9975-4a79-9b24-a2b2d90ab934-utilities\") pod \"certified-operators-x9npb\" (UID: \"b1b31587-9975-4a79-9b24-a2b2d90ab934\") " pod="openshift-marketplace/certified-operators-x9npb" Dec 05 12:36:11 crc kubenswrapper[4763]: I1205 12:36:11.132832 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69cgx\" (UniqueName: \"kubernetes.io/projected/b1b31587-9975-4a79-9b24-a2b2d90ab934-kube-api-access-69cgx\") pod \"certified-operators-x9npb\" (UID: \"b1b31587-9975-4a79-9b24-a2b2d90ab934\") " pod="openshift-marketplace/certified-operators-x9npb" Dec 05 12:36:11 crc kubenswrapper[4763]: I1205 12:36:11.168747 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nd6zn"] Dec 05 12:36:11 crc kubenswrapper[4763]: I1205 12:36:11.171618 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nd6zn" Dec 05 12:36:11 crc kubenswrapper[4763]: I1205 12:36:11.187393 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nd6zn"] Dec 05 12:36:11 crc kubenswrapper[4763]: I1205 12:36:11.234992 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1b31587-9975-4a79-9b24-a2b2d90ab934-catalog-content\") pod \"certified-operators-x9npb\" (UID: \"b1b31587-9975-4a79-9b24-a2b2d90ab934\") " pod="openshift-marketplace/certified-operators-x9npb" Dec 05 12:36:11 crc kubenswrapper[4763]: I1205 12:36:11.235324 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1b31587-9975-4a79-9b24-a2b2d90ab934-utilities\") pod \"certified-operators-x9npb\" (UID: \"b1b31587-9975-4a79-9b24-a2b2d90ab934\") " pod="openshift-marketplace/certified-operators-x9npb" Dec 05 12:36:11 crc kubenswrapper[4763]: I1205 12:36:11.235432 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69cgx\" (UniqueName: \"kubernetes.io/projected/b1b31587-9975-4a79-9b24-a2b2d90ab934-kube-api-access-69cgx\") pod \"certified-operators-x9npb\" (UID: \"b1b31587-9975-4a79-9b24-a2b2d90ab934\") " pod="openshift-marketplace/certified-operators-x9npb" Dec 05 12:36:11 crc kubenswrapper[4763]: I1205 12:36:11.235577 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1b31587-9975-4a79-9b24-a2b2d90ab934-catalog-content\") pod \"certified-operators-x9npb\" (UID: \"b1b31587-9975-4a79-9b24-a2b2d90ab934\") " pod="openshift-marketplace/certified-operators-x9npb" Dec 05 12:36:11 crc kubenswrapper[4763]: I1205 12:36:11.236007 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1b31587-9975-4a79-9b24-a2b2d90ab934-utilities\") pod \"certified-operators-x9npb\" (UID: \"b1b31587-9975-4a79-9b24-a2b2d90ab934\") " pod="openshift-marketplace/certified-operators-x9npb" Dec 05 12:36:11 crc kubenswrapper[4763]: I1205 12:36:11.262801 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69cgx\" (UniqueName: \"kubernetes.io/projected/b1b31587-9975-4a79-9b24-a2b2d90ab934-kube-api-access-69cgx\") pod \"certified-operators-x9npb\" (UID: \"b1b31587-9975-4a79-9b24-a2b2d90ab934\") " pod="openshift-marketplace/certified-operators-x9npb" Dec 05 12:36:11 crc kubenswrapper[4763]: I1205 12:36:11.327019 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x9npb" Dec 05 12:36:11 crc kubenswrapper[4763]: I1205 12:36:11.337998 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85e7ced8-bd66-4bf4-9550-a61fc6c3af14-catalog-content\") pod \"community-operators-nd6zn\" (UID: \"85e7ced8-bd66-4bf4-9550-a61fc6c3af14\") " pod="openshift-marketplace/community-operators-nd6zn" Dec 05 12:36:11 crc kubenswrapper[4763]: I1205 12:36:11.338287 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85e7ced8-bd66-4bf4-9550-a61fc6c3af14-utilities\") pod \"community-operators-nd6zn\" (UID: \"85e7ced8-bd66-4bf4-9550-a61fc6c3af14\") " pod="openshift-marketplace/community-operators-nd6zn" Dec 05 12:36:11 crc kubenswrapper[4763]: I1205 12:36:11.338422 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hcbg\" (UniqueName: \"kubernetes.io/projected/85e7ced8-bd66-4bf4-9550-a61fc6c3af14-kube-api-access-5hcbg\") pod \"community-operators-nd6zn\" (UID: \"85e7ced8-bd66-4bf4-9550-a61fc6c3af14\") " pod="openshift-marketplace/community-operators-nd6zn" Dec 05 12:36:11 crc kubenswrapper[4763]: I1205 12:36:11.440522 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85e7ced8-bd66-4bf4-9550-a61fc6c3af14-catalog-content\") pod \"community-operators-nd6zn\" (UID: \"85e7ced8-bd66-4bf4-9550-a61fc6c3af14\") " pod="openshift-marketplace/community-operators-nd6zn" Dec 05 12:36:11 crc kubenswrapper[4763]: I1205 12:36:11.440632 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85e7ced8-bd66-4bf4-9550-a61fc6c3af14-utilities\") pod \"community-operators-nd6zn\" (UID: \"85e7ced8-bd66-4bf4-9550-a61fc6c3af14\") " pod="openshift-marketplace/community-operators-nd6zn" Dec 05 12:36:11 crc kubenswrapper[4763]: I1205 12:36:11.440674 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hcbg\" (UniqueName: \"kubernetes.io/projected/85e7ced8-bd66-4bf4-9550-a61fc6c3af14-kube-api-access-5hcbg\") pod \"community-operators-nd6zn\" (UID: \"85e7ced8-bd66-4bf4-9550-a61fc6c3af14\") " pod="openshift-marketplace/community-operators-nd6zn" Dec 05 12:36:11 crc kubenswrapper[4763]: I1205 12:36:11.441434 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85e7ced8-bd66-4bf4-9550-a61fc6c3af14-catalog-content\") pod \"community-operators-nd6zn\" (UID: \"85e7ced8-bd66-4bf4-9550-a61fc6c3af14\") " pod="openshift-marketplace/community-operators-nd6zn" Dec 05 12:36:11 crc kubenswrapper[4763]: I1205 12:36:11.441511 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85e7ced8-bd66-4bf4-9550-a61fc6c3af14-utilities\") pod \"community-operators-nd6zn\" (UID: \"85e7ced8-bd66-4bf4-9550-a61fc6c3af14\") " pod="openshift-marketplace/community-operators-nd6zn" Dec 05 12:36:11 crc kubenswrapper[4763]: I1205 12:36:11.475649 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hcbg\" (UniqueName: \"kubernetes.io/projected/85e7ced8-bd66-4bf4-9550-a61fc6c3af14-kube-api-access-5hcbg\") pod 
\"community-operators-nd6zn\" (UID: \"85e7ced8-bd66-4bf4-9550-a61fc6c3af14\") " pod="openshift-marketplace/community-operators-nd6zn" Dec 05 12:36:11 crc kubenswrapper[4763]: I1205 12:36:11.497340 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nd6zn" Dec 05 12:36:11 crc kubenswrapper[4763]: I1205 12:36:11.929932 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x9npb"] Dec 05 12:36:12 crc kubenswrapper[4763]: I1205 12:36:12.191850 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nd6zn"] Dec 05 12:36:12 crc kubenswrapper[4763]: W1205 12:36:12.209158 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85e7ced8_bd66_4bf4_9550_a61fc6c3af14.slice/crio-225a979b485c672586bbd6de5a32ea1433fb50539701a2f22b69a332440b2cb6 WatchSource:0}: Error finding container 225a979b485c672586bbd6de5a32ea1433fb50539701a2f22b69a332440b2cb6: Status 404 returned error can't find the container with id 225a979b485c672586bbd6de5a32ea1433fb50539701a2f22b69a332440b2cb6 Dec 05 12:36:12 crc kubenswrapper[4763]: I1205 12:36:12.248642 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nd6zn" event={"ID":"85e7ced8-bd66-4bf4-9550-a61fc6c3af14","Type":"ContainerStarted","Data":"225a979b485c672586bbd6de5a32ea1433fb50539701a2f22b69a332440b2cb6"} Dec 05 12:36:12 crc kubenswrapper[4763]: I1205 12:36:12.251584 4763 generic.go:334] "Generic (PLEG): container finished" podID="b1b31587-9975-4a79-9b24-a2b2d90ab934" containerID="632de4c0d7a07948c57f868ada79979318ee567925d4cec9c53e8b2b22bf2783" exitCode=0 Dec 05 12:36:12 crc kubenswrapper[4763]: I1205 12:36:12.251630 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x9npb" event={"ID":"b1b31587-9975-4a79-9b24-a2b2d90ab934","Type":"ContainerDied","Data":"632de4c0d7a07948c57f868ada79979318ee567925d4cec9c53e8b2b22bf2783"} Dec 05 12:36:12 crc kubenswrapper[4763]: I1205 12:36:12.251659 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x9npb" event={"ID":"b1b31587-9975-4a79-9b24-a2b2d90ab934","Type":"ContainerStarted","Data":"7ff72848fe588ad04902334088bddb05d23749403d0c7a75872c0b494d8d050d"} Dec 05 12:36:13 crc kubenswrapper[4763]: I1205 12:36:13.281565 4763 generic.go:334] "Generic (PLEG): container finished" podID="85e7ced8-bd66-4bf4-9550-a61fc6c3af14" containerID="001c58edb47b4f335dfb3909959a90738016e45e431dee7675a3b9156fcd4edf" exitCode=0 Dec 05 12:36:13 crc kubenswrapper[4763]: I1205 12:36:13.281821 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nd6zn" event={"ID":"85e7ced8-bd66-4bf4-9550-a61fc6c3af14","Type":"ContainerDied","Data":"001c58edb47b4f335dfb3909959a90738016e45e431dee7675a3b9156fcd4edf"} Dec 05 12:36:13 crc kubenswrapper[4763]: I1205 12:36:13.288578 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x9npb" event={"ID":"b1b31587-9975-4a79-9b24-a2b2d90ab934","Type":"ContainerStarted","Data":"5f7432ca4210a87136c11ad2d8a113d848ec7fecc545e42531d925be6d52d75c"} Dec 05 12:36:14 crc kubenswrapper[4763]: I1205 12:36:14.301403 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nd6zn" 
event={"ID":"85e7ced8-bd66-4bf4-9550-a61fc6c3af14","Type":"ContainerStarted","Data":"9dff2e951c3861f865214d5bc0c2e123beffac07c60ec2f5ed29fa9df57e5aa9"} Dec 05 12:36:14 crc kubenswrapper[4763]: I1205 12:36:14.303849 4763 generic.go:334] "Generic (PLEG): container finished" podID="b1b31587-9975-4a79-9b24-a2b2d90ab934" containerID="5f7432ca4210a87136c11ad2d8a113d848ec7fecc545e42531d925be6d52d75c" exitCode=0 Dec 05 12:36:14 crc kubenswrapper[4763]: I1205 12:36:14.303890 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x9npb" event={"ID":"b1b31587-9975-4a79-9b24-a2b2d90ab934","Type":"ContainerDied","Data":"5f7432ca4210a87136c11ad2d8a113d848ec7fecc545e42531d925be6d52d75c"} Dec 05 12:36:16 crc kubenswrapper[4763]: I1205 12:36:16.325878 4763 generic.go:334] "Generic (PLEG): container finished" podID="85e7ced8-bd66-4bf4-9550-a61fc6c3af14" containerID="9dff2e951c3861f865214d5bc0c2e123beffac07c60ec2f5ed29fa9df57e5aa9" exitCode=0 Dec 05 12:36:16 crc kubenswrapper[4763]: I1205 12:36:16.325925 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nd6zn" event={"ID":"85e7ced8-bd66-4bf4-9550-a61fc6c3af14","Type":"ContainerDied","Data":"9dff2e951c3861f865214d5bc0c2e123beffac07c60ec2f5ed29fa9df57e5aa9"} Dec 05 12:36:16 crc kubenswrapper[4763]: I1205 12:36:16.329267 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x9npb" event={"ID":"b1b31587-9975-4a79-9b24-a2b2d90ab934","Type":"ContainerStarted","Data":"7df041547666b389fdc07b0200e5bd0b2c3868b1a7b35a64eaf36c46ddf6a909"} Dec 05 12:36:16 crc kubenswrapper[4763]: I1205 12:36:16.373647 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x9npb" podStartSLOduration=3.621550942 podStartE2EDuration="6.373625893s" podCreationTimestamp="2025-12-05 12:36:10 +0000 UTC" firstStartedPulling="2025-12-05 12:36:12.255961329 +0000 UTC m=+2856.748676052" lastFinishedPulling="2025-12-05 12:36:15.00803628 +0000 UTC m=+2859.500751003" observedRunningTime="2025-12-05 12:36:16.365343857 +0000 UTC m=+2860.858058570" watchObservedRunningTime="2025-12-05 12:36:16.373625893 +0000 UTC m=+2860.866340626" Dec 05 12:36:17 crc kubenswrapper[4763]: I1205 12:36:17.344077 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nd6zn" event={"ID":"85e7ced8-bd66-4bf4-9550-a61fc6c3af14","Type":"ContainerStarted","Data":"09513ec62ca1f92d35af9a125a4cffd609de3f5b37709bd617d9d27815802445"} Dec 05 12:36:17 crc kubenswrapper[4763]: I1205 12:36:17.366680 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nd6zn" podStartSLOduration=2.679832844 podStartE2EDuration="6.366662006s" podCreationTimestamp="2025-12-05 12:36:11 +0000 UTC" firstStartedPulling="2025-12-05 12:36:13.284657843 +0000 UTC m=+2857.777372566" lastFinishedPulling="2025-12-05 12:36:16.971487005 +0000 UTC m=+2861.464201728" observedRunningTime="2025-12-05 12:36:17.365127394 +0000 UTC m=+2861.857842117" watchObservedRunningTime="2025-12-05 12:36:17.366662006 +0000 UTC m=+2861.859376729" Dec 05 12:36:21 crc kubenswrapper[4763]: I1205 12:36:21.327680 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-x9npb" Dec 05 12:36:21 crc kubenswrapper[4763]: I1205 12:36:21.328524 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-x9npb" Dec 05 12:36:21 crc kubenswrapper[4763]: I1205 12:36:21.370163 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x9npb" Dec 05 12:36:21 crc kubenswrapper[4763]: I1205 12:36:21.430289 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-x9npb" Dec 05 12:36:21 crc kubenswrapper[4763]: I1205 12:36:21.498613 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nd6zn" Dec 05 12:36:21 crc kubenswrapper[4763]: I1205 12:36:21.499624 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nd6zn" Dec 05 12:36:21 crc kubenswrapper[4763]: I1205 12:36:21.549475 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nd6zn" Dec 05 12:36:21 crc kubenswrapper[4763]: I1205 12:36:21.756037 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x9npb"] Dec 05 12:36:22 crc kubenswrapper[4763]: I1205 12:36:22.431128 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nd6zn" Dec 05 12:36:23 crc kubenswrapper[4763]: I1205 12:36:23.395066 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-x9npb" podUID="b1b31587-9975-4a79-9b24-a2b2d90ab934" containerName="registry-server" containerID="cri-o://7df041547666b389fdc07b0200e5bd0b2c3868b1a7b35a64eaf36c46ddf6a909" gracePeriod=2 Dec 05 12:36:23 crc kubenswrapper[4763]: I1205 12:36:23.882081 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x9npb" Dec 05 12:36:23 crc kubenswrapper[4763]: I1205 12:36:23.976682 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nd6zn"] Dec 05 12:36:24 crc kubenswrapper[4763]: I1205 12:36:24.024528 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1b31587-9975-4a79-9b24-a2b2d90ab934-catalog-content\") pod \"b1b31587-9975-4a79-9b24-a2b2d90ab934\" (UID: \"b1b31587-9975-4a79-9b24-a2b2d90ab934\") " Dec 05 12:36:24 crc kubenswrapper[4763]: I1205 12:36:24.024614 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69cgx\" (UniqueName: \"kubernetes.io/projected/b1b31587-9975-4a79-9b24-a2b2d90ab934-kube-api-access-69cgx\") pod \"b1b31587-9975-4a79-9b24-a2b2d90ab934\" (UID: \"b1b31587-9975-4a79-9b24-a2b2d90ab934\") " Dec 05 12:36:24 crc kubenswrapper[4763]: I1205 12:36:24.024641 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1b31587-9975-4a79-9b24-a2b2d90ab934-utilities\") pod \"b1b31587-9975-4a79-9b24-a2b2d90ab934\" (UID: \"b1b31587-9975-4a79-9b24-a2b2d90ab934\") " Dec 05 12:36:24 crc kubenswrapper[4763]: I1205 12:36:24.025812 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1b31587-9975-4a79-9b24-a2b2d90ab934-utilities" (OuterVolumeSpecName: "utilities") pod "b1b31587-9975-4a79-9b24-a2b2d90ab934" (UID: "b1b31587-9975-4a79-9b24-a2b2d90ab934"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:36:24 crc kubenswrapper[4763]: I1205 12:36:24.054231 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1b31587-9975-4a79-9b24-a2b2d90ab934-kube-api-access-69cgx" (OuterVolumeSpecName: "kube-api-access-69cgx") pod "b1b31587-9975-4a79-9b24-a2b2d90ab934" (UID: "b1b31587-9975-4a79-9b24-a2b2d90ab934"). InnerVolumeSpecName "kube-api-access-69cgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:36:24 crc kubenswrapper[4763]: I1205 12:36:24.120614 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1b31587-9975-4a79-9b24-a2b2d90ab934-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b1b31587-9975-4a79-9b24-a2b2d90ab934" (UID: "b1b31587-9975-4a79-9b24-a2b2d90ab934"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:36:24 crc kubenswrapper[4763]: I1205 12:36:24.127513 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69cgx\" (UniqueName: \"kubernetes.io/projected/b1b31587-9975-4a79-9b24-a2b2d90ab934-kube-api-access-69cgx\") on node \"crc\" DevicePath \"\"" Dec 05 12:36:24 crc kubenswrapper[4763]: I1205 12:36:24.127553 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1b31587-9975-4a79-9b24-a2b2d90ab934-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 12:36:24 crc kubenswrapper[4763]: I1205 12:36:24.127565 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1b31587-9975-4a79-9b24-a2b2d90ab934-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 12:36:24 crc kubenswrapper[4763]: I1205 12:36:24.411031 4763 generic.go:334] "Generic (PLEG): container finished" podID="b1b31587-9975-4a79-9b24-a2b2d90ab934" containerID="7df041547666b389fdc07b0200e5bd0b2c3868b1a7b35a64eaf36c46ddf6a909" exitCode=0 Dec 05 12:36:24 crc kubenswrapper[4763]: I1205 12:36:24.411142 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x9npb" event={"ID":"b1b31587-9975-4a79-9b24-a2b2d90ab934","Type":"ContainerDied","Data":"7df041547666b389fdc07b0200e5bd0b2c3868b1a7b35a64eaf36c46ddf6a909"} Dec 05 12:36:24 crc kubenswrapper[4763]: I1205 12:36:24.411306 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x9npb" event={"ID":"b1b31587-9975-4a79-9b24-a2b2d90ab934","Type":"ContainerDied","Data":"7ff72848fe588ad04902334088bddb05d23749403d0c7a75872c0b494d8d050d"} Dec 05 12:36:24 crc kubenswrapper[4763]: I1205 12:36:24.411356 4763 scope.go:117] "RemoveContainer" containerID="7df041547666b389fdc07b0200e5bd0b2c3868b1a7b35a64eaf36c46ddf6a909" Dec 05 12:36:24 crc kubenswrapper[4763]: I1205 12:36:24.411544 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x9npb" Dec 05 12:36:24 crc kubenswrapper[4763]: I1205 12:36:24.446198 4763 scope.go:117] "RemoveContainer" containerID="5f7432ca4210a87136c11ad2d8a113d848ec7fecc545e42531d925be6d52d75c" Dec 05 12:36:24 crc kubenswrapper[4763]: I1205 12:36:24.457387 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x9npb"] Dec 05 12:36:24 crc kubenswrapper[4763]: I1205 12:36:24.467082 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-x9npb"] Dec 05 12:36:24 crc kubenswrapper[4763]: I1205 12:36:24.475144 4763 scope.go:117] "RemoveContainer" containerID="632de4c0d7a07948c57f868ada79979318ee567925d4cec9c53e8b2b22bf2783" Dec 05 12:36:24 crc kubenswrapper[4763]: I1205 12:36:24.527119 4763 scope.go:117] "RemoveContainer" containerID="7df041547666b389fdc07b0200e5bd0b2c3868b1a7b35a64eaf36c46ddf6a909" Dec 05 12:36:24 crc kubenswrapper[4763]: E1205 12:36:24.529469 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7df041547666b389fdc07b0200e5bd0b2c3868b1a7b35a64eaf36c46ddf6a909\": container with ID starting with 7df041547666b389fdc07b0200e5bd0b2c3868b1a7b35a64eaf36c46ddf6a909 not found: ID does not exist" containerID="7df041547666b389fdc07b0200e5bd0b2c3868b1a7b35a64eaf36c46ddf6a909" Dec 05 12:36:24 crc kubenswrapper[4763]: I1205 12:36:24.529515 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7df041547666b389fdc07b0200e5bd0b2c3868b1a7b35a64eaf36c46ddf6a909"} err="failed to get container status \"7df041547666b389fdc07b0200e5bd0b2c3868b1a7b35a64eaf36c46ddf6a909\": rpc error: code = NotFound desc = could not find container \"7df041547666b389fdc07b0200e5bd0b2c3868b1a7b35a64eaf36c46ddf6a909\": container with ID starting with 7df041547666b389fdc07b0200e5bd0b2c3868b1a7b35a64eaf36c46ddf6a909 not found: ID does not exist" Dec 05 12:36:24 crc kubenswrapper[4763]: I1205 12:36:24.529562 4763 scope.go:117] "RemoveContainer" containerID="5f7432ca4210a87136c11ad2d8a113d848ec7fecc545e42531d925be6d52d75c" Dec 05 12:36:24 crc kubenswrapper[4763]: E1205 12:36:24.532427 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f7432ca4210a87136c11ad2d8a113d848ec7fecc545e42531d925be6d52d75c\": container with ID starting with 5f7432ca4210a87136c11ad2d8a113d848ec7fecc545e42531d925be6d52d75c not found: ID does not exist" containerID="5f7432ca4210a87136c11ad2d8a113d848ec7fecc545e42531d925be6d52d75c" Dec 05 12:36:24 crc kubenswrapper[4763]: I1205 12:36:24.532454 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f7432ca4210a87136c11ad2d8a113d848ec7fecc545e42531d925be6d52d75c"} err="failed to get container status \"5f7432ca4210a87136c11ad2d8a113d848ec7fecc545e42531d925be6d52d75c\": rpc error: code = NotFound desc = could not find container \"5f7432ca4210a87136c11ad2d8a113d848ec7fecc545e42531d925be6d52d75c\": container with ID starting with 5f7432ca4210a87136c11ad2d8a113d848ec7fecc545e42531d925be6d52d75c not found: ID does not exist" Dec 05 12:36:24 crc kubenswrapper[4763]: I1205 12:36:24.532469 4763 scope.go:117] "RemoveContainer" containerID="632de4c0d7a07948c57f868ada79979318ee567925d4cec9c53e8b2b22bf2783" Dec 05 12:36:24 crc kubenswrapper[4763]: E1205 12:36:24.532869 4763 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"632de4c0d7a07948c57f868ada79979318ee567925d4cec9c53e8b2b22bf2783\": container with ID starting with 632de4c0d7a07948c57f868ada79979318ee567925d4cec9c53e8b2b22bf2783 not found: ID does not exist" containerID="632de4c0d7a07948c57f868ada79979318ee567925d4cec9c53e8b2b22bf2783" Dec 05 12:36:24 crc kubenswrapper[4763]: I1205 12:36:24.532906 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"632de4c0d7a07948c57f868ada79979318ee567925d4cec9c53e8b2b22bf2783"} err="failed to get container status \"632de4c0d7a07948c57f868ada79979318ee567925d4cec9c53e8b2b22bf2783\": rpc error: code = NotFound desc = could not find container \"632de4c0d7a07948c57f868ada79979318ee567925d4cec9c53e8b2b22bf2783\": container with ID starting with 632de4c0d7a07948c57f868ada79979318ee567925d4cec9c53e8b2b22bf2783 not found: ID does not exist" Dec 05 12:36:25 crc kubenswrapper[4763]: I1205 12:36:25.424790 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nd6zn" podUID="85e7ced8-bd66-4bf4-9550-a61fc6c3af14" containerName="registry-server" containerID="cri-o://09513ec62ca1f92d35af9a125a4cffd609de3f5b37709bd617d9d27815802445" gracePeriod=2 Dec 05 12:36:25 crc kubenswrapper[4763]: I1205 12:36:25.810918 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1b31587-9975-4a79-9b24-a2b2d90ab934" path="/var/lib/kubelet/pods/b1b31587-9975-4a79-9b24-a2b2d90ab934/volumes" Dec 05 12:36:25 crc kubenswrapper[4763]: I1205 12:36:25.948996 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nd6zn" Dec 05 12:36:26 crc kubenswrapper[4763]: I1205 12:36:26.067743 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hcbg\" (UniqueName: \"kubernetes.io/projected/85e7ced8-bd66-4bf4-9550-a61fc6c3af14-kube-api-access-5hcbg\") pod \"85e7ced8-bd66-4bf4-9550-a61fc6c3af14\" (UID: \"85e7ced8-bd66-4bf4-9550-a61fc6c3af14\") " Dec 05 12:36:26 crc kubenswrapper[4763]: I1205 12:36:26.068064 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85e7ced8-bd66-4bf4-9550-a61fc6c3af14-utilities\") pod \"85e7ced8-bd66-4bf4-9550-a61fc6c3af14\" (UID: \"85e7ced8-bd66-4bf4-9550-a61fc6c3af14\") " Dec 05 12:36:26 crc kubenswrapper[4763]: I1205 12:36:26.068282 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85e7ced8-bd66-4bf4-9550-a61fc6c3af14-catalog-content\") pod \"85e7ced8-bd66-4bf4-9550-a61fc6c3af14\" (UID: \"85e7ced8-bd66-4bf4-9550-a61fc6c3af14\") " Dec 05 12:36:26 crc kubenswrapper[4763]: I1205 12:36:26.068990 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85e7ced8-bd66-4bf4-9550-a61fc6c3af14-utilities" (OuterVolumeSpecName: "utilities") pod "85e7ced8-bd66-4bf4-9550-a61fc6c3af14" (UID: "85e7ced8-bd66-4bf4-9550-a61fc6c3af14"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:36:26 crc kubenswrapper[4763]: I1205 12:36:26.073636 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85e7ced8-bd66-4bf4-9550-a61fc6c3af14-kube-api-access-5hcbg" (OuterVolumeSpecName: "kube-api-access-5hcbg") pod "85e7ced8-bd66-4bf4-9550-a61fc6c3af14" (UID: "85e7ced8-bd66-4bf4-9550-a61fc6c3af14"). InnerVolumeSpecName "kube-api-access-5hcbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:36:26 crc kubenswrapper[4763]: I1205 12:36:26.113072 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85e7ced8-bd66-4bf4-9550-a61fc6c3af14-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "85e7ced8-bd66-4bf4-9550-a61fc6c3af14" (UID: "85e7ced8-bd66-4bf4-9550-a61fc6c3af14"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:36:26 crc kubenswrapper[4763]: I1205 12:36:26.170327 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hcbg\" (UniqueName: \"kubernetes.io/projected/85e7ced8-bd66-4bf4-9550-a61fc6c3af14-kube-api-access-5hcbg\") on node \"crc\" DevicePath \"\"" Dec 05 12:36:26 crc kubenswrapper[4763]: I1205 12:36:26.170361 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85e7ced8-bd66-4bf4-9550-a61fc6c3af14-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 12:36:26 crc kubenswrapper[4763]: I1205 12:36:26.170385 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85e7ced8-bd66-4bf4-9550-a61fc6c3af14-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 12:36:26 crc kubenswrapper[4763]: I1205 12:36:26.439219 4763 generic.go:334] "Generic (PLEG): container finished" podID="85e7ced8-bd66-4bf4-9550-a61fc6c3af14" containerID="09513ec62ca1f92d35af9a125a4cffd609de3f5b37709bd617d9d27815802445" exitCode=0 Dec 05 12:36:26 crc kubenswrapper[4763]: I1205 12:36:26.439292 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nd6zn" event={"ID":"85e7ced8-bd66-4bf4-9550-a61fc6c3af14","Type":"ContainerDied","Data":"09513ec62ca1f92d35af9a125a4cffd609de3f5b37709bd617d9d27815802445"} Dec 05 12:36:26 crc kubenswrapper[4763]: I1205 12:36:26.439327 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nd6zn" Dec 05 12:36:26 crc kubenswrapper[4763]: I1205 12:36:26.439358 4763 scope.go:117] "RemoveContainer" containerID="09513ec62ca1f92d35af9a125a4cffd609de3f5b37709bd617d9d27815802445" Dec 05 12:36:26 crc kubenswrapper[4763]: I1205 12:36:26.439337 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nd6zn" event={"ID":"85e7ced8-bd66-4bf4-9550-a61fc6c3af14","Type":"ContainerDied","Data":"225a979b485c672586bbd6de5a32ea1433fb50539701a2f22b69a332440b2cb6"} Dec 05 12:36:26 crc kubenswrapper[4763]: I1205 12:36:26.463672 4763 scope.go:117] "RemoveContainer" containerID="9dff2e951c3861f865214d5bc0c2e123beffac07c60ec2f5ed29fa9df57e5aa9" Dec 05 12:36:26 crc kubenswrapper[4763]: I1205 12:36:26.489408 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nd6zn"] Dec 05 12:36:26 crc kubenswrapper[4763]: I1205 12:36:26.499976 4763 scope.go:117] "RemoveContainer" containerID="001c58edb47b4f335dfb3909959a90738016e45e431dee7675a3b9156fcd4edf" Dec 05 12:36:26 crc kubenswrapper[4763]: I1205 12:36:26.502247 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nd6zn"] Dec 05 12:36:26 crc kubenswrapper[4763]: I1205 12:36:26.542291 4763 scope.go:117] "RemoveContainer" containerID="09513ec62ca1f92d35af9a125a4cffd609de3f5b37709bd617d9d27815802445" Dec 05 12:36:26 crc kubenswrapper[4763]: E1205 12:36:26.542944 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09513ec62ca1f92d35af9a125a4cffd609de3f5b37709bd617d9d27815802445\": container with ID starting with 09513ec62ca1f92d35af9a125a4cffd609de3f5b37709bd617d9d27815802445 not found: ID does not exist" containerID="09513ec62ca1f92d35af9a125a4cffd609de3f5b37709bd617d9d27815802445" Dec 05 12:36:26 crc kubenswrapper[4763]: I1205 12:36:26.542984 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09513ec62ca1f92d35af9a125a4cffd609de3f5b37709bd617d9d27815802445"} err="failed to get container status \"09513ec62ca1f92d35af9a125a4cffd609de3f5b37709bd617d9d27815802445\": rpc error: code = NotFound desc = could not find container \"09513ec62ca1f92d35af9a125a4cffd609de3f5b37709bd617d9d27815802445\": container with ID starting with 09513ec62ca1f92d35af9a125a4cffd609de3f5b37709bd617d9d27815802445 not found: ID does not exist" Dec 05 12:36:26 crc kubenswrapper[4763]: I1205 12:36:26.543014 4763 scope.go:117] "RemoveContainer" containerID="9dff2e951c3861f865214d5bc0c2e123beffac07c60ec2f5ed29fa9df57e5aa9" Dec 05 12:36:26 crc kubenswrapper[4763]: E1205 12:36:26.543325 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dff2e951c3861f865214d5bc0c2e123beffac07c60ec2f5ed29fa9df57e5aa9\": container with ID starting with 9dff2e951c3861f865214d5bc0c2e123beffac07c60ec2f5ed29fa9df57e5aa9 not found: ID does not exist" containerID="9dff2e951c3861f865214d5bc0c2e123beffac07c60ec2f5ed29fa9df57e5aa9" Dec 05 12:36:26 crc kubenswrapper[4763]: I1205 12:36:26.543376 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dff2e951c3861f865214d5bc0c2e123beffac07c60ec2f5ed29fa9df57e5aa9"} err="failed to get container status \"9dff2e951c3861f865214d5bc0c2e123beffac07c60ec2f5ed29fa9df57e5aa9\": rpc error: code = NotFound desc = could not find 
container \"9dff2e951c3861f865214d5bc0c2e123beffac07c60ec2f5ed29fa9df57e5aa9\": container with ID starting with 9dff2e951c3861f865214d5bc0c2e123beffac07c60ec2f5ed29fa9df57e5aa9 not found: ID does not exist" Dec 05 12:36:26 crc kubenswrapper[4763]: I1205 12:36:26.543408 4763 scope.go:117] "RemoveContainer" containerID="001c58edb47b4f335dfb3909959a90738016e45e431dee7675a3b9156fcd4edf" Dec 05 12:36:26 crc kubenswrapper[4763]: E1205 12:36:26.543737 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"001c58edb47b4f335dfb3909959a90738016e45e431dee7675a3b9156fcd4edf\": container with ID starting with 001c58edb47b4f335dfb3909959a90738016e45e431dee7675a3b9156fcd4edf not found: ID does not exist" containerID="001c58edb47b4f335dfb3909959a90738016e45e431dee7675a3b9156fcd4edf" Dec 05 12:36:26 crc kubenswrapper[4763]: I1205 12:36:26.543810 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"001c58edb47b4f335dfb3909959a90738016e45e431dee7675a3b9156fcd4edf"} err="failed to get container status \"001c58edb47b4f335dfb3909959a90738016e45e431dee7675a3b9156fcd4edf\": rpc error: code = NotFound desc = could not find container \"001c58edb47b4f335dfb3909959a90738016e45e431dee7675a3b9156fcd4edf\": container with ID starting with 001c58edb47b4f335dfb3909959a90738016e45e431dee7675a3b9156fcd4edf not found: ID does not exist" Dec 05 12:36:27 crc kubenswrapper[4763]: I1205 12:36:27.795546 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85e7ced8-bd66-4bf4-9550-a61fc6c3af14" path="/var/lib/kubelet/pods/85e7ced8-bd66-4bf4-9550-a61fc6c3af14/volumes" Dec 05 12:37:05 crc kubenswrapper[4763]: I1205 12:37:05.846077 4763 generic.go:334] "Generic (PLEG): container finished" podID="f5d27328-7e5a-4664-9c0a-ae5c063ec8b9" containerID="2cc13987f41329d0a06183bf9120483f2880d42ef2568b1217deb0099ae757e1" exitCode=0 Dec 05 12:37:05 crc kubenswrapper[4763]: I1205 12:37:05.846171 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm" event={"ID":"f5d27328-7e5a-4664-9c0a-ae5c063ec8b9","Type":"ContainerDied","Data":"2cc13987f41329d0a06183bf9120483f2880d42ef2568b1217deb0099ae757e1"} Dec 05 12:37:07 crc kubenswrapper[4763]: I1205 12:37:07.284936 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm" Dec 05 12:37:07 crc kubenswrapper[4763]: I1205 12:37:07.465863 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-inventory\") pod \"f5d27328-7e5a-4664-9c0a-ae5c063ec8b9\" (UID: \"f5d27328-7e5a-4664-9c0a-ae5c063ec8b9\") " Dec 05 12:37:07 crc kubenswrapper[4763]: I1205 12:37:07.465951 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-ceilometer-compute-config-data-1\") pod \"f5d27328-7e5a-4664-9c0a-ae5c063ec8b9\" (UID: \"f5d27328-7e5a-4664-9c0a-ae5c063ec8b9\") " Dec 05 12:37:07 crc kubenswrapper[4763]: I1205 12:37:07.466120 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-ceilometer-compute-config-data-0\") pod \"f5d27328-7e5a-4664-9c0a-ae5c063ec8b9\" (UID: \"f5d27328-7e5a-4664-9c0a-ae5c063ec8b9\") " Dec 05 12:37:07 crc kubenswrapper[4763]: I1205 12:37:07.466317 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-ssh-key\") pod \"f5d27328-7e5a-4664-9c0a-ae5c063ec8b9\" (UID: \"f5d27328-7e5a-4664-9c0a-ae5c063ec8b9\") " Dec 05 12:37:07 crc kubenswrapper[4763]: I1205 12:37:07.467280 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-telemetry-combined-ca-bundle\") pod \"f5d27328-7e5a-4664-9c0a-ae5c063ec8b9\" (UID: \"f5d27328-7e5a-4664-9c0a-ae5c063ec8b9\") " Dec 05 12:37:07 crc kubenswrapper[4763]: I1205 12:37:07.467316 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-ceilometer-compute-config-data-2\") pod \"f5d27328-7e5a-4664-9c0a-ae5c063ec8b9\" (UID: \"f5d27328-7e5a-4664-9c0a-ae5c063ec8b9\") " Dec 05 12:37:07 crc kubenswrapper[4763]: I1205 12:37:07.467352 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmwtb\" (UniqueName: \"kubernetes.io/projected/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-kube-api-access-rmwtb\") pod \"f5d27328-7e5a-4664-9c0a-ae5c063ec8b9\" (UID: \"f5d27328-7e5a-4664-9c0a-ae5c063ec8b9\") " Dec 05 12:37:07 crc kubenswrapper[4763]: I1205 12:37:07.473447 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-kube-api-access-rmwtb" (OuterVolumeSpecName: "kube-api-access-rmwtb") pod "f5d27328-7e5a-4664-9c0a-ae5c063ec8b9" (UID: "f5d27328-7e5a-4664-9c0a-ae5c063ec8b9"). InnerVolumeSpecName "kube-api-access-rmwtb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:37:07 crc kubenswrapper[4763]: I1205 12:37:07.473959 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "f5d27328-7e5a-4664-9c0a-ae5c063ec8b9" (UID: "f5d27328-7e5a-4664-9c0a-ae5c063ec8b9"). 
InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:37:07 crc kubenswrapper[4763]: I1205 12:37:07.507946 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-inventory" (OuterVolumeSpecName: "inventory") pod "f5d27328-7e5a-4664-9c0a-ae5c063ec8b9" (UID: "f5d27328-7e5a-4664-9c0a-ae5c063ec8b9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:37:07 crc kubenswrapper[4763]: I1205 12:37:07.510538 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f5d27328-7e5a-4664-9c0a-ae5c063ec8b9" (UID: "f5d27328-7e5a-4664-9c0a-ae5c063ec8b9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:37:07 crc kubenswrapper[4763]: I1205 12:37:07.513483 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "f5d27328-7e5a-4664-9c0a-ae5c063ec8b9" (UID: "f5d27328-7e5a-4664-9c0a-ae5c063ec8b9"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:37:07 crc kubenswrapper[4763]: I1205 12:37:07.516678 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "f5d27328-7e5a-4664-9c0a-ae5c063ec8b9" (UID: "f5d27328-7e5a-4664-9c0a-ae5c063ec8b9"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:37:07 crc kubenswrapper[4763]: I1205 12:37:07.529038 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "f5d27328-7e5a-4664-9c0a-ae5c063ec8b9" (UID: "f5d27328-7e5a-4664-9c0a-ae5c063ec8b9"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:37:07 crc kubenswrapper[4763]: I1205 12:37:07.576512 4763 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 05 12:37:07 crc kubenswrapper[4763]: I1205 12:37:07.576545 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 12:37:07 crc kubenswrapper[4763]: I1205 12:37:07.576559 4763 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:37:07 crc kubenswrapper[4763]: I1205 12:37:07.576574 4763 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 05 12:37:07 crc kubenswrapper[4763]: I1205 12:37:07.576594 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmwtb\" (UniqueName: \"kubernetes.io/projected/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-kube-api-access-rmwtb\") on node \"crc\" DevicePath \"\"" Dec 05 12:37:07 crc kubenswrapper[4763]: I1205 12:37:07.576608 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 12:37:07 crc kubenswrapper[4763]: I1205 12:37:07.576619 4763 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f5d27328-7e5a-4664-9c0a-ae5c063ec8b9-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 05 12:37:07 crc kubenswrapper[4763]: I1205 12:37:07.871347 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm" event={"ID":"f5d27328-7e5a-4664-9c0a-ae5c063ec8b9","Type":"ContainerDied","Data":"4f0112ba882282be8ec2f318df25f8124928d8d7e0533cadfdce491f18079f6e"} Dec 05 12:37:07 crc kubenswrapper[4763]: I1205 12:37:07.871392 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f0112ba882282be8ec2f318df25f8124928d8d7e0533cadfdce491f18079f6e" Dec 05 12:37:07 crc kubenswrapper[4763]: I1205 12:37:07.871407 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm" Dec 05 12:37:37 crc kubenswrapper[4763]: I1205 12:37:37.543917 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 12:37:37 crc kubenswrapper[4763]: I1205 12:37:37.544565 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 12:37:47 crc kubenswrapper[4763]: I1205 12:37:47.542003 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 05 12:37:47 crc kubenswrapper[4763]: I1205 12:37:47.543498 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="654b7fdf-0324-40f6-8681-3ba17e042d60" containerName="prometheus" containerID="cri-o://dafb67c04e6ebac499400ff6e12b412be2b8a6c4f5e05694e2cbf6de669439e4" gracePeriod=600 Dec 05 12:37:47 crc kubenswrapper[4763]: I1205 12:37:47.543805 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="654b7fdf-0324-40f6-8681-3ba17e042d60" containerName="thanos-sidecar" containerID="cri-o://5808e5f920bc370b57aa104546b0afa329ef2b7c0fb5167d9b6493f1384a4c45" gracePeriod=600 Dec 05 12:37:47 crc kubenswrapper[4763]: I1205 12:37:47.544061 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="654b7fdf-0324-40f6-8681-3ba17e042d60" containerName="config-reloader" containerID="cri-o://672c3492edc17d9cf373e59122f1df173d5e7f05adb36366522e24aadaf70f9d" gracePeriod=600 Dec 05 12:37:48 crc kubenswrapper[4763]: I1205 12:37:48.368810 4763 generic.go:334] "Generic (PLEG): container finished" podID="654b7fdf-0324-40f6-8681-3ba17e042d60" containerID="5808e5f920bc370b57aa104546b0afa329ef2b7c0fb5167d9b6493f1384a4c45" exitCode=0 Dec 05 12:37:48 crc kubenswrapper[4763]: I1205 12:37:48.368850 4763 generic.go:334] "Generic (PLEG): container finished" podID="654b7fdf-0324-40f6-8681-3ba17e042d60" containerID="dafb67c04e6ebac499400ff6e12b412be2b8a6c4f5e05694e2cbf6de669439e4" exitCode=0 Dec 05 12:37:48 crc kubenswrapper[4763]: I1205 12:37:48.368872 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"654b7fdf-0324-40f6-8681-3ba17e042d60","Type":"ContainerDied","Data":"5808e5f920bc370b57aa104546b0afa329ef2b7c0fb5167d9b6493f1384a4c45"} Dec 05 12:37:48 crc kubenswrapper[4763]: I1205 12:37:48.368897 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"654b7fdf-0324-40f6-8681-3ba17e042d60","Type":"ContainerDied","Data":"dafb67c04e6ebac499400ff6e12b412be2b8a6c4f5e05694e2cbf6de669439e4"} Dec 05 12:37:48 crc kubenswrapper[4763]: I1205 12:37:48.788631 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="654b7fdf-0324-40f6-8681-3ba17e042d60" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.141:9090/-/ready\": dial tcp 
10.217.0.141:9090: connect: connection refused" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.370150 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.382536 4763 generic.go:334] "Generic (PLEG): container finished" podID="654b7fdf-0324-40f6-8681-3ba17e042d60" containerID="672c3492edc17d9cf373e59122f1df173d5e7f05adb36366522e24aadaf70f9d" exitCode=0 Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.382626 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"654b7fdf-0324-40f6-8681-3ba17e042d60","Type":"ContainerDied","Data":"672c3492edc17d9cf373e59122f1df173d5e7f05adb36366522e24aadaf70f9d"} Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.382711 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"654b7fdf-0324-40f6-8681-3ba17e042d60","Type":"ContainerDied","Data":"005dd38f7ddeeb35b9da5b02caeccd38aae1a6d352b5db4e57ce832a3343fce8"} Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.382735 4763 scope.go:117] "RemoveContainer" containerID="5808e5f920bc370b57aa104546b0afa329ef2b7c0fb5167d9b6493f1384a4c45" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.382976 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.417852 4763 scope.go:117] "RemoveContainer" containerID="672c3492edc17d9cf373e59122f1df173d5e7f05adb36366522e24aadaf70f9d" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.446295 4763 scope.go:117] "RemoveContainer" containerID="dafb67c04e6ebac499400ff6e12b412be2b8a6c4f5e05694e2cbf6de669439e4" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.482703 4763 scope.go:117] "RemoveContainer" containerID="0b832275b7e0da18c9463bf76cf4ceae0e1a65ceb2d77cae3dc089a42420ee21" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.501676 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ff5f488f-885a-43ee-9a04-43ba44d78789\") pod \"654b7fdf-0324-40f6-8681-3ba17e042d60\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.501856 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/654b7fdf-0324-40f6-8681-3ba17e042d60-config\") pod \"654b7fdf-0324-40f6-8681-3ba17e042d60\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.501944 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/654b7fdf-0324-40f6-8681-3ba17e042d60-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"654b7fdf-0324-40f6-8681-3ba17e042d60\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.502011 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/654b7fdf-0324-40f6-8681-3ba17e042d60-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod 
\"654b7fdf-0324-40f6-8681-3ba17e042d60\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.502074 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/654b7fdf-0324-40f6-8681-3ba17e042d60-config-out\") pod \"654b7fdf-0324-40f6-8681-3ba17e042d60\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.502108 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/654b7fdf-0324-40f6-8681-3ba17e042d60-thanos-prometheus-http-client-file\") pod \"654b7fdf-0324-40f6-8681-3ba17e042d60\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.502169 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/654b7fdf-0324-40f6-8681-3ba17e042d60-secret-combined-ca-bundle\") pod \"654b7fdf-0324-40f6-8681-3ba17e042d60\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.502201 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/654b7fdf-0324-40f6-8681-3ba17e042d60-web-config\") pod \"654b7fdf-0324-40f6-8681-3ba17e042d60\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.502323 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/654b7fdf-0324-40f6-8681-3ba17e042d60-tls-assets\") pod \"654b7fdf-0324-40f6-8681-3ba17e042d60\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.502403 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmxpt\" (UniqueName: \"kubernetes.io/projected/654b7fdf-0324-40f6-8681-3ba17e042d60-kube-api-access-zmxpt\") pod \"654b7fdf-0324-40f6-8681-3ba17e042d60\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.502505 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/654b7fdf-0324-40f6-8681-3ba17e042d60-prometheus-metric-storage-rulefiles-0\") pod \"654b7fdf-0324-40f6-8681-3ba17e042d60\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.503691 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/654b7fdf-0324-40f6-8681-3ba17e042d60-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "654b7fdf-0324-40f6-8681-3ba17e042d60" (UID: "654b7fdf-0324-40f6-8681-3ba17e042d60"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.512518 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/654b7fdf-0324-40f6-8681-3ba17e042d60-config-out" (OuterVolumeSpecName: "config-out") pod "654b7fdf-0324-40f6-8681-3ba17e042d60" (UID: "654b7fdf-0324-40f6-8681-3ba17e042d60"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.512565 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/654b7fdf-0324-40f6-8681-3ba17e042d60-kube-api-access-zmxpt" (OuterVolumeSpecName: "kube-api-access-zmxpt") pod "654b7fdf-0324-40f6-8681-3ba17e042d60" (UID: "654b7fdf-0324-40f6-8681-3ba17e042d60"). InnerVolumeSpecName "kube-api-access-zmxpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.516000 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/654b7fdf-0324-40f6-8681-3ba17e042d60-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "654b7fdf-0324-40f6-8681-3ba17e042d60" (UID: "654b7fdf-0324-40f6-8681-3ba17e042d60"). InnerVolumeSpecName "secret-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.517672 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/654b7fdf-0324-40f6-8681-3ba17e042d60-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "654b7fdf-0324-40f6-8681-3ba17e042d60" (UID: "654b7fdf-0324-40f6-8681-3ba17e042d60"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.518739 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/654b7fdf-0324-40f6-8681-3ba17e042d60-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "654b7fdf-0324-40f6-8681-3ba17e042d60" (UID: "654b7fdf-0324-40f6-8681-3ba17e042d60"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.518937 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/654b7fdf-0324-40f6-8681-3ba17e042d60-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "654b7fdf-0324-40f6-8681-3ba17e042d60" (UID: "654b7fdf-0324-40f6-8681-3ba17e042d60"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.519601 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/654b7fdf-0324-40f6-8681-3ba17e042d60-config" (OuterVolumeSpecName: "config") pod "654b7fdf-0324-40f6-8681-3ba17e042d60" (UID: "654b7fdf-0324-40f6-8681-3ba17e042d60"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.519642 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/654b7fdf-0324-40f6-8681-3ba17e042d60-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "654b7fdf-0324-40f6-8681-3ba17e042d60" (UID: "654b7fdf-0324-40f6-8681-3ba17e042d60"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.532900 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ff5f488f-885a-43ee-9a04-43ba44d78789" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "654b7fdf-0324-40f6-8681-3ba17e042d60" (UID: "654b7fdf-0324-40f6-8681-3ba17e042d60"). InnerVolumeSpecName "pvc-ff5f488f-885a-43ee-9a04-43ba44d78789". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.535645 4763 scope.go:117] "RemoveContainer" containerID="5808e5f920bc370b57aa104546b0afa329ef2b7c0fb5167d9b6493f1384a4c45" Dec 05 12:37:49 crc kubenswrapper[4763]: E1205 12:37:49.535986 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5808e5f920bc370b57aa104546b0afa329ef2b7c0fb5167d9b6493f1384a4c45\": container with ID starting with 5808e5f920bc370b57aa104546b0afa329ef2b7c0fb5167d9b6493f1384a4c45 not found: ID does not exist" containerID="5808e5f920bc370b57aa104546b0afa329ef2b7c0fb5167d9b6493f1384a4c45" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.536015 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5808e5f920bc370b57aa104546b0afa329ef2b7c0fb5167d9b6493f1384a4c45"} err="failed to get container status \"5808e5f920bc370b57aa104546b0afa329ef2b7c0fb5167d9b6493f1384a4c45\": rpc error: code = NotFound desc = could not find container \"5808e5f920bc370b57aa104546b0afa329ef2b7c0fb5167d9b6493f1384a4c45\": container with ID starting with 5808e5f920bc370b57aa104546b0afa329ef2b7c0fb5167d9b6493f1384a4c45 not found: ID does not exist" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.536035 4763 scope.go:117] "RemoveContainer" containerID="672c3492edc17d9cf373e59122f1df173d5e7f05adb36366522e24aadaf70f9d" Dec 05 12:37:49 crc kubenswrapper[4763]: E1205 12:37:49.536233 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"672c3492edc17d9cf373e59122f1df173d5e7f05adb36366522e24aadaf70f9d\": container with ID starting with 672c3492edc17d9cf373e59122f1df173d5e7f05adb36366522e24aadaf70f9d not found: ID does not exist" containerID="672c3492edc17d9cf373e59122f1df173d5e7f05adb36366522e24aadaf70f9d" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.536255 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"672c3492edc17d9cf373e59122f1df173d5e7f05adb36366522e24aadaf70f9d"} err="failed to get container status \"672c3492edc17d9cf373e59122f1df173d5e7f05adb36366522e24aadaf70f9d\": rpc error: code = NotFound desc = could not find container \"672c3492edc17d9cf373e59122f1df173d5e7f05adb36366522e24aadaf70f9d\": container with ID starting with 672c3492edc17d9cf373e59122f1df173d5e7f05adb36366522e24aadaf70f9d not found: ID does not exist" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.536269 4763 scope.go:117] "RemoveContainer" containerID="dafb67c04e6ebac499400ff6e12b412be2b8a6c4f5e05694e2cbf6de669439e4" Dec 05 12:37:49 crc kubenswrapper[4763]: E1205 12:37:49.536469 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dafb67c04e6ebac499400ff6e12b412be2b8a6c4f5e05694e2cbf6de669439e4\": container with ID starting with dafb67c04e6ebac499400ff6e12b412be2b8a6c4f5e05694e2cbf6de669439e4 
not found: ID does not exist" containerID="dafb67c04e6ebac499400ff6e12b412be2b8a6c4f5e05694e2cbf6de669439e4" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.536495 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dafb67c04e6ebac499400ff6e12b412be2b8a6c4f5e05694e2cbf6de669439e4"} err="failed to get container status \"dafb67c04e6ebac499400ff6e12b412be2b8a6c4f5e05694e2cbf6de669439e4\": rpc error: code = NotFound desc = could not find container \"dafb67c04e6ebac499400ff6e12b412be2b8a6c4f5e05694e2cbf6de669439e4\": container with ID starting with dafb67c04e6ebac499400ff6e12b412be2b8a6c4f5e05694e2cbf6de669439e4 not found: ID does not exist" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.536512 4763 scope.go:117] "RemoveContainer" containerID="0b832275b7e0da18c9463bf76cf4ceae0e1a65ceb2d77cae3dc089a42420ee21" Dec 05 12:37:49 crc kubenswrapper[4763]: E1205 12:37:49.536705 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b832275b7e0da18c9463bf76cf4ceae0e1a65ceb2d77cae3dc089a42420ee21\": container with ID starting with 0b832275b7e0da18c9463bf76cf4ceae0e1a65ceb2d77cae3dc089a42420ee21 not found: ID does not exist" containerID="0b832275b7e0da18c9463bf76cf4ceae0e1a65ceb2d77cae3dc089a42420ee21" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.536742 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b832275b7e0da18c9463bf76cf4ceae0e1a65ceb2d77cae3dc089a42420ee21"} err="failed to get container status \"0b832275b7e0da18c9463bf76cf4ceae0e1a65ceb2d77cae3dc089a42420ee21\": rpc error: code = NotFound desc = could not find container \"0b832275b7e0da18c9463bf76cf4ceae0e1a65ceb2d77cae3dc089a42420ee21\": container with ID starting with 0b832275b7e0da18c9463bf76cf4ceae0e1a65ceb2d77cae3dc089a42420ee21 not found: ID does not exist" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.604304 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/654b7fdf-0324-40f6-8681-3ba17e042d60-web-config" (OuterVolumeSpecName: "web-config") pod "654b7fdf-0324-40f6-8681-3ba17e042d60" (UID: "654b7fdf-0324-40f6-8681-3ba17e042d60"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.604664 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/654b7fdf-0324-40f6-8681-3ba17e042d60-web-config\") pod \"654b7fdf-0324-40f6-8681-3ba17e042d60\" (UID: \"654b7fdf-0324-40f6-8681-3ba17e042d60\") " Dec 05 12:37:49 crc kubenswrapper[4763]: W1205 12:37:49.604902 4763 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/654b7fdf-0324-40f6-8681-3ba17e042d60/volumes/kubernetes.io~secret/web-config Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.604921 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/654b7fdf-0324-40f6-8681-3ba17e042d60-web-config" (OuterVolumeSpecName: "web-config") pod "654b7fdf-0324-40f6-8681-3ba17e042d60" (UID: "654b7fdf-0324-40f6-8681-3ba17e042d60"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.605274 4763 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/654b7fdf-0324-40f6-8681-3ba17e042d60-tls-assets\") on node \"crc\" DevicePath \"\"" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.605306 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmxpt\" (UniqueName: \"kubernetes.io/projected/654b7fdf-0324-40f6-8681-3ba17e042d60-kube-api-access-zmxpt\") on node \"crc\" DevicePath \"\"" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.605321 4763 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/654b7fdf-0324-40f6-8681-3ba17e042d60-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.605355 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-ff5f488f-885a-43ee-9a04-43ba44d78789\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ff5f488f-885a-43ee-9a04-43ba44d78789\") on node \"crc\" " Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.605370 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/654b7fdf-0324-40f6-8681-3ba17e042d60-config\") on node \"crc\" DevicePath \"\"" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.605384 4763 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/654b7fdf-0324-40f6-8681-3ba17e042d60-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.605397 4763 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/654b7fdf-0324-40f6-8681-3ba17e042d60-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.605413 4763 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/654b7fdf-0324-40f6-8681-3ba17e042d60-config-out\") on node \"crc\" DevicePath \"\"" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.605425 4763 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/654b7fdf-0324-40f6-8681-3ba17e042d60-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.605437 4763 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/654b7fdf-0324-40f6-8681-3ba17e042d60-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.605446 4763 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/654b7fdf-0324-40f6-8681-3ba17e042d60-web-config\") on node \"crc\" DevicePath \"\"" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.631390 4763 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.631607 4763 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-ff5f488f-885a-43ee-9a04-43ba44d78789" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ff5f488f-885a-43ee-9a04-43ba44d78789") on node "crc"
Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.707282 4763 reconciler_common.go:293] "Volume detached for volume \"pvc-ff5f488f-885a-43ee-9a04-43ba44d78789\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ff5f488f-885a-43ee-9a04-43ba44d78789\") on node \"crc\" DevicePath \"\""
Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.725191 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.734844 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.763421 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 05 12:37:49 crc kubenswrapper[4763]: E1205 12:37:49.763864 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1b31587-9975-4a79-9b24-a2b2d90ab934" containerName="extract-utilities"
Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.763885 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1b31587-9975-4a79-9b24-a2b2d90ab934" containerName="extract-utilities"
Dec 05 12:37:49 crc kubenswrapper[4763]: E1205 12:37:49.763904 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="654b7fdf-0324-40f6-8681-3ba17e042d60" containerName="thanos-sidecar"
Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.763911 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="654b7fdf-0324-40f6-8681-3ba17e042d60" containerName="thanos-sidecar"
Dec 05 12:37:49 crc kubenswrapper[4763]: E1205 12:37:49.763923 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e7ced8-bd66-4bf4-9550-a61fc6c3af14" containerName="registry-server"
Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.763929 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e7ced8-bd66-4bf4-9550-a61fc6c3af14" containerName="registry-server"
Dec 05 12:37:49 crc kubenswrapper[4763]: E1205 12:37:49.763942 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="654b7fdf-0324-40f6-8681-3ba17e042d60" containerName="config-reloader"
Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.763949 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="654b7fdf-0324-40f6-8681-3ba17e042d60" containerName="config-reloader"
Dec 05 12:37:49 crc kubenswrapper[4763]: E1205 12:37:49.763969 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e7ced8-bd66-4bf4-9550-a61fc6c3af14" containerName="extract-utilities"
Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.763974 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e7ced8-bd66-4bf4-9550-a61fc6c3af14" containerName="extract-utilities"
Dec 05 12:37:49 crc kubenswrapper[4763]: E1205 12:37:49.763987 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="654b7fdf-0324-40f6-8681-3ba17e042d60" containerName="prometheus"
Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.763995 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="654b7fdf-0324-40f6-8681-3ba17e042d60" containerName="prometheus"
Dec 05 12:37:49 crc kubenswrapper[4763]: E1205 12:37:49.764011 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5d27328-7e5a-4664-9c0a-ae5c063ec8b9" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5d27328-7e5a-4664-9c0a-ae5c063ec8b9" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.764019 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5d27328-7e5a-4664-9c0a-ae5c063ec8b9" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 05 12:37:49 crc kubenswrapper[4763]: E1205 12:37:49.764030 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="654b7fdf-0324-40f6-8681-3ba17e042d60" containerName="init-config-reloader" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.764036 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="654b7fdf-0324-40f6-8681-3ba17e042d60" containerName="init-config-reloader" Dec 05 12:37:49 crc kubenswrapper[4763]: E1205 12:37:49.764045 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1b31587-9975-4a79-9b24-a2b2d90ab934" containerName="registry-server" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.764051 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1b31587-9975-4a79-9b24-a2b2d90ab934" containerName="registry-server" Dec 05 12:37:49 crc kubenswrapper[4763]: E1205 12:37:49.764062 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e7ced8-bd66-4bf4-9550-a61fc6c3af14" containerName="extract-content" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.764068 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e7ced8-bd66-4bf4-9550-a61fc6c3af14" containerName="extract-content" Dec 05 12:37:49 crc kubenswrapper[4763]: E1205 12:37:49.764078 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1b31587-9975-4a79-9b24-a2b2d90ab934" containerName="extract-content" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.764083 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1b31587-9975-4a79-9b24-a2b2d90ab934" containerName="extract-content" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.764273 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1b31587-9975-4a79-9b24-a2b2d90ab934" containerName="registry-server" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.764293 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5d27328-7e5a-4664-9c0a-ae5c063ec8b9" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.764306 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="85e7ced8-bd66-4bf4-9550-a61fc6c3af14" containerName="registry-server" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.764329 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="654b7fdf-0324-40f6-8681-3ba17e042d60" containerName="prometheus" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.764337 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="654b7fdf-0324-40f6-8681-3ba17e042d60" containerName="config-reloader" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.764344 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="654b7fdf-0324-40f6-8681-3ba17e042d60" containerName="thanos-sidecar" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.767316 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.771717 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.771842 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.774523 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.774586 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.776537 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-htrxx" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.782093 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.783308 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.802015 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="654b7fdf-0324-40f6-8681-3ba17e042d60" path="/var/lib/kubelet/pods/654b7fdf-0324-40f6-8681-3ba17e042d60/volumes" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.910952 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh7q7\" (UniqueName: \"kubernetes.io/projected/aac1b695-9685-4f3f-bc5d-d1262bb44992-kube-api-access-lh7q7\") pod \"prometheus-metric-storage-0\" (UID: \"aac1b695-9685-4f3f-bc5d-d1262bb44992\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.911049 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ff5f488f-885a-43ee-9a04-43ba44d78789\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ff5f488f-885a-43ee-9a04-43ba44d78789\") pod \"prometheus-metric-storage-0\" (UID: \"aac1b695-9685-4f3f-bc5d-d1262bb44992\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.911099 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/aac1b695-9685-4f3f-bc5d-d1262bb44992-config\") pod \"prometheus-metric-storage-0\" (UID: \"aac1b695-9685-4f3f-bc5d-d1262bb44992\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.911138 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aac1b695-9685-4f3f-bc5d-d1262bb44992-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"aac1b695-9685-4f3f-bc5d-d1262bb44992\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.911250 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/aac1b695-9685-4f3f-bc5d-d1262bb44992-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"aac1b695-9685-4f3f-bc5d-d1262bb44992\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.911287 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/aac1b695-9685-4f3f-bc5d-d1262bb44992-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"aac1b695-9685-4f3f-bc5d-d1262bb44992\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.911340 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/aac1b695-9685-4f3f-bc5d-d1262bb44992-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"aac1b695-9685-4f3f-bc5d-d1262bb44992\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.911371 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/aac1b695-9685-4f3f-bc5d-d1262bb44992-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"aac1b695-9685-4f3f-bc5d-d1262bb44992\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.911412 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/aac1b695-9685-4f3f-bc5d-d1262bb44992-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"aac1b695-9685-4f3f-bc5d-d1262bb44992\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.911435 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/aac1b695-9685-4f3f-bc5d-d1262bb44992-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"aac1b695-9685-4f3f-bc5d-d1262bb44992\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:37:49 crc kubenswrapper[4763]: I1205 12:37:49.911455 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/aac1b695-9685-4f3f-bc5d-d1262bb44992-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"aac1b695-9685-4f3f-bc5d-d1262bb44992\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:37:50 crc kubenswrapper[4763]: I1205 12:37:50.013514 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/aac1b695-9685-4f3f-bc5d-d1262bb44992-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"aac1b695-9685-4f3f-bc5d-d1262bb44992\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:37:50 crc kubenswrapper[4763]: I1205 12:37:50.013563 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/aac1b695-9685-4f3f-bc5d-d1262bb44992-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"aac1b695-9685-4f3f-bc5d-d1262bb44992\") " 
pod="openstack/prometheus-metric-storage-0" Dec 05 12:37:50 crc kubenswrapper[4763]: I1205 12:37:50.013583 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/aac1b695-9685-4f3f-bc5d-d1262bb44992-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"aac1b695-9685-4f3f-bc5d-d1262bb44992\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:37:50 crc kubenswrapper[4763]: I1205 12:37:50.013617 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh7q7\" (UniqueName: \"kubernetes.io/projected/aac1b695-9685-4f3f-bc5d-d1262bb44992-kube-api-access-lh7q7\") pod \"prometheus-metric-storage-0\" (UID: \"aac1b695-9685-4f3f-bc5d-d1262bb44992\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:37:50 crc kubenswrapper[4763]: I1205 12:37:50.013648 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ff5f488f-885a-43ee-9a04-43ba44d78789\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ff5f488f-885a-43ee-9a04-43ba44d78789\") pod \"prometheus-metric-storage-0\" (UID: \"aac1b695-9685-4f3f-bc5d-d1262bb44992\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:37:50 crc kubenswrapper[4763]: I1205 12:37:50.013690 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/aac1b695-9685-4f3f-bc5d-d1262bb44992-config\") pod \"prometheus-metric-storage-0\" (UID: \"aac1b695-9685-4f3f-bc5d-d1262bb44992\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:37:50 crc kubenswrapper[4763]: I1205 12:37:50.013709 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aac1b695-9685-4f3f-bc5d-d1262bb44992-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"aac1b695-9685-4f3f-bc5d-d1262bb44992\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:37:50 crc kubenswrapper[4763]: I1205 12:37:50.013794 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/aac1b695-9685-4f3f-bc5d-d1262bb44992-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"aac1b695-9685-4f3f-bc5d-d1262bb44992\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:37:50 crc kubenswrapper[4763]: I1205 12:37:50.013824 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/aac1b695-9685-4f3f-bc5d-d1262bb44992-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"aac1b695-9685-4f3f-bc5d-d1262bb44992\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:37:50 crc kubenswrapper[4763]: I1205 12:37:50.013862 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/aac1b695-9685-4f3f-bc5d-d1262bb44992-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"aac1b695-9685-4f3f-bc5d-d1262bb44992\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:37:50 crc kubenswrapper[4763]: I1205 12:37:50.013884 4763 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/aac1b695-9685-4f3f-bc5d-d1262bb44992-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"aac1b695-9685-4f3f-bc5d-d1262bb44992\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:37:50 crc kubenswrapper[4763]: I1205 12:37:50.015251 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/aac1b695-9685-4f3f-bc5d-d1262bb44992-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"aac1b695-9685-4f3f-bc5d-d1262bb44992\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:37:50 crc kubenswrapper[4763]: I1205 12:37:50.033411 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/aac1b695-9685-4f3f-bc5d-d1262bb44992-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"aac1b695-9685-4f3f-bc5d-d1262bb44992\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:37:50 crc kubenswrapper[4763]: I1205 12:37:50.033427 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/aac1b695-9685-4f3f-bc5d-d1262bb44992-config\") pod \"prometheus-metric-storage-0\" (UID: \"aac1b695-9685-4f3f-bc5d-d1262bb44992\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:37:50 crc kubenswrapper[4763]: I1205 12:37:50.033529 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/aac1b695-9685-4f3f-bc5d-d1262bb44992-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"aac1b695-9685-4f3f-bc5d-d1262bb44992\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:37:50 crc kubenswrapper[4763]: I1205 12:37:50.033557 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/aac1b695-9685-4f3f-bc5d-d1262bb44992-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"aac1b695-9685-4f3f-bc5d-d1262bb44992\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:37:50 crc kubenswrapper[4763]: I1205 12:37:50.033592 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/aac1b695-9685-4f3f-bc5d-d1262bb44992-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"aac1b695-9685-4f3f-bc5d-d1262bb44992\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:37:50 crc kubenswrapper[4763]: I1205 12:37:50.034222 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/aac1b695-9685-4f3f-bc5d-d1262bb44992-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"aac1b695-9685-4f3f-bc5d-d1262bb44992\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:37:50 crc kubenswrapper[4763]: I1205 12:37:50.034525 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/aac1b695-9685-4f3f-bc5d-d1262bb44992-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"aac1b695-9685-4f3f-bc5d-d1262bb44992\") " 
pod="openstack/prometheus-metric-storage-0" Dec 05 12:37:50 crc kubenswrapper[4763]: I1205 12:37:50.041692 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aac1b695-9685-4f3f-bc5d-d1262bb44992-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"aac1b695-9685-4f3f-bc5d-d1262bb44992\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:37:50 crc kubenswrapper[4763]: I1205 12:37:50.053362 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh7q7\" (UniqueName: \"kubernetes.io/projected/aac1b695-9685-4f3f-bc5d-d1262bb44992-kube-api-access-lh7q7\") pod \"prometheus-metric-storage-0\" (UID: \"aac1b695-9685-4f3f-bc5d-d1262bb44992\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:37:50 crc kubenswrapper[4763]: I1205 12:37:50.055710 4763 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 05 12:37:50 crc kubenswrapper[4763]: I1205 12:37:50.055754 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ff5f488f-885a-43ee-9a04-43ba44d78789\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ff5f488f-885a-43ee-9a04-43ba44d78789\") pod \"prometheus-metric-storage-0\" (UID: \"aac1b695-9685-4f3f-bc5d-d1262bb44992\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f78e37f2b579c8ad937827f0ee3e8c91bfbfeaf465b3046f4f7d0e3c34229d24/globalmount\"" pod="openstack/prometheus-metric-storage-0" Dec 05 12:37:50 crc kubenswrapper[4763]: I1205 12:37:50.333526 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ff5f488f-885a-43ee-9a04-43ba44d78789\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ff5f488f-885a-43ee-9a04-43ba44d78789\") pod \"prometheus-metric-storage-0\" (UID: \"aac1b695-9685-4f3f-bc5d-d1262bb44992\") " pod="openstack/prometheus-metric-storage-0" Dec 05 12:37:50 crc kubenswrapper[4763]: I1205 12:37:50.385729 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 05 12:37:50 crc kubenswrapper[4763]: I1205 12:37:50.876051 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 05 12:37:51 crc kubenswrapper[4763]: I1205 12:37:51.415446 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"aac1b695-9685-4f3f-bc5d-d1262bb44992","Type":"ContainerStarted","Data":"f07b4d406fa8f3d65c185a394a78ab9b9198baa3b70b2c3fb8f0e6db871e4b20"} Dec 05 12:37:55 crc kubenswrapper[4763]: I1205 12:37:55.457144 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"aac1b695-9685-4f3f-bc5d-d1262bb44992","Type":"ContainerStarted","Data":"208e63688e16ee4e817bb74b2fc7d006630a78659e90cfac21df6ccb9bc5b2b5"} Dec 05 12:38:03 crc kubenswrapper[4763]: I1205 12:38:03.533238 4763 generic.go:334] "Generic (PLEG): container finished" podID="aac1b695-9685-4f3f-bc5d-d1262bb44992" containerID="208e63688e16ee4e817bb74b2fc7d006630a78659e90cfac21df6ccb9bc5b2b5" exitCode=0 Dec 05 12:38:03 crc kubenswrapper[4763]: I1205 12:38:03.533355 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"aac1b695-9685-4f3f-bc5d-d1262bb44992","Type":"ContainerDied","Data":"208e63688e16ee4e817bb74b2fc7d006630a78659e90cfac21df6ccb9bc5b2b5"} Dec 05 12:38:04 crc kubenswrapper[4763]: I1205 12:38:04.544549 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"aac1b695-9685-4f3f-bc5d-d1262bb44992","Type":"ContainerStarted","Data":"aa7bf5a9870d7aecef1e3b504a1eb2fccc414c3cb81f4e3be1187505db495ccb"} Dec 05 12:38:07 crc kubenswrapper[4763]: I1205 12:38:07.543639 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 12:38:07 crc kubenswrapper[4763]: I1205 12:38:07.544434 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 12:38:07 crc kubenswrapper[4763]: I1205 12:38:07.582940 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"aac1b695-9685-4f3f-bc5d-d1262bb44992","Type":"ContainerStarted","Data":"0a4ea23ee6fbc7e5e2fe6eb4b7dc1b19e1021dd2b975b2b26d21d6928ba215d6"} Dec 05 12:38:07 crc kubenswrapper[4763]: I1205 12:38:07.583532 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"aac1b695-9685-4f3f-bc5d-d1262bb44992","Type":"ContainerStarted","Data":"3ebcb07719da9a2f9952777f6b552bb0b29b5378d3bd20ecc2118a1be95cb227"} Dec 05 12:38:10 crc kubenswrapper[4763]: I1205 12:38:10.386108 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 05 12:38:20 crc kubenswrapper[4763]: I1205 12:38:20.386402 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 05 12:38:20 crc kubenswrapper[4763]: I1205 12:38:20.394647 4763 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 05 12:38:20 crc kubenswrapper[4763]: I1205 12:38:20.429336 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=31.429304338 podStartE2EDuration="31.429304338s" podCreationTimestamp="2025-12-05 12:37:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:38:07.617066642 +0000 UTC m=+2972.109781385" watchObservedRunningTime="2025-12-05 12:38:20.429304338 +0000 UTC m=+2984.922019071" Dec 05 12:38:20 crc kubenswrapper[4763]: I1205 12:38:20.732414 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 05 12:38:37 crc kubenswrapper[4763]: I1205 12:38:37.544007 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 12:38:37 crc kubenswrapper[4763]: I1205 12:38:37.544531 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 12:38:37 crc kubenswrapper[4763]: I1205 12:38:37.544577 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" Dec 05 12:38:37 crc kubenswrapper[4763]: I1205 12:38:37.545354 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"076a320d83ff63935b6e470f29dca4488cd7427d5318bd6be5b9f80791b1e029"} pod="openshift-machine-config-operator/machine-config-daemon-xpgln" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 12:38:37 crc kubenswrapper[4763]: I1205 12:38:37.545451 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" containerID="cri-o://076a320d83ff63935b6e470f29dca4488cd7427d5318bd6be5b9f80791b1e029" gracePeriod=600 Dec 05 12:38:37 crc kubenswrapper[4763]: E1205 12:38:37.672619 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:38:37 crc kubenswrapper[4763]: I1205 12:38:37.917629 4763 generic.go:334] "Generic (PLEG): container finished" podID="96338136-6831-49d0-9eb9-77d1205c6afb" containerID="076a320d83ff63935b6e470f29dca4488cd7427d5318bd6be5b9f80791b1e029" exitCode=0 Dec 05 12:38:37 crc kubenswrapper[4763]: I1205 12:38:37.917691 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerDied","Data":"076a320d83ff63935b6e470f29dca4488cd7427d5318bd6be5b9f80791b1e029"} Dec 05 12:38:37 crc kubenswrapper[4763]: I1205 12:38:37.917751 4763 scope.go:117] "RemoveContainer" containerID="9235b49df516b88937632a754bf0fe47bb84aad68ded096bb5e12cefccd93a73" Dec 05 12:38:37 crc kubenswrapper[4763]: I1205 12:38:37.918495 4763 scope.go:117] "RemoveContainer" containerID="076a320d83ff63935b6e470f29dca4488cd7427d5318bd6be5b9f80791b1e029" Dec 05 12:38:37 crc kubenswrapper[4763]: E1205 12:38:37.918921 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:38:39 crc kubenswrapper[4763]: I1205 12:38:39.230158 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 05 12:38:39 crc kubenswrapper[4763]: I1205 12:38:39.231633 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 05 12:38:39 crc kubenswrapper[4763]: I1205 12:38:39.242718 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 05 12:38:39 crc kubenswrapper[4763]: I1205 12:38:39.243199 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 05 12:38:39 crc kubenswrapper[4763]: I1205 12:38:39.250898 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 05 12:38:39 crc kubenswrapper[4763]: I1205 12:38:39.254417 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-m2cn4" Dec 05 12:38:39 crc kubenswrapper[4763]: I1205 12:38:39.254543 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 05 12:38:39 crc kubenswrapper[4763]: I1205 12:38:39.259882 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/295e994b-9be5-4486-beb7-6be00576c5c3-config-data\") pod \"tempest-tests-tempest\" (UID: \"295e994b-9be5-4486-beb7-6be00576c5c3\") " pod="openstack/tempest-tests-tempest" Dec 05 12:38:39 crc kubenswrapper[4763]: I1205 12:38:39.259925 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/295e994b-9be5-4486-beb7-6be00576c5c3-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"295e994b-9be5-4486-beb7-6be00576c5c3\") " pod="openstack/tempest-tests-tempest" Dec 05 12:38:39 crc kubenswrapper[4763]: I1205 12:38:39.259964 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/295e994b-9be5-4486-beb7-6be00576c5c3-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"295e994b-9be5-4486-beb7-6be00576c5c3\") " pod="openstack/tempest-tests-tempest" Dec 05 12:38:39 crc kubenswrapper[4763]: I1205 12:38:39.362325 4763 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/295e994b-9be5-4486-beb7-6be00576c5c3-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"295e994b-9be5-4486-beb7-6be00576c5c3\") " pod="openstack/tempest-tests-tempest" Dec 05 12:38:39 crc kubenswrapper[4763]: I1205 12:38:39.362445 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/295e994b-9be5-4486-beb7-6be00576c5c3-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"295e994b-9be5-4486-beb7-6be00576c5c3\") " pod="openstack/tempest-tests-tempest" Dec 05 12:38:39 crc kubenswrapper[4763]: I1205 12:38:39.362481 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/295e994b-9be5-4486-beb7-6be00576c5c3-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"295e994b-9be5-4486-beb7-6be00576c5c3\") " pod="openstack/tempest-tests-tempest" Dec 05 12:38:39 crc kubenswrapper[4763]: I1205 12:38:39.362545 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jx49\" (UniqueName: \"kubernetes.io/projected/295e994b-9be5-4486-beb7-6be00576c5c3-kube-api-access-8jx49\") pod \"tempest-tests-tempest\" (UID: \"295e994b-9be5-4486-beb7-6be00576c5c3\") " pod="openstack/tempest-tests-tempest" Dec 05 12:38:39 crc kubenswrapper[4763]: I1205 12:38:39.362603 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/295e994b-9be5-4486-beb7-6be00576c5c3-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"295e994b-9be5-4486-beb7-6be00576c5c3\") " pod="openstack/tempest-tests-tempest" Dec 05 12:38:39 crc kubenswrapper[4763]: I1205 12:38:39.362654 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/295e994b-9be5-4486-beb7-6be00576c5c3-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"295e994b-9be5-4486-beb7-6be00576c5c3\") " pod="openstack/tempest-tests-tempest" Dec 05 12:38:39 crc kubenswrapper[4763]: I1205 12:38:39.362691 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/295e994b-9be5-4486-beb7-6be00576c5c3-config-data\") pod \"tempest-tests-tempest\" (UID: \"295e994b-9be5-4486-beb7-6be00576c5c3\") " pod="openstack/tempest-tests-tempest" Dec 05 12:38:39 crc kubenswrapper[4763]: I1205 12:38:39.362712 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/295e994b-9be5-4486-beb7-6be00576c5c3-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"295e994b-9be5-4486-beb7-6be00576c5c3\") " pod="openstack/tempest-tests-tempest" Dec 05 12:38:39 crc kubenswrapper[4763]: I1205 12:38:39.362747 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"295e994b-9be5-4486-beb7-6be00576c5c3\") " pod="openstack/tempest-tests-tempest" Dec 05 12:38:39 crc kubenswrapper[4763]: I1205 12:38:39.364451 4763 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/295e994b-9be5-4486-beb7-6be00576c5c3-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"295e994b-9be5-4486-beb7-6be00576c5c3\") " pod="openstack/tempest-tests-tempest" Dec 05 12:38:39 crc kubenswrapper[4763]: I1205 12:38:39.364529 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/295e994b-9be5-4486-beb7-6be00576c5c3-config-data\") pod \"tempest-tests-tempest\" (UID: \"295e994b-9be5-4486-beb7-6be00576c5c3\") " pod="openstack/tempest-tests-tempest" Dec 05 12:38:39 crc kubenswrapper[4763]: I1205 12:38:39.370108 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/295e994b-9be5-4486-beb7-6be00576c5c3-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"295e994b-9be5-4486-beb7-6be00576c5c3\") " pod="openstack/tempest-tests-tempest" Dec 05 12:38:39 crc kubenswrapper[4763]: I1205 12:38:39.464412 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jx49\" (UniqueName: \"kubernetes.io/projected/295e994b-9be5-4486-beb7-6be00576c5c3-kube-api-access-8jx49\") pod \"tempest-tests-tempest\" (UID: \"295e994b-9be5-4486-beb7-6be00576c5c3\") " pod="openstack/tempest-tests-tempest" Dec 05 12:38:39 crc kubenswrapper[4763]: I1205 12:38:39.464847 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/295e994b-9be5-4486-beb7-6be00576c5c3-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"295e994b-9be5-4486-beb7-6be00576c5c3\") " pod="openstack/tempest-tests-tempest" Dec 05 12:38:39 crc kubenswrapper[4763]: I1205 12:38:39.464902 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/295e994b-9be5-4486-beb7-6be00576c5c3-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"295e994b-9be5-4486-beb7-6be00576c5c3\") " pod="openstack/tempest-tests-tempest" Dec 05 12:38:39 crc kubenswrapper[4763]: I1205 12:38:39.464957 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"295e994b-9be5-4486-beb7-6be00576c5c3\") " pod="openstack/tempest-tests-tempest" Dec 05 12:38:39 crc kubenswrapper[4763]: I1205 12:38:39.465012 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/295e994b-9be5-4486-beb7-6be00576c5c3-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"295e994b-9be5-4486-beb7-6be00576c5c3\") " pod="openstack/tempest-tests-tempest" Dec 05 12:38:39 crc kubenswrapper[4763]: I1205 12:38:39.465041 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/295e994b-9be5-4486-beb7-6be00576c5c3-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"295e994b-9be5-4486-beb7-6be00576c5c3\") " pod="openstack/tempest-tests-tempest" Dec 05 12:38:39 crc kubenswrapper[4763]: I1205 12:38:39.465385 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/295e994b-9be5-4486-beb7-6be00576c5c3-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"295e994b-9be5-4486-beb7-6be00576c5c3\") " pod="openstack/tempest-tests-tempest" Dec 05 12:38:39 crc kubenswrapper[4763]: I1205 12:38:39.465550 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/295e994b-9be5-4486-beb7-6be00576c5c3-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"295e994b-9be5-4486-beb7-6be00576c5c3\") " pod="openstack/tempest-tests-tempest" Dec 05 12:38:39 crc kubenswrapper[4763]: I1205 12:38:39.465858 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"295e994b-9be5-4486-beb7-6be00576c5c3\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/tempest-tests-tempest" Dec 05 12:38:39 crc kubenswrapper[4763]: I1205 12:38:39.470586 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/295e994b-9be5-4486-beb7-6be00576c5c3-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"295e994b-9be5-4486-beb7-6be00576c5c3\") " pod="openstack/tempest-tests-tempest" Dec 05 12:38:39 crc kubenswrapper[4763]: I1205 12:38:39.471617 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/295e994b-9be5-4486-beb7-6be00576c5c3-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"295e994b-9be5-4486-beb7-6be00576c5c3\") " pod="openstack/tempest-tests-tempest" Dec 05 12:38:39 crc kubenswrapper[4763]: I1205 12:38:39.482752 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jx49\" (UniqueName: \"kubernetes.io/projected/295e994b-9be5-4486-beb7-6be00576c5c3-kube-api-access-8jx49\") pod \"tempest-tests-tempest\" (UID: \"295e994b-9be5-4486-beb7-6be00576c5c3\") " pod="openstack/tempest-tests-tempest" Dec 05 12:38:39 crc kubenswrapper[4763]: I1205 12:38:39.495055 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"295e994b-9be5-4486-beb7-6be00576c5c3\") " pod="openstack/tempest-tests-tempest" Dec 05 12:38:39 crc kubenswrapper[4763]: I1205 12:38:39.597030 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 05 12:38:40 crc kubenswrapper[4763]: I1205 12:38:40.027695 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 05 12:38:40 crc kubenswrapper[4763]: I1205 12:38:40.033863 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 12:38:40 crc kubenswrapper[4763]: I1205 12:38:40.951743 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"295e994b-9be5-4486-beb7-6be00576c5c3","Type":"ContainerStarted","Data":"44c4d50882b8243ab7ab046f510d3f526012c7d23e65a4410d8f0c523d8c7a0b"} Dec 05 12:38:51 crc kubenswrapper[4763]: I1205 12:38:51.784612 4763 scope.go:117] "RemoveContainer" containerID="076a320d83ff63935b6e470f29dca4488cd7427d5318bd6be5b9f80791b1e029" Dec 05 12:38:51 crc kubenswrapper[4763]: E1205 12:38:51.785425 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:38:54 crc kubenswrapper[4763]: I1205 12:38:54.152728 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"295e994b-9be5-4486-beb7-6be00576c5c3","Type":"ContainerStarted","Data":"800f0f10e1abe21dc6010877c8872277d05b9280c81ed98f5009c55c4f922ab7"} Dec 05 12:38:54 crc kubenswrapper[4763]: I1205 12:38:54.179058 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.599534163 podStartE2EDuration="16.179032542s" podCreationTimestamp="2025-12-05 12:38:38 +0000 UTC" firstStartedPulling="2025-12-05 12:38:40.033566307 +0000 UTC m=+3004.526281030" lastFinishedPulling="2025-12-05 12:38:52.613064686 +0000 UTC m=+3017.105779409" observedRunningTime="2025-12-05 12:38:54.170973905 +0000 UTC m=+3018.663688648" watchObservedRunningTime="2025-12-05 12:38:54.179032542 +0000 UTC m=+3018.671747275" Dec 05 12:39:02 crc kubenswrapper[4763]: I1205 12:39:02.784105 4763 scope.go:117] "RemoveContainer" containerID="076a320d83ff63935b6e470f29dca4488cd7427d5318bd6be5b9f80791b1e029" Dec 05 12:39:02 crc kubenswrapper[4763]: E1205 12:39:02.786515 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:39:13 crc kubenswrapper[4763]: I1205 12:39:13.784633 4763 scope.go:117] "RemoveContainer" containerID="076a320d83ff63935b6e470f29dca4488cd7427d5318bd6be5b9f80791b1e029" Dec 05 12:39:13 crc kubenswrapper[4763]: E1205 12:39:13.786219 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:39:28 crc kubenswrapper[4763]: I1205 12:39:28.784350 4763 scope.go:117] "RemoveContainer" containerID="076a320d83ff63935b6e470f29dca4488cd7427d5318bd6be5b9f80791b1e029" Dec 05 12:39:28 crc kubenswrapper[4763]: E1205 12:39:28.786241 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:39:41 crc kubenswrapper[4763]: I1205 12:39:41.784173 4763 scope.go:117] "RemoveContainer" containerID="076a320d83ff63935b6e470f29dca4488cd7427d5318bd6be5b9f80791b1e029" Dec 05 12:39:41 crc kubenswrapper[4763]: E1205 12:39:41.785345 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:39:55 crc kubenswrapper[4763]: I1205 12:39:55.793426 4763 scope.go:117] "RemoveContainer" containerID="076a320d83ff63935b6e470f29dca4488cd7427d5318bd6be5b9f80791b1e029" Dec 05 12:39:55 crc kubenswrapper[4763]: E1205 12:39:55.794491 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:40:07 crc kubenswrapper[4763]: I1205 12:40:07.784709 4763 scope.go:117] "RemoveContainer" containerID="076a320d83ff63935b6e470f29dca4488cd7427d5318bd6be5b9f80791b1e029" Dec 05 12:40:07 crc kubenswrapper[4763]: E1205 12:40:07.785534 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:40:22 crc kubenswrapper[4763]: I1205 12:40:22.784547 4763 scope.go:117] "RemoveContainer" containerID="076a320d83ff63935b6e470f29dca4488cd7427d5318bd6be5b9f80791b1e029" Dec 05 12:40:22 crc kubenswrapper[4763]: E1205 12:40:22.785540 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:40:36 crc kubenswrapper[4763]: I1205 12:40:36.783849 4763 
scope.go:117] "RemoveContainer" containerID="076a320d83ff63935b6e470f29dca4488cd7427d5318bd6be5b9f80791b1e029" Dec 05 12:40:36 crc kubenswrapper[4763]: E1205 12:40:36.784872 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:40:51 crc kubenswrapper[4763]: I1205 12:40:51.784940 4763 scope.go:117] "RemoveContainer" containerID="076a320d83ff63935b6e470f29dca4488cd7427d5318bd6be5b9f80791b1e029" Dec 05 12:40:51 crc kubenswrapper[4763]: E1205 12:40:51.788706 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:41:05 crc kubenswrapper[4763]: I1205 12:41:05.806589 4763 scope.go:117] "RemoveContainer" containerID="076a320d83ff63935b6e470f29dca4488cd7427d5318bd6be5b9f80791b1e029" Dec 05 12:41:05 crc kubenswrapper[4763]: E1205 12:41:05.807476 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:41:16 crc kubenswrapper[4763]: I1205 12:41:16.785129 4763 scope.go:117] "RemoveContainer" containerID="076a320d83ff63935b6e470f29dca4488cd7427d5318bd6be5b9f80791b1e029" Dec 05 12:41:16 crc kubenswrapper[4763]: E1205 12:41:16.786234 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:41:30 crc kubenswrapper[4763]: I1205 12:41:30.784135 4763 scope.go:117] "RemoveContainer" containerID="076a320d83ff63935b6e470f29dca4488cd7427d5318bd6be5b9f80791b1e029" Dec 05 12:41:30 crc kubenswrapper[4763]: E1205 12:41:30.784982 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:41:44 crc kubenswrapper[4763]: I1205 12:41:44.784200 4763 scope.go:117] "RemoveContainer" containerID="076a320d83ff63935b6e470f29dca4488cd7427d5318bd6be5b9f80791b1e029" Dec 05 12:41:44 crc kubenswrapper[4763]: E1205 12:41:44.785364 4763 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:41:56 crc kubenswrapper[4763]: I1205 12:41:56.784514 4763 scope.go:117] "RemoveContainer" containerID="076a320d83ff63935b6e470f29dca4488cd7427d5318bd6be5b9f80791b1e029" Dec 05 12:41:56 crc kubenswrapper[4763]: E1205 12:41:56.785457 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:42:11 crc kubenswrapper[4763]: I1205 12:42:11.784647 4763 scope.go:117] "RemoveContainer" containerID="076a320d83ff63935b6e470f29dca4488cd7427d5318bd6be5b9f80791b1e029" Dec 05 12:42:11 crc kubenswrapper[4763]: E1205 12:42:11.786174 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:42:23 crc kubenswrapper[4763]: I1205 12:42:23.784479 4763 scope.go:117] "RemoveContainer" containerID="076a320d83ff63935b6e470f29dca4488cd7427d5318bd6be5b9f80791b1e029" Dec 05 12:42:23 crc kubenswrapper[4763]: E1205 12:42:23.785230 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:42:34 crc kubenswrapper[4763]: I1205 12:42:34.540390 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nvkpd"] Dec 05 12:42:34 crc kubenswrapper[4763]: I1205 12:42:34.543915 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nvkpd" Dec 05 12:42:34 crc kubenswrapper[4763]: I1205 12:42:34.557089 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nvkpd"] Dec 05 12:42:34 crc kubenswrapper[4763]: I1205 12:42:34.677714 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04a602ec-84ad-4841-ae7b-342f1566e616-utilities\") pod \"redhat-operators-nvkpd\" (UID: \"04a602ec-84ad-4841-ae7b-342f1566e616\") " pod="openshift-marketplace/redhat-operators-nvkpd" Dec 05 12:42:34 crc kubenswrapper[4763]: I1205 12:42:34.677790 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckh7p\" (UniqueName: \"kubernetes.io/projected/04a602ec-84ad-4841-ae7b-342f1566e616-kube-api-access-ckh7p\") pod \"redhat-operators-nvkpd\" (UID: \"04a602ec-84ad-4841-ae7b-342f1566e616\") " pod="openshift-marketplace/redhat-operators-nvkpd" Dec 05 12:42:34 crc kubenswrapper[4763]: I1205 12:42:34.678005 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04a602ec-84ad-4841-ae7b-342f1566e616-catalog-content\") pod \"redhat-operators-nvkpd\" (UID: \"04a602ec-84ad-4841-ae7b-342f1566e616\") " pod="openshift-marketplace/redhat-operators-nvkpd" Dec 05 12:42:34 crc kubenswrapper[4763]: I1205 12:42:34.779956 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckh7p\" (UniqueName: \"kubernetes.io/projected/04a602ec-84ad-4841-ae7b-342f1566e616-kube-api-access-ckh7p\") pod \"redhat-operators-nvkpd\" (UID: \"04a602ec-84ad-4841-ae7b-342f1566e616\") " pod="openshift-marketplace/redhat-operators-nvkpd" Dec 05 12:42:34 crc kubenswrapper[4763]: I1205 12:42:34.780037 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04a602ec-84ad-4841-ae7b-342f1566e616-catalog-content\") pod \"redhat-operators-nvkpd\" (UID: \"04a602ec-84ad-4841-ae7b-342f1566e616\") " pod="openshift-marketplace/redhat-operators-nvkpd" Dec 05 12:42:34 crc kubenswrapper[4763]: I1205 12:42:34.780185 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04a602ec-84ad-4841-ae7b-342f1566e616-utilities\") pod \"redhat-operators-nvkpd\" (UID: \"04a602ec-84ad-4841-ae7b-342f1566e616\") " pod="openshift-marketplace/redhat-operators-nvkpd" Dec 05 12:42:34 crc kubenswrapper[4763]: I1205 12:42:34.780690 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04a602ec-84ad-4841-ae7b-342f1566e616-utilities\") pod \"redhat-operators-nvkpd\" (UID: \"04a602ec-84ad-4841-ae7b-342f1566e616\") " pod="openshift-marketplace/redhat-operators-nvkpd" Dec 05 12:42:34 crc kubenswrapper[4763]: I1205 12:42:34.780688 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04a602ec-84ad-4841-ae7b-342f1566e616-catalog-content\") pod \"redhat-operators-nvkpd\" (UID: \"04a602ec-84ad-4841-ae7b-342f1566e616\") " pod="openshift-marketplace/redhat-operators-nvkpd" Dec 05 12:42:34 crc kubenswrapper[4763]: I1205 12:42:34.799010 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ckh7p\" (UniqueName: \"kubernetes.io/projected/04a602ec-84ad-4841-ae7b-342f1566e616-kube-api-access-ckh7p\") pod \"redhat-operators-nvkpd\" (UID: \"04a602ec-84ad-4841-ae7b-342f1566e616\") " pod="openshift-marketplace/redhat-operators-nvkpd" Dec 05 12:42:34 crc kubenswrapper[4763]: I1205 12:42:34.879664 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nvkpd" Dec 05 12:42:35 crc kubenswrapper[4763]: I1205 12:42:35.406186 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nvkpd"] Dec 05 12:42:35 crc kubenswrapper[4763]: W1205 12:42:35.413028 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04a602ec_84ad_4841_ae7b_342f1566e616.slice/crio-cb71207e97ed9c4d7c513729a040f57e2d7a1c5fd2038f6e5faf820094808a58 WatchSource:0}: Error finding container cb71207e97ed9c4d7c513729a040f57e2d7a1c5fd2038f6e5faf820094808a58: Status 404 returned error can't find the container with id cb71207e97ed9c4d7c513729a040f57e2d7a1c5fd2038f6e5faf820094808a58 Dec 05 12:42:35 crc kubenswrapper[4763]: I1205 12:42:35.817378 4763 generic.go:334] "Generic (PLEG): container finished" podID="04a602ec-84ad-4841-ae7b-342f1566e616" containerID="1c31e56e4bffa1450b604b0f3189916efc074604217a988f79438cc04b44c131" exitCode=0 Dec 05 12:42:35 crc kubenswrapper[4763]: I1205 12:42:35.817465 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nvkpd" event={"ID":"04a602ec-84ad-4841-ae7b-342f1566e616","Type":"ContainerDied","Data":"1c31e56e4bffa1450b604b0f3189916efc074604217a988f79438cc04b44c131"} Dec 05 12:42:35 crc kubenswrapper[4763]: I1205 12:42:35.817705 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nvkpd" event={"ID":"04a602ec-84ad-4841-ae7b-342f1566e616","Type":"ContainerStarted","Data":"cb71207e97ed9c4d7c513729a040f57e2d7a1c5fd2038f6e5faf820094808a58"} Dec 05 12:42:36 crc kubenswrapper[4763]: I1205 12:42:36.827713 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nvkpd" event={"ID":"04a602ec-84ad-4841-ae7b-342f1566e616","Type":"ContainerStarted","Data":"a4f8e01e3430954e950429caa50e2fe735b585f6e89bbd628638c6f1db3a89f6"} Dec 05 12:42:37 crc kubenswrapper[4763]: I1205 12:42:37.791106 4763 scope.go:117] "RemoveContainer" containerID="076a320d83ff63935b6e470f29dca4488cd7427d5318bd6be5b9f80791b1e029" Dec 05 12:42:37 crc kubenswrapper[4763]: E1205 12:42:37.791689 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:42:38 crc kubenswrapper[4763]: I1205 12:42:38.856056 4763 generic.go:334] "Generic (PLEG): container finished" podID="04a602ec-84ad-4841-ae7b-342f1566e616" containerID="a4f8e01e3430954e950429caa50e2fe735b585f6e89bbd628638c6f1db3a89f6" exitCode=0 Dec 05 12:42:38 crc kubenswrapper[4763]: I1205 12:42:38.856456 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nvkpd" 
event={"ID":"04a602ec-84ad-4841-ae7b-342f1566e616","Type":"ContainerDied","Data":"a4f8e01e3430954e950429caa50e2fe735b585f6e89bbd628638c6f1db3a89f6"} Dec 05 12:42:39 crc kubenswrapper[4763]: I1205 12:42:39.870220 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nvkpd" event={"ID":"04a602ec-84ad-4841-ae7b-342f1566e616","Type":"ContainerStarted","Data":"0b92b39d48228b8fb35eb54408ab084ff90768c94cf71e43610f56fef47bc05b"} Dec 05 12:42:39 crc kubenswrapper[4763]: I1205 12:42:39.902536 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nvkpd" podStartSLOduration=2.449739319 podStartE2EDuration="5.90251063s" podCreationTimestamp="2025-12-05 12:42:34 +0000 UTC" firstStartedPulling="2025-12-05 12:42:35.819499739 +0000 UTC m=+3240.312214462" lastFinishedPulling="2025-12-05 12:42:39.27227105 +0000 UTC m=+3243.764985773" observedRunningTime="2025-12-05 12:42:39.895012548 +0000 UTC m=+3244.387727271" watchObservedRunningTime="2025-12-05 12:42:39.90251063 +0000 UTC m=+3244.395225353" Dec 05 12:42:44 crc kubenswrapper[4763]: I1205 12:42:44.879832 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nvkpd" Dec 05 12:42:44 crc kubenswrapper[4763]: I1205 12:42:44.880479 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nvkpd" Dec 05 12:42:44 crc kubenswrapper[4763]: I1205 12:42:44.936237 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nvkpd" Dec 05 12:42:44 crc kubenswrapper[4763]: I1205 12:42:44.995530 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nvkpd" Dec 05 12:42:45 crc kubenswrapper[4763]: I1205 12:42:45.178837 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nvkpd"] Dec 05 12:42:46 crc kubenswrapper[4763]: I1205 12:42:46.948289 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nvkpd" podUID="04a602ec-84ad-4841-ae7b-342f1566e616" containerName="registry-server" containerID="cri-o://0b92b39d48228b8fb35eb54408ab084ff90768c94cf71e43610f56fef47bc05b" gracePeriod=2 Dec 05 12:42:47 crc kubenswrapper[4763]: I1205 12:42:47.450446 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nvkpd" Dec 05 12:42:47 crc kubenswrapper[4763]: I1205 12:42:47.569359 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04a602ec-84ad-4841-ae7b-342f1566e616-catalog-content\") pod \"04a602ec-84ad-4841-ae7b-342f1566e616\" (UID: \"04a602ec-84ad-4841-ae7b-342f1566e616\") " Dec 05 12:42:47 crc kubenswrapper[4763]: I1205 12:42:47.570090 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckh7p\" (UniqueName: \"kubernetes.io/projected/04a602ec-84ad-4841-ae7b-342f1566e616-kube-api-access-ckh7p\") pod \"04a602ec-84ad-4841-ae7b-342f1566e616\" (UID: \"04a602ec-84ad-4841-ae7b-342f1566e616\") " Dec 05 12:42:47 crc kubenswrapper[4763]: I1205 12:42:47.570189 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04a602ec-84ad-4841-ae7b-342f1566e616-utilities\") pod \"04a602ec-84ad-4841-ae7b-342f1566e616\" (UID: \"04a602ec-84ad-4841-ae7b-342f1566e616\") " Dec 05 12:42:47 crc kubenswrapper[4763]: I1205 12:42:47.571451 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04a602ec-84ad-4841-ae7b-342f1566e616-utilities" (OuterVolumeSpecName: "utilities") pod "04a602ec-84ad-4841-ae7b-342f1566e616" (UID: "04a602ec-84ad-4841-ae7b-342f1566e616"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:42:47 crc kubenswrapper[4763]: I1205 12:42:47.579872 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04a602ec-84ad-4841-ae7b-342f1566e616-kube-api-access-ckh7p" (OuterVolumeSpecName: "kube-api-access-ckh7p") pod "04a602ec-84ad-4841-ae7b-342f1566e616" (UID: "04a602ec-84ad-4841-ae7b-342f1566e616"). InnerVolumeSpecName "kube-api-access-ckh7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:42:47 crc kubenswrapper[4763]: I1205 12:42:47.672984 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckh7p\" (UniqueName: \"kubernetes.io/projected/04a602ec-84ad-4841-ae7b-342f1566e616-kube-api-access-ckh7p\") on node \"crc\" DevicePath \"\"" Dec 05 12:42:47 crc kubenswrapper[4763]: I1205 12:42:47.673021 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04a602ec-84ad-4841-ae7b-342f1566e616-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 12:42:47 crc kubenswrapper[4763]: I1205 12:42:47.710154 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04a602ec-84ad-4841-ae7b-342f1566e616-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "04a602ec-84ad-4841-ae7b-342f1566e616" (UID: "04a602ec-84ad-4841-ae7b-342f1566e616"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:42:47 crc kubenswrapper[4763]: I1205 12:42:47.780081 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04a602ec-84ad-4841-ae7b-342f1566e616-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 12:42:47 crc kubenswrapper[4763]: I1205 12:42:47.961608 4763 generic.go:334] "Generic (PLEG): container finished" podID="04a602ec-84ad-4841-ae7b-342f1566e616" containerID="0b92b39d48228b8fb35eb54408ab084ff90768c94cf71e43610f56fef47bc05b" exitCode=0 Dec 05 12:42:47 crc kubenswrapper[4763]: I1205 12:42:47.961654 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nvkpd" event={"ID":"04a602ec-84ad-4841-ae7b-342f1566e616","Type":"ContainerDied","Data":"0b92b39d48228b8fb35eb54408ab084ff90768c94cf71e43610f56fef47bc05b"} Dec 05 12:42:47 crc kubenswrapper[4763]: I1205 12:42:47.961680 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nvkpd" event={"ID":"04a602ec-84ad-4841-ae7b-342f1566e616","Type":"ContainerDied","Data":"cb71207e97ed9c4d7c513729a040f57e2d7a1c5fd2038f6e5faf820094808a58"} Dec 05 12:42:47 crc kubenswrapper[4763]: I1205 12:42:47.961697 4763 scope.go:117] "RemoveContainer" containerID="0b92b39d48228b8fb35eb54408ab084ff90768c94cf71e43610f56fef47bc05b" Dec 05 12:42:47 crc kubenswrapper[4763]: I1205 12:42:47.962343 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nvkpd" Dec 05 12:42:47 crc kubenswrapper[4763]: I1205 12:42:47.985819 4763 scope.go:117] "RemoveContainer" containerID="a4f8e01e3430954e950429caa50e2fe735b585f6e89bbd628638c6f1db3a89f6" Dec 05 12:42:47 crc kubenswrapper[4763]: I1205 12:42:47.994482 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nvkpd"] Dec 05 12:42:48 crc kubenswrapper[4763]: I1205 12:42:48.009384 4763 scope.go:117] "RemoveContainer" containerID="1c31e56e4bffa1450b604b0f3189916efc074604217a988f79438cc04b44c131" Dec 05 12:42:48 crc kubenswrapper[4763]: I1205 12:42:48.010281 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nvkpd"] Dec 05 12:42:48 crc kubenswrapper[4763]: I1205 12:42:48.068509 4763 scope.go:117] "RemoveContainer" containerID="0b92b39d48228b8fb35eb54408ab084ff90768c94cf71e43610f56fef47bc05b" Dec 05 12:42:48 crc kubenswrapper[4763]: E1205 12:42:48.069056 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b92b39d48228b8fb35eb54408ab084ff90768c94cf71e43610f56fef47bc05b\": container with ID starting with 0b92b39d48228b8fb35eb54408ab084ff90768c94cf71e43610f56fef47bc05b not found: ID does not exist" containerID="0b92b39d48228b8fb35eb54408ab084ff90768c94cf71e43610f56fef47bc05b" Dec 05 12:42:48 crc kubenswrapper[4763]: I1205 12:42:48.069102 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b92b39d48228b8fb35eb54408ab084ff90768c94cf71e43610f56fef47bc05b"} err="failed to get container status \"0b92b39d48228b8fb35eb54408ab084ff90768c94cf71e43610f56fef47bc05b\": rpc error: code = NotFound desc = could not find container \"0b92b39d48228b8fb35eb54408ab084ff90768c94cf71e43610f56fef47bc05b\": container with ID starting with 0b92b39d48228b8fb35eb54408ab084ff90768c94cf71e43610f56fef47bc05b not found: ID does not exist" Dec 05 12:42:48 crc 
kubenswrapper[4763]: I1205 12:42:48.069131 4763 scope.go:117] "RemoveContainer" containerID="a4f8e01e3430954e950429caa50e2fe735b585f6e89bbd628638c6f1db3a89f6" Dec 05 12:42:48 crc kubenswrapper[4763]: E1205 12:42:48.069510 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4f8e01e3430954e950429caa50e2fe735b585f6e89bbd628638c6f1db3a89f6\": container with ID starting with a4f8e01e3430954e950429caa50e2fe735b585f6e89bbd628638c6f1db3a89f6 not found: ID does not exist" containerID="a4f8e01e3430954e950429caa50e2fe735b585f6e89bbd628638c6f1db3a89f6" Dec 05 12:42:48 crc kubenswrapper[4763]: I1205 12:42:48.069546 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4f8e01e3430954e950429caa50e2fe735b585f6e89bbd628638c6f1db3a89f6"} err="failed to get container status \"a4f8e01e3430954e950429caa50e2fe735b585f6e89bbd628638c6f1db3a89f6\": rpc error: code = NotFound desc = could not find container \"a4f8e01e3430954e950429caa50e2fe735b585f6e89bbd628638c6f1db3a89f6\": container with ID starting with a4f8e01e3430954e950429caa50e2fe735b585f6e89bbd628638c6f1db3a89f6 not found: ID does not exist" Dec 05 12:42:48 crc kubenswrapper[4763]: I1205 12:42:48.069565 4763 scope.go:117] "RemoveContainer" containerID="1c31e56e4bffa1450b604b0f3189916efc074604217a988f79438cc04b44c131" Dec 05 12:42:48 crc kubenswrapper[4763]: E1205 12:42:48.069836 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c31e56e4bffa1450b604b0f3189916efc074604217a988f79438cc04b44c131\": container with ID starting with 1c31e56e4bffa1450b604b0f3189916efc074604217a988f79438cc04b44c131 not found: ID does not exist" containerID="1c31e56e4bffa1450b604b0f3189916efc074604217a988f79438cc04b44c131" Dec 05 12:42:48 crc kubenswrapper[4763]: I1205 12:42:48.069857 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c31e56e4bffa1450b604b0f3189916efc074604217a988f79438cc04b44c131"} err="failed to get container status \"1c31e56e4bffa1450b604b0f3189916efc074604217a988f79438cc04b44c131\": rpc error: code = NotFound desc = could not find container \"1c31e56e4bffa1450b604b0f3189916efc074604217a988f79438cc04b44c131\": container with ID starting with 1c31e56e4bffa1450b604b0f3189916efc074604217a988f79438cc04b44c131 not found: ID does not exist" Dec 05 12:42:49 crc kubenswrapper[4763]: I1205 12:42:49.795073 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04a602ec-84ad-4841-ae7b-342f1566e616" path="/var/lib/kubelet/pods/04a602ec-84ad-4841-ae7b-342f1566e616/volumes" Dec 05 12:42:52 crc kubenswrapper[4763]: I1205 12:42:52.784247 4763 scope.go:117] "RemoveContainer" containerID="076a320d83ff63935b6e470f29dca4488cd7427d5318bd6be5b9f80791b1e029" Dec 05 12:42:52 crc kubenswrapper[4763]: E1205 12:42:52.785263 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:43:05 crc kubenswrapper[4763]: I1205 12:43:05.794281 4763 scope.go:117] "RemoveContainer" containerID="076a320d83ff63935b6e470f29dca4488cd7427d5318bd6be5b9f80791b1e029" 
Dec 05 12:43:05 crc kubenswrapper[4763]: E1205 12:43:05.794977 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:43:20 crc kubenswrapper[4763]: I1205 12:43:20.784575 4763 scope.go:117] "RemoveContainer" containerID="076a320d83ff63935b6e470f29dca4488cd7427d5318bd6be5b9f80791b1e029" Dec 05 12:43:20 crc kubenswrapper[4763]: E1205 12:43:20.785446 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:43:35 crc kubenswrapper[4763]: I1205 12:43:35.793922 4763 scope.go:117] "RemoveContainer" containerID="076a320d83ff63935b6e470f29dca4488cd7427d5318bd6be5b9f80791b1e029" Dec 05 12:43:35 crc kubenswrapper[4763]: E1205 12:43:35.794851 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:43:48 crc kubenswrapper[4763]: I1205 12:43:48.784753 4763 scope.go:117] "RemoveContainer" containerID="076a320d83ff63935b6e470f29dca4488cd7427d5318bd6be5b9f80791b1e029" Dec 05 12:43:49 crc kubenswrapper[4763]: I1205 12:43:49.563385 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerStarted","Data":"7fc5f0a4563d9fc623fe89d2790ce410750336c0faa3cea1c07e236f6e78e33a"} Dec 05 12:45:00 crc kubenswrapper[4763]: I1205 12:45:00.159122 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415645-77hjz"] Dec 05 12:45:00 crc kubenswrapper[4763]: E1205 12:45:00.160918 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04a602ec-84ad-4841-ae7b-342f1566e616" containerName="extract-utilities" Dec 05 12:45:00 crc kubenswrapper[4763]: I1205 12:45:00.160938 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="04a602ec-84ad-4841-ae7b-342f1566e616" containerName="extract-utilities" Dec 05 12:45:00 crc kubenswrapper[4763]: E1205 12:45:00.160994 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04a602ec-84ad-4841-ae7b-342f1566e616" containerName="extract-content" Dec 05 12:45:00 crc kubenswrapper[4763]: I1205 12:45:00.161006 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="04a602ec-84ad-4841-ae7b-342f1566e616" containerName="extract-content" Dec 05 12:45:00 crc kubenswrapper[4763]: E1205 12:45:00.161034 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04a602ec-84ad-4841-ae7b-342f1566e616" containerName="registry-server" Dec 05 
12:45:00 crc kubenswrapper[4763]: I1205 12:45:00.161044 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="04a602ec-84ad-4841-ae7b-342f1566e616" containerName="registry-server" Dec 05 12:45:00 crc kubenswrapper[4763]: I1205 12:45:00.161336 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="04a602ec-84ad-4841-ae7b-342f1566e616" containerName="registry-server" Dec 05 12:45:00 crc kubenswrapper[4763]: I1205 12:45:00.162570 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415645-77hjz" Dec 05 12:45:00 crc kubenswrapper[4763]: I1205 12:45:00.169231 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 12:45:00 crc kubenswrapper[4763]: I1205 12:45:00.169494 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 12:45:00 crc kubenswrapper[4763]: I1205 12:45:00.180973 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415645-77hjz"] Dec 05 12:45:00 crc kubenswrapper[4763]: I1205 12:45:00.183417 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830-secret-volume\") pod \"collect-profiles-29415645-77hjz\" (UID: \"e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415645-77hjz" Dec 05 12:45:00 crc kubenswrapper[4763]: I1205 12:45:00.183843 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n6gf\" (UniqueName: \"kubernetes.io/projected/e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830-kube-api-access-9n6gf\") pod \"collect-profiles-29415645-77hjz\" (UID: \"e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415645-77hjz" Dec 05 12:45:00 crc kubenswrapper[4763]: I1205 12:45:00.184056 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830-config-volume\") pod \"collect-profiles-29415645-77hjz\" (UID: \"e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415645-77hjz" Dec 05 12:45:00 crc kubenswrapper[4763]: I1205 12:45:00.285839 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n6gf\" (UniqueName: \"kubernetes.io/projected/e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830-kube-api-access-9n6gf\") pod \"collect-profiles-29415645-77hjz\" (UID: \"e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415645-77hjz" Dec 05 12:45:00 crc kubenswrapper[4763]: I1205 12:45:00.286211 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830-config-volume\") pod \"collect-profiles-29415645-77hjz\" (UID: \"e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415645-77hjz" Dec 05 12:45:00 crc kubenswrapper[4763]: I1205 12:45:00.286301 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" 
(UniqueName: \"kubernetes.io/secret/e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830-secret-volume\") pod \"collect-profiles-29415645-77hjz\" (UID: \"e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415645-77hjz" Dec 05 12:45:00 crc kubenswrapper[4763]: I1205 12:45:00.287679 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830-config-volume\") pod \"collect-profiles-29415645-77hjz\" (UID: \"e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415645-77hjz" Dec 05 12:45:00 crc kubenswrapper[4763]: I1205 12:45:00.303888 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830-secret-volume\") pod \"collect-profiles-29415645-77hjz\" (UID: \"e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415645-77hjz" Dec 05 12:45:00 crc kubenswrapper[4763]: I1205 12:45:00.304253 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n6gf\" (UniqueName: \"kubernetes.io/projected/e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830-kube-api-access-9n6gf\") pod \"collect-profiles-29415645-77hjz\" (UID: \"e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415645-77hjz" Dec 05 12:45:00 crc kubenswrapper[4763]: I1205 12:45:00.512309 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415645-77hjz" Dec 05 12:45:01 crc kubenswrapper[4763]: I1205 12:45:01.023697 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415645-77hjz"] Dec 05 12:45:01 crc kubenswrapper[4763]: I1205 12:45:01.272275 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415645-77hjz" event={"ID":"e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830","Type":"ContainerStarted","Data":"1f49393cf8dd15fb9a0ef3c2f07928ae7b982f41d3ca42b9521d10ecd102d23d"} Dec 05 12:45:01 crc kubenswrapper[4763]: I1205 12:45:01.272782 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415645-77hjz" event={"ID":"e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830","Type":"ContainerStarted","Data":"30c7bbb8b56a95745994bff3c48ea5d6d32b7a4d71c77c37ab8374b780d6cdaf"} Dec 05 12:45:01 crc kubenswrapper[4763]: I1205 12:45:01.290219 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29415645-77hjz" podStartSLOduration=1.290121992 podStartE2EDuration="1.290121992s" podCreationTimestamp="2025-12-05 12:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:45:01.288526019 +0000 UTC m=+3385.781240742" watchObservedRunningTime="2025-12-05 12:45:01.290121992 +0000 UTC m=+3385.782836715" Dec 05 12:45:02 crc kubenswrapper[4763]: I1205 12:45:02.285712 4763 generic.go:334] "Generic (PLEG): container finished" podID="e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830" containerID="1f49393cf8dd15fb9a0ef3c2f07928ae7b982f41d3ca42b9521d10ecd102d23d" exitCode=0 Dec 05 12:45:02 crc kubenswrapper[4763]: I1205 12:45:02.285827 4763 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415645-77hjz" event={"ID":"e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830","Type":"ContainerDied","Data":"1f49393cf8dd15fb9a0ef3c2f07928ae7b982f41d3ca42b9521d10ecd102d23d"} Dec 05 12:45:03 crc kubenswrapper[4763]: I1205 12:45:03.866995 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415645-77hjz" Dec 05 12:45:03 crc kubenswrapper[4763]: I1205 12:45:03.977790 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830-secret-volume\") pod \"e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830\" (UID: \"e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830\") " Dec 05 12:45:03 crc kubenswrapper[4763]: I1205 12:45:03.977894 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n6gf\" (UniqueName: \"kubernetes.io/projected/e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830-kube-api-access-9n6gf\") pod \"e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830\" (UID: \"e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830\") " Dec 05 12:45:03 crc kubenswrapper[4763]: I1205 12:45:03.978068 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830-config-volume\") pod \"e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830\" (UID: \"e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830\") " Dec 05 12:45:03 crc kubenswrapper[4763]: I1205 12:45:03.978925 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830-config-volume" (OuterVolumeSpecName: "config-volume") pod "e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830" (UID: "e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:45:03 crc kubenswrapper[4763]: I1205 12:45:03.986384 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830-kube-api-access-9n6gf" (OuterVolumeSpecName: "kube-api-access-9n6gf") pod "e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830" (UID: "e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830"). InnerVolumeSpecName "kube-api-access-9n6gf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:45:03 crc kubenswrapper[4763]: I1205 12:45:03.986548 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830" (UID: "e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:45:04 crc kubenswrapper[4763]: I1205 12:45:04.080651 4763 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 12:45:04 crc kubenswrapper[4763]: I1205 12:45:04.080721 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9n6gf\" (UniqueName: \"kubernetes.io/projected/e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830-kube-api-access-9n6gf\") on node \"crc\" DevicePath \"\"" Dec 05 12:45:04 crc kubenswrapper[4763]: I1205 12:45:04.080739 4763 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 12:45:04 crc kubenswrapper[4763]: I1205 12:45:04.331123 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415645-77hjz" event={"ID":"e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830","Type":"ContainerDied","Data":"30c7bbb8b56a95745994bff3c48ea5d6d32b7a4d71c77c37ab8374b780d6cdaf"} Dec 05 12:45:04 crc kubenswrapper[4763]: I1205 12:45:04.331185 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30c7bbb8b56a95745994bff3c48ea5d6d32b7a4d71c77c37ab8374b780d6cdaf" Dec 05 12:45:04 crc kubenswrapper[4763]: I1205 12:45:04.331275 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415645-77hjz" Dec 05 12:45:04 crc kubenswrapper[4763]: I1205 12:45:04.387730 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415600-7v5hk"] Dec 05 12:45:04 crc kubenswrapper[4763]: I1205 12:45:04.400327 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415600-7v5hk"] Dec 05 12:45:05 crc kubenswrapper[4763]: I1205 12:45:05.795844 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="383179f3-9d26-4bf0-ae07-1c96400ecf60" path="/var/lib/kubelet/pods/383179f3-9d26-4bf0-ae07-1c96400ecf60/volumes" Dec 05 12:45:52 crc kubenswrapper[4763]: I1205 12:45:52.692545 4763 scope.go:117] "RemoveContainer" containerID="284b2ace5698f5ba314d5d1742d62f11ca2acf78da9116d53ff093f544f1857d" Dec 05 12:46:07 crc kubenswrapper[4763]: I1205 12:46:07.544425 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 12:46:07 crc kubenswrapper[4763]: I1205 12:46:07.544986 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 12:46:19 crc kubenswrapper[4763]: I1205 12:46:19.005092 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7txmm"] Dec 05 12:46:19 crc kubenswrapper[4763]: E1205 12:46:19.006205 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830" 
containerName="collect-profiles" Dec 05 12:46:19 crc kubenswrapper[4763]: I1205 12:46:19.006220 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830" containerName="collect-profiles" Dec 05 12:46:19 crc kubenswrapper[4763]: I1205 12:46:19.006438 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830" containerName="collect-profiles" Dec 05 12:46:19 crc kubenswrapper[4763]: I1205 12:46:19.007973 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7txmm" Dec 05 12:46:19 crc kubenswrapper[4763]: I1205 12:46:19.018191 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7txmm"] Dec 05 12:46:19 crc kubenswrapper[4763]: I1205 12:46:19.169823 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzftp\" (UniqueName: \"kubernetes.io/projected/1a9a7a18-1f4d-43d0-be45-b39d88376c14-kube-api-access-bzftp\") pod \"redhat-marketplace-7txmm\" (UID: \"1a9a7a18-1f4d-43d0-be45-b39d88376c14\") " pod="openshift-marketplace/redhat-marketplace-7txmm" Dec 05 12:46:19 crc kubenswrapper[4763]: I1205 12:46:19.170383 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a9a7a18-1f4d-43d0-be45-b39d88376c14-catalog-content\") pod \"redhat-marketplace-7txmm\" (UID: \"1a9a7a18-1f4d-43d0-be45-b39d88376c14\") " pod="openshift-marketplace/redhat-marketplace-7txmm" Dec 05 12:46:19 crc kubenswrapper[4763]: I1205 12:46:19.170542 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a9a7a18-1f4d-43d0-be45-b39d88376c14-utilities\") pod \"redhat-marketplace-7txmm\" (UID: \"1a9a7a18-1f4d-43d0-be45-b39d88376c14\") " pod="openshift-marketplace/redhat-marketplace-7txmm" Dec 05 12:46:19 crc kubenswrapper[4763]: I1205 12:46:19.272372 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a9a7a18-1f4d-43d0-be45-b39d88376c14-utilities\") pod \"redhat-marketplace-7txmm\" (UID: \"1a9a7a18-1f4d-43d0-be45-b39d88376c14\") " pod="openshift-marketplace/redhat-marketplace-7txmm" Dec 05 12:46:19 crc kubenswrapper[4763]: I1205 12:46:19.272505 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzftp\" (UniqueName: \"kubernetes.io/projected/1a9a7a18-1f4d-43d0-be45-b39d88376c14-kube-api-access-bzftp\") pod \"redhat-marketplace-7txmm\" (UID: \"1a9a7a18-1f4d-43d0-be45-b39d88376c14\") " pod="openshift-marketplace/redhat-marketplace-7txmm" Dec 05 12:46:19 crc kubenswrapper[4763]: I1205 12:46:19.272563 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a9a7a18-1f4d-43d0-be45-b39d88376c14-catalog-content\") pod \"redhat-marketplace-7txmm\" (UID: \"1a9a7a18-1f4d-43d0-be45-b39d88376c14\") " pod="openshift-marketplace/redhat-marketplace-7txmm" Dec 05 12:46:19 crc kubenswrapper[4763]: I1205 12:46:19.273106 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a9a7a18-1f4d-43d0-be45-b39d88376c14-catalog-content\") pod \"redhat-marketplace-7txmm\" (UID: \"1a9a7a18-1f4d-43d0-be45-b39d88376c14\") " 
pod="openshift-marketplace/redhat-marketplace-7txmm" Dec 05 12:46:19 crc kubenswrapper[4763]: I1205 12:46:19.273973 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a9a7a18-1f4d-43d0-be45-b39d88376c14-utilities\") pod \"redhat-marketplace-7txmm\" (UID: \"1a9a7a18-1f4d-43d0-be45-b39d88376c14\") " pod="openshift-marketplace/redhat-marketplace-7txmm" Dec 05 12:46:19 crc kubenswrapper[4763]: I1205 12:46:19.295381 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzftp\" (UniqueName: \"kubernetes.io/projected/1a9a7a18-1f4d-43d0-be45-b39d88376c14-kube-api-access-bzftp\") pod \"redhat-marketplace-7txmm\" (UID: \"1a9a7a18-1f4d-43d0-be45-b39d88376c14\") " pod="openshift-marketplace/redhat-marketplace-7txmm" Dec 05 12:46:19 crc kubenswrapper[4763]: I1205 12:46:19.344003 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7txmm" Dec 05 12:46:19 crc kubenswrapper[4763]: I1205 12:46:19.849794 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7txmm"] Dec 05 12:46:20 crc kubenswrapper[4763]: I1205 12:46:20.103562 4763 generic.go:334] "Generic (PLEG): container finished" podID="1a9a7a18-1f4d-43d0-be45-b39d88376c14" containerID="8c29fb36882cadc7485c4dea7a3d0514f99bf4a9ca9bc265b30a2c67b9d29187" exitCode=0 Dec 05 12:46:20 crc kubenswrapper[4763]: I1205 12:46:20.103698 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7txmm" event={"ID":"1a9a7a18-1f4d-43d0-be45-b39d88376c14","Type":"ContainerDied","Data":"8c29fb36882cadc7485c4dea7a3d0514f99bf4a9ca9bc265b30a2c67b9d29187"} Dec 05 12:46:20 crc kubenswrapper[4763]: I1205 12:46:20.104052 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7txmm" event={"ID":"1a9a7a18-1f4d-43d0-be45-b39d88376c14","Type":"ContainerStarted","Data":"5112f240f09d2aa8aa258def1100728430eda46acf05aee4cc8da57f598340ba"} Dec 05 12:46:20 crc kubenswrapper[4763]: I1205 12:46:20.107804 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 12:46:22 crc kubenswrapper[4763]: I1205 12:46:22.130616 4763 generic.go:334] "Generic (PLEG): container finished" podID="1a9a7a18-1f4d-43d0-be45-b39d88376c14" containerID="85a3f844e183d555197e1cd47c02196cf967e77160648ab087e0339059042c0d" exitCode=0 Dec 05 12:46:22 crc kubenswrapper[4763]: I1205 12:46:22.130835 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7txmm" event={"ID":"1a9a7a18-1f4d-43d0-be45-b39d88376c14","Type":"ContainerDied","Data":"85a3f844e183d555197e1cd47c02196cf967e77160648ab087e0339059042c0d"} Dec 05 12:46:23 crc kubenswrapper[4763]: I1205 12:46:23.142993 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7txmm" event={"ID":"1a9a7a18-1f4d-43d0-be45-b39d88376c14","Type":"ContainerStarted","Data":"939ca8bf900e6db1538143cf78c1667e35fdc9f501ddafc916d86450c4565b92"} Dec 05 12:46:23 crc kubenswrapper[4763]: I1205 12:46:23.171738 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7txmm" podStartSLOduration=2.6546292449999997 podStartE2EDuration="5.17171373s" podCreationTimestamp="2025-12-05 12:46:18 +0000 UTC" firstStartedPulling="2025-12-05 12:46:20.107472809 +0000 UTC m=+3464.600187532" 
lastFinishedPulling="2025-12-05 12:46:22.624557294 +0000 UTC m=+3467.117272017" observedRunningTime="2025-12-05 12:46:23.16174336 +0000 UTC m=+3467.654458083" watchObservedRunningTime="2025-12-05 12:46:23.17171373 +0000 UTC m=+3467.664428453" Dec 05 12:46:29 crc kubenswrapper[4763]: I1205 12:46:29.344607 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7txmm" Dec 05 12:46:29 crc kubenswrapper[4763]: I1205 12:46:29.345228 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7txmm" Dec 05 12:46:29 crc kubenswrapper[4763]: I1205 12:46:29.400024 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7txmm" Dec 05 12:46:30 crc kubenswrapper[4763]: I1205 12:46:30.257610 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7txmm" Dec 05 12:46:30 crc kubenswrapper[4763]: I1205 12:46:30.306440 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7txmm"] Dec 05 12:46:32 crc kubenswrapper[4763]: I1205 12:46:32.231880 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7txmm" podUID="1a9a7a18-1f4d-43d0-be45-b39d88376c14" containerName="registry-server" containerID="cri-o://939ca8bf900e6db1538143cf78c1667e35fdc9f501ddafc916d86450c4565b92" gracePeriod=2 Dec 05 12:46:32 crc kubenswrapper[4763]: I1205 12:46:32.738605 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7txmm" Dec 05 12:46:32 crc kubenswrapper[4763]: I1205 12:46:32.875926 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a9a7a18-1f4d-43d0-be45-b39d88376c14-catalog-content\") pod \"1a9a7a18-1f4d-43d0-be45-b39d88376c14\" (UID: \"1a9a7a18-1f4d-43d0-be45-b39d88376c14\") " Dec 05 12:46:32 crc kubenswrapper[4763]: I1205 12:46:32.876057 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a9a7a18-1f4d-43d0-be45-b39d88376c14-utilities\") pod \"1a9a7a18-1f4d-43d0-be45-b39d88376c14\" (UID: \"1a9a7a18-1f4d-43d0-be45-b39d88376c14\") " Dec 05 12:46:32 crc kubenswrapper[4763]: I1205 12:46:32.876360 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzftp\" (UniqueName: \"kubernetes.io/projected/1a9a7a18-1f4d-43d0-be45-b39d88376c14-kube-api-access-bzftp\") pod \"1a9a7a18-1f4d-43d0-be45-b39d88376c14\" (UID: \"1a9a7a18-1f4d-43d0-be45-b39d88376c14\") " Dec 05 12:46:32 crc kubenswrapper[4763]: I1205 12:46:32.876948 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a9a7a18-1f4d-43d0-be45-b39d88376c14-utilities" (OuterVolumeSpecName: "utilities") pod "1a9a7a18-1f4d-43d0-be45-b39d88376c14" (UID: "1a9a7a18-1f4d-43d0-be45-b39d88376c14"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:46:32 crc kubenswrapper[4763]: I1205 12:46:32.878435 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a9a7a18-1f4d-43d0-be45-b39d88376c14-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 12:46:32 crc kubenswrapper[4763]: I1205 12:46:32.883210 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a9a7a18-1f4d-43d0-be45-b39d88376c14-kube-api-access-bzftp" (OuterVolumeSpecName: "kube-api-access-bzftp") pod "1a9a7a18-1f4d-43d0-be45-b39d88376c14" (UID: "1a9a7a18-1f4d-43d0-be45-b39d88376c14"). InnerVolumeSpecName "kube-api-access-bzftp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:46:32 crc kubenswrapper[4763]: I1205 12:46:32.894146 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a9a7a18-1f4d-43d0-be45-b39d88376c14-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1a9a7a18-1f4d-43d0-be45-b39d88376c14" (UID: "1a9a7a18-1f4d-43d0-be45-b39d88376c14"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:46:32 crc kubenswrapper[4763]: I1205 12:46:32.980244 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzftp\" (UniqueName: \"kubernetes.io/projected/1a9a7a18-1f4d-43d0-be45-b39d88376c14-kube-api-access-bzftp\") on node \"crc\" DevicePath \"\"" Dec 05 12:46:32 crc kubenswrapper[4763]: I1205 12:46:32.980692 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a9a7a18-1f4d-43d0-be45-b39d88376c14-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 12:46:33 crc kubenswrapper[4763]: I1205 12:46:33.246035 4763 generic.go:334] "Generic (PLEG): container finished" podID="1a9a7a18-1f4d-43d0-be45-b39d88376c14" containerID="939ca8bf900e6db1538143cf78c1667e35fdc9f501ddafc916d86450c4565b92" exitCode=0 Dec 05 12:46:33 crc kubenswrapper[4763]: I1205 12:46:33.246091 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7txmm" event={"ID":"1a9a7a18-1f4d-43d0-be45-b39d88376c14","Type":"ContainerDied","Data":"939ca8bf900e6db1538143cf78c1667e35fdc9f501ddafc916d86450c4565b92"} Dec 05 12:46:33 crc kubenswrapper[4763]: I1205 12:46:33.246126 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7txmm" Dec 05 12:46:33 crc kubenswrapper[4763]: I1205 12:46:33.246142 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7txmm" event={"ID":"1a9a7a18-1f4d-43d0-be45-b39d88376c14","Type":"ContainerDied","Data":"5112f240f09d2aa8aa258def1100728430eda46acf05aee4cc8da57f598340ba"} Dec 05 12:46:33 crc kubenswrapper[4763]: I1205 12:46:33.246165 4763 scope.go:117] "RemoveContainer" containerID="939ca8bf900e6db1538143cf78c1667e35fdc9f501ddafc916d86450c4565b92" Dec 05 12:46:33 crc kubenswrapper[4763]: I1205 12:46:33.273800 4763 scope.go:117] "RemoveContainer" containerID="85a3f844e183d555197e1cd47c02196cf967e77160648ab087e0339059042c0d" Dec 05 12:46:33 crc kubenswrapper[4763]: I1205 12:46:33.298772 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7txmm"] Dec 05 12:46:33 crc kubenswrapper[4763]: I1205 12:46:33.306080 4763 scope.go:117] "RemoveContainer" containerID="8c29fb36882cadc7485c4dea7a3d0514f99bf4a9ca9bc265b30a2c67b9d29187" Dec 05 12:46:33 crc kubenswrapper[4763]: I1205 12:46:33.307852 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7txmm"] Dec 05 12:46:33 crc kubenswrapper[4763]: I1205 12:46:33.357489 4763 scope.go:117] "RemoveContainer" containerID="939ca8bf900e6db1538143cf78c1667e35fdc9f501ddafc916d86450c4565b92" Dec 05 12:46:33 crc kubenswrapper[4763]: E1205 12:46:33.358074 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"939ca8bf900e6db1538143cf78c1667e35fdc9f501ddafc916d86450c4565b92\": container with ID starting with 939ca8bf900e6db1538143cf78c1667e35fdc9f501ddafc916d86450c4565b92 not found: ID does not exist" containerID="939ca8bf900e6db1538143cf78c1667e35fdc9f501ddafc916d86450c4565b92" Dec 05 12:46:33 crc kubenswrapper[4763]: I1205 12:46:33.358216 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"939ca8bf900e6db1538143cf78c1667e35fdc9f501ddafc916d86450c4565b92"} err="failed to get container status \"939ca8bf900e6db1538143cf78c1667e35fdc9f501ddafc916d86450c4565b92\": rpc error: code = NotFound desc = could not find container \"939ca8bf900e6db1538143cf78c1667e35fdc9f501ddafc916d86450c4565b92\": container with ID starting with 939ca8bf900e6db1538143cf78c1667e35fdc9f501ddafc916d86450c4565b92 not found: ID does not exist" Dec 05 12:46:33 crc kubenswrapper[4763]: I1205 12:46:33.358315 4763 scope.go:117] "RemoveContainer" containerID="85a3f844e183d555197e1cd47c02196cf967e77160648ab087e0339059042c0d" Dec 05 12:46:33 crc kubenswrapper[4763]: E1205 12:46:33.358850 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85a3f844e183d555197e1cd47c02196cf967e77160648ab087e0339059042c0d\": container with ID starting with 85a3f844e183d555197e1cd47c02196cf967e77160648ab087e0339059042c0d not found: ID does not exist" containerID="85a3f844e183d555197e1cd47c02196cf967e77160648ab087e0339059042c0d" Dec 05 12:46:33 crc kubenswrapper[4763]: I1205 12:46:33.358917 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85a3f844e183d555197e1cd47c02196cf967e77160648ab087e0339059042c0d"} err="failed to get container status \"85a3f844e183d555197e1cd47c02196cf967e77160648ab087e0339059042c0d\": rpc error: code = NotFound desc = could not find 
container \"85a3f844e183d555197e1cd47c02196cf967e77160648ab087e0339059042c0d\": container with ID starting with 85a3f844e183d555197e1cd47c02196cf967e77160648ab087e0339059042c0d not found: ID does not exist" Dec 05 12:46:33 crc kubenswrapper[4763]: I1205 12:46:33.358961 4763 scope.go:117] "RemoveContainer" containerID="8c29fb36882cadc7485c4dea7a3d0514f99bf4a9ca9bc265b30a2c67b9d29187" Dec 05 12:46:33 crc kubenswrapper[4763]: E1205 12:46:33.359554 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c29fb36882cadc7485c4dea7a3d0514f99bf4a9ca9bc265b30a2c67b9d29187\": container with ID starting with 8c29fb36882cadc7485c4dea7a3d0514f99bf4a9ca9bc265b30a2c67b9d29187 not found: ID does not exist" containerID="8c29fb36882cadc7485c4dea7a3d0514f99bf4a9ca9bc265b30a2c67b9d29187" Dec 05 12:46:33 crc kubenswrapper[4763]: I1205 12:46:33.359706 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c29fb36882cadc7485c4dea7a3d0514f99bf4a9ca9bc265b30a2c67b9d29187"} err="failed to get container status \"8c29fb36882cadc7485c4dea7a3d0514f99bf4a9ca9bc265b30a2c67b9d29187\": rpc error: code = NotFound desc = could not find container \"8c29fb36882cadc7485c4dea7a3d0514f99bf4a9ca9bc265b30a2c67b9d29187\": container with ID starting with 8c29fb36882cadc7485c4dea7a3d0514f99bf4a9ca9bc265b30a2c67b9d29187 not found: ID does not exist" Dec 05 12:46:33 crc kubenswrapper[4763]: I1205 12:46:33.808881 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a9a7a18-1f4d-43d0-be45-b39d88376c14" path="/var/lib/kubelet/pods/1a9a7a18-1f4d-43d0-be45-b39d88376c14/volumes" Dec 05 12:46:36 crc kubenswrapper[4763]: I1205 12:46:36.265533 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9vz24"] Dec 05 12:46:36 crc kubenswrapper[4763]: E1205 12:46:36.267321 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a9a7a18-1f4d-43d0-be45-b39d88376c14" containerName="extract-utilities" Dec 05 12:46:36 crc kubenswrapper[4763]: I1205 12:46:36.267338 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a9a7a18-1f4d-43d0-be45-b39d88376c14" containerName="extract-utilities" Dec 05 12:46:36 crc kubenswrapper[4763]: E1205 12:46:36.267377 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a9a7a18-1f4d-43d0-be45-b39d88376c14" containerName="registry-server" Dec 05 12:46:36 crc kubenswrapper[4763]: I1205 12:46:36.269705 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a9a7a18-1f4d-43d0-be45-b39d88376c14" containerName="registry-server" Dec 05 12:46:36 crc kubenswrapper[4763]: E1205 12:46:36.269749 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a9a7a18-1f4d-43d0-be45-b39d88376c14" containerName="extract-content" Dec 05 12:46:36 crc kubenswrapper[4763]: I1205 12:46:36.269779 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a9a7a18-1f4d-43d0-be45-b39d88376c14" containerName="extract-content" Dec 05 12:46:36 crc kubenswrapper[4763]: I1205 12:46:36.270041 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a9a7a18-1f4d-43d0-be45-b39d88376c14" containerName="registry-server" Dec 05 12:46:36 crc kubenswrapper[4763]: I1205 12:46:36.272180 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9vz24" Dec 05 12:46:36 crc kubenswrapper[4763]: I1205 12:46:36.282846 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9vz24"] Dec 05 12:46:36 crc kubenswrapper[4763]: I1205 12:46:36.379407 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa9c56b4-dd2f-4227-902d-0cd07745e365-catalog-content\") pod \"community-operators-9vz24\" (UID: \"aa9c56b4-dd2f-4227-902d-0cd07745e365\") " pod="openshift-marketplace/community-operators-9vz24" Dec 05 12:46:36 crc kubenswrapper[4763]: I1205 12:46:36.379652 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvv9f\" (UniqueName: \"kubernetes.io/projected/aa9c56b4-dd2f-4227-902d-0cd07745e365-kube-api-access-qvv9f\") pod \"community-operators-9vz24\" (UID: \"aa9c56b4-dd2f-4227-902d-0cd07745e365\") " pod="openshift-marketplace/community-operators-9vz24" Dec 05 12:46:36 crc kubenswrapper[4763]: I1205 12:46:36.380059 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa9c56b4-dd2f-4227-902d-0cd07745e365-utilities\") pod \"community-operators-9vz24\" (UID: \"aa9c56b4-dd2f-4227-902d-0cd07745e365\") " pod="openshift-marketplace/community-operators-9vz24" Dec 05 12:46:36 crc kubenswrapper[4763]: I1205 12:46:36.482496 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvv9f\" (UniqueName: \"kubernetes.io/projected/aa9c56b4-dd2f-4227-902d-0cd07745e365-kube-api-access-qvv9f\") pod \"community-operators-9vz24\" (UID: \"aa9c56b4-dd2f-4227-902d-0cd07745e365\") " pod="openshift-marketplace/community-operators-9vz24" Dec 05 12:46:36 crc kubenswrapper[4763]: I1205 12:46:36.482632 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa9c56b4-dd2f-4227-902d-0cd07745e365-utilities\") pod \"community-operators-9vz24\" (UID: \"aa9c56b4-dd2f-4227-902d-0cd07745e365\") " pod="openshift-marketplace/community-operators-9vz24" Dec 05 12:46:36 crc kubenswrapper[4763]: I1205 12:46:36.482672 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa9c56b4-dd2f-4227-902d-0cd07745e365-catalog-content\") pod \"community-operators-9vz24\" (UID: \"aa9c56b4-dd2f-4227-902d-0cd07745e365\") " pod="openshift-marketplace/community-operators-9vz24" Dec 05 12:46:36 crc kubenswrapper[4763]: I1205 12:46:36.483522 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa9c56b4-dd2f-4227-902d-0cd07745e365-utilities\") pod \"community-operators-9vz24\" (UID: \"aa9c56b4-dd2f-4227-902d-0cd07745e365\") " pod="openshift-marketplace/community-operators-9vz24" Dec 05 12:46:36 crc kubenswrapper[4763]: I1205 12:46:36.483681 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa9c56b4-dd2f-4227-902d-0cd07745e365-catalog-content\") pod \"community-operators-9vz24\" (UID: \"aa9c56b4-dd2f-4227-902d-0cd07745e365\") " pod="openshift-marketplace/community-operators-9vz24" Dec 05 12:46:36 crc kubenswrapper[4763]: I1205 12:46:36.513123 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qvv9f\" (UniqueName: \"kubernetes.io/projected/aa9c56b4-dd2f-4227-902d-0cd07745e365-kube-api-access-qvv9f\") pod \"community-operators-9vz24\" (UID: \"aa9c56b4-dd2f-4227-902d-0cd07745e365\") " pod="openshift-marketplace/community-operators-9vz24" Dec 05 12:46:36 crc kubenswrapper[4763]: I1205 12:46:36.605818 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9vz24" Dec 05 12:46:37 crc kubenswrapper[4763]: I1205 12:46:37.183728 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9vz24"] Dec 05 12:46:37 crc kubenswrapper[4763]: W1205 12:46:37.190190 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa9c56b4_dd2f_4227_902d_0cd07745e365.slice/crio-92b6c50b47c4c15c147c681f5e84bb1106def4c1f6e8a7abfb5c87697da12b91 WatchSource:0}: Error finding container 92b6c50b47c4c15c147c681f5e84bb1106def4c1f6e8a7abfb5c87697da12b91: Status 404 returned error can't find the container with id 92b6c50b47c4c15c147c681f5e84bb1106def4c1f6e8a7abfb5c87697da12b91 Dec 05 12:46:37 crc kubenswrapper[4763]: I1205 12:46:37.290673 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9vz24" event={"ID":"aa9c56b4-dd2f-4227-902d-0cd07745e365","Type":"ContainerStarted","Data":"92b6c50b47c4c15c147c681f5e84bb1106def4c1f6e8a7abfb5c87697da12b91"} Dec 05 12:46:37 crc kubenswrapper[4763]: I1205 12:46:37.544313 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 12:46:37 crc kubenswrapper[4763]: I1205 12:46:37.544386 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 12:46:38 crc kubenswrapper[4763]: I1205 12:46:38.306526 4763 generic.go:334] "Generic (PLEG): container finished" podID="aa9c56b4-dd2f-4227-902d-0cd07745e365" containerID="fd44020d412a21c36c64ed3e8c880ef12d3184fde9d6b48674d284bbcbf9f6bd" exitCode=0 Dec 05 12:46:38 crc kubenswrapper[4763]: I1205 12:46:38.306648 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9vz24" event={"ID":"aa9c56b4-dd2f-4227-902d-0cd07745e365","Type":"ContainerDied","Data":"fd44020d412a21c36c64ed3e8c880ef12d3184fde9d6b48674d284bbcbf9f6bd"} Dec 05 12:46:39 crc kubenswrapper[4763]: I1205 12:46:39.321651 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9vz24" event={"ID":"aa9c56b4-dd2f-4227-902d-0cd07745e365","Type":"ContainerStarted","Data":"f79a9ea56a8290606e450f5049e12eaea3a2eaf95b189f52975bbcf69f6c3bc1"} Dec 05 12:46:40 crc kubenswrapper[4763]: I1205 12:46:40.332839 4763 generic.go:334] "Generic (PLEG): container finished" podID="aa9c56b4-dd2f-4227-902d-0cd07745e365" containerID="f79a9ea56a8290606e450f5049e12eaea3a2eaf95b189f52975bbcf69f6c3bc1" exitCode=0 Dec 05 12:46:40 crc kubenswrapper[4763]: I1205 12:46:40.333169 4763 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/community-operators-9vz24" event={"ID":"aa9c56b4-dd2f-4227-902d-0cd07745e365","Type":"ContainerDied","Data":"f79a9ea56a8290606e450f5049e12eaea3a2eaf95b189f52975bbcf69f6c3bc1"} Dec 05 12:46:41 crc kubenswrapper[4763]: I1205 12:46:41.342504 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9vz24" event={"ID":"aa9c56b4-dd2f-4227-902d-0cd07745e365","Type":"ContainerStarted","Data":"7989fb0775f564dd3c8c3d545a44f1c6d430193979580d53c8553b61f4997ea8"} Dec 05 12:46:41 crc kubenswrapper[4763]: I1205 12:46:41.376097 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9vz24" podStartSLOduration=2.827965616 podStartE2EDuration="5.376077789s" podCreationTimestamp="2025-12-05 12:46:36 +0000 UTC" firstStartedPulling="2025-12-05 12:46:38.309207478 +0000 UTC m=+3482.801922201" lastFinishedPulling="2025-12-05 12:46:40.857319631 +0000 UTC m=+3485.350034374" observedRunningTime="2025-12-05 12:46:41.367364793 +0000 UTC m=+3485.860079526" watchObservedRunningTime="2025-12-05 12:46:41.376077789 +0000 UTC m=+3485.868792522" Dec 05 12:46:46 crc kubenswrapper[4763]: I1205 12:46:46.607958 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9vz24" Dec 05 12:46:46 crc kubenswrapper[4763]: I1205 12:46:46.610092 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9vz24" Dec 05 12:46:46 crc kubenswrapper[4763]: I1205 12:46:46.663991 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9vz24" Dec 05 12:46:47 crc kubenswrapper[4763]: I1205 12:46:47.459850 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9vz24" Dec 05 12:46:47 crc kubenswrapper[4763]: I1205 12:46:47.511139 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9vz24"] Dec 05 12:46:49 crc kubenswrapper[4763]: I1205 12:46:49.841322 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9vz24" podUID="aa9c56b4-dd2f-4227-902d-0cd07745e365" containerName="registry-server" containerID="cri-o://7989fb0775f564dd3c8c3d545a44f1c6d430193979580d53c8553b61f4997ea8" gracePeriod=2 Dec 05 12:46:50 crc kubenswrapper[4763]: I1205 12:46:50.354930 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9vz24" Dec 05 12:46:50 crc kubenswrapper[4763]: I1205 12:46:50.550465 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvv9f\" (UniqueName: \"kubernetes.io/projected/aa9c56b4-dd2f-4227-902d-0cd07745e365-kube-api-access-qvv9f\") pod \"aa9c56b4-dd2f-4227-902d-0cd07745e365\" (UID: \"aa9c56b4-dd2f-4227-902d-0cd07745e365\") " Dec 05 12:46:50 crc kubenswrapper[4763]: I1205 12:46:50.550813 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa9c56b4-dd2f-4227-902d-0cd07745e365-catalog-content\") pod \"aa9c56b4-dd2f-4227-902d-0cd07745e365\" (UID: \"aa9c56b4-dd2f-4227-902d-0cd07745e365\") " Dec 05 12:46:50 crc kubenswrapper[4763]: I1205 12:46:50.550972 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa9c56b4-dd2f-4227-902d-0cd07745e365-utilities\") pod \"aa9c56b4-dd2f-4227-902d-0cd07745e365\" (UID: \"aa9c56b4-dd2f-4227-902d-0cd07745e365\") " Dec 05 12:46:50 crc kubenswrapper[4763]: I1205 12:46:50.552447 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa9c56b4-dd2f-4227-902d-0cd07745e365-utilities" (OuterVolumeSpecName: "utilities") pod "aa9c56b4-dd2f-4227-902d-0cd07745e365" (UID: "aa9c56b4-dd2f-4227-902d-0cd07745e365"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:46:50 crc kubenswrapper[4763]: I1205 12:46:50.557580 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa9c56b4-dd2f-4227-902d-0cd07745e365-kube-api-access-qvv9f" (OuterVolumeSpecName: "kube-api-access-qvv9f") pod "aa9c56b4-dd2f-4227-902d-0cd07745e365" (UID: "aa9c56b4-dd2f-4227-902d-0cd07745e365"). InnerVolumeSpecName "kube-api-access-qvv9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:46:50 crc kubenswrapper[4763]: I1205 12:46:50.600380 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa9c56b4-dd2f-4227-902d-0cd07745e365-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aa9c56b4-dd2f-4227-902d-0cd07745e365" (UID: "aa9c56b4-dd2f-4227-902d-0cd07745e365"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:46:50 crc kubenswrapper[4763]: I1205 12:46:50.653092 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa9c56b4-dd2f-4227-902d-0cd07745e365-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 12:46:50 crc kubenswrapper[4763]: I1205 12:46:50.653146 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvv9f\" (UniqueName: \"kubernetes.io/projected/aa9c56b4-dd2f-4227-902d-0cd07745e365-kube-api-access-qvv9f\") on node \"crc\" DevicePath \"\"" Dec 05 12:46:50 crc kubenswrapper[4763]: I1205 12:46:50.653159 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa9c56b4-dd2f-4227-902d-0cd07745e365-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 12:46:50 crc kubenswrapper[4763]: I1205 12:46:50.853277 4763 generic.go:334] "Generic (PLEG): container finished" podID="aa9c56b4-dd2f-4227-902d-0cd07745e365" containerID="7989fb0775f564dd3c8c3d545a44f1c6d430193979580d53c8553b61f4997ea8" exitCode=0 Dec 05 12:46:50 crc kubenswrapper[4763]: I1205 12:46:50.853515 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9vz24" event={"ID":"aa9c56b4-dd2f-4227-902d-0cd07745e365","Type":"ContainerDied","Data":"7989fb0775f564dd3c8c3d545a44f1c6d430193979580d53c8553b61f4997ea8"} Dec 05 12:46:50 crc kubenswrapper[4763]: I1205 12:46:50.853593 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9vz24" event={"ID":"aa9c56b4-dd2f-4227-902d-0cd07745e365","Type":"ContainerDied","Data":"92b6c50b47c4c15c147c681f5e84bb1106def4c1f6e8a7abfb5c87697da12b91"} Dec 05 12:46:50 crc kubenswrapper[4763]: I1205 12:46:50.853617 4763 scope.go:117] "RemoveContainer" containerID="7989fb0775f564dd3c8c3d545a44f1c6d430193979580d53c8553b61f4997ea8" Dec 05 12:46:50 crc kubenswrapper[4763]: I1205 12:46:50.853684 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9vz24" Dec 05 12:46:50 crc kubenswrapper[4763]: I1205 12:46:50.886820 4763 scope.go:117] "RemoveContainer" containerID="f79a9ea56a8290606e450f5049e12eaea3a2eaf95b189f52975bbcf69f6c3bc1" Dec 05 12:46:50 crc kubenswrapper[4763]: I1205 12:46:50.900259 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9vz24"] Dec 05 12:46:50 crc kubenswrapper[4763]: I1205 12:46:50.931083 4763 scope.go:117] "RemoveContainer" containerID="fd44020d412a21c36c64ed3e8c880ef12d3184fde9d6b48674d284bbcbf9f6bd" Dec 05 12:46:50 crc kubenswrapper[4763]: I1205 12:46:50.934480 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9vz24"] Dec 05 12:46:50 crc kubenswrapper[4763]: I1205 12:46:50.991720 4763 scope.go:117] "RemoveContainer" containerID="7989fb0775f564dd3c8c3d545a44f1c6d430193979580d53c8553b61f4997ea8" Dec 05 12:46:50 crc kubenswrapper[4763]: E1205 12:46:50.992453 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7989fb0775f564dd3c8c3d545a44f1c6d430193979580d53c8553b61f4997ea8\": container with ID starting with 7989fb0775f564dd3c8c3d545a44f1c6d430193979580d53c8553b61f4997ea8 not found: ID does not exist" containerID="7989fb0775f564dd3c8c3d545a44f1c6d430193979580d53c8553b61f4997ea8" Dec 05 12:46:50 crc kubenswrapper[4763]: I1205 12:46:50.992538 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7989fb0775f564dd3c8c3d545a44f1c6d430193979580d53c8553b61f4997ea8"} err="failed to get container status \"7989fb0775f564dd3c8c3d545a44f1c6d430193979580d53c8553b61f4997ea8\": rpc error: code = NotFound desc = could not find container \"7989fb0775f564dd3c8c3d545a44f1c6d430193979580d53c8553b61f4997ea8\": container with ID starting with 7989fb0775f564dd3c8c3d545a44f1c6d430193979580d53c8553b61f4997ea8 not found: ID does not exist" Dec 05 12:46:50 crc kubenswrapper[4763]: I1205 12:46:50.992599 4763 scope.go:117] "RemoveContainer" containerID="f79a9ea56a8290606e450f5049e12eaea3a2eaf95b189f52975bbcf69f6c3bc1" Dec 05 12:46:50 crc kubenswrapper[4763]: E1205 12:46:50.993211 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f79a9ea56a8290606e450f5049e12eaea3a2eaf95b189f52975bbcf69f6c3bc1\": container with ID starting with f79a9ea56a8290606e450f5049e12eaea3a2eaf95b189f52975bbcf69f6c3bc1 not found: ID does not exist" containerID="f79a9ea56a8290606e450f5049e12eaea3a2eaf95b189f52975bbcf69f6c3bc1" Dec 05 12:46:50 crc kubenswrapper[4763]: I1205 12:46:50.993304 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f79a9ea56a8290606e450f5049e12eaea3a2eaf95b189f52975bbcf69f6c3bc1"} err="failed to get container status \"f79a9ea56a8290606e450f5049e12eaea3a2eaf95b189f52975bbcf69f6c3bc1\": rpc error: code = NotFound desc = could not find container \"f79a9ea56a8290606e450f5049e12eaea3a2eaf95b189f52975bbcf69f6c3bc1\": container with ID starting with f79a9ea56a8290606e450f5049e12eaea3a2eaf95b189f52975bbcf69f6c3bc1 not found: ID does not exist" Dec 05 12:46:50 crc kubenswrapper[4763]: I1205 12:46:50.993343 4763 scope.go:117] "RemoveContainer" containerID="fd44020d412a21c36c64ed3e8c880ef12d3184fde9d6b48674d284bbcbf9f6bd" Dec 05 12:46:50 crc kubenswrapper[4763]: E1205 12:46:50.993935 4763 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"fd44020d412a21c36c64ed3e8c880ef12d3184fde9d6b48674d284bbcbf9f6bd\": container with ID starting with fd44020d412a21c36c64ed3e8c880ef12d3184fde9d6b48674d284bbcbf9f6bd not found: ID does not exist" containerID="fd44020d412a21c36c64ed3e8c880ef12d3184fde9d6b48674d284bbcbf9f6bd" Dec 05 12:46:50 crc kubenswrapper[4763]: I1205 12:46:50.993982 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd44020d412a21c36c64ed3e8c880ef12d3184fde9d6b48674d284bbcbf9f6bd"} err="failed to get container status \"fd44020d412a21c36c64ed3e8c880ef12d3184fde9d6b48674d284bbcbf9f6bd\": rpc error: code = NotFound desc = could not find container \"fd44020d412a21c36c64ed3e8c880ef12d3184fde9d6b48674d284bbcbf9f6bd\": container with ID starting with fd44020d412a21c36c64ed3e8c880ef12d3184fde9d6b48674d284bbcbf9f6bd not found: ID does not exist" Dec 05 12:46:51 crc kubenswrapper[4763]: I1205 12:46:51.795150 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa9c56b4-dd2f-4227-902d-0cd07745e365" path="/var/lib/kubelet/pods/aa9c56b4-dd2f-4227-902d-0cd07745e365/volumes" Dec 05 12:47:07 crc kubenswrapper[4763]: I1205 12:47:07.544064 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 12:47:07 crc kubenswrapper[4763]: I1205 12:47:07.544559 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 12:47:07 crc kubenswrapper[4763]: I1205 12:47:07.544626 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" Dec 05 12:47:07 crc kubenswrapper[4763]: I1205 12:47:07.545618 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7fc5f0a4563d9fc623fe89d2790ce410750336c0faa3cea1c07e236f6e78e33a"} pod="openshift-machine-config-operator/machine-config-daemon-xpgln" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 12:47:07 crc kubenswrapper[4763]: I1205 12:47:07.545692 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" containerID="cri-o://7fc5f0a4563d9fc623fe89d2790ce410750336c0faa3cea1c07e236f6e78e33a" gracePeriod=600 Dec 05 12:47:08 crc kubenswrapper[4763]: I1205 12:47:08.042836 4763 generic.go:334] "Generic (PLEG): container finished" podID="96338136-6831-49d0-9eb9-77d1205c6afb" containerID="7fc5f0a4563d9fc623fe89d2790ce410750336c0faa3cea1c07e236f6e78e33a" exitCode=0 Dec 05 12:47:08 crc kubenswrapper[4763]: I1205 12:47:08.042922 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" 
event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerDied","Data":"7fc5f0a4563d9fc623fe89d2790ce410750336c0faa3cea1c07e236f6e78e33a"} Dec 05 12:47:08 crc kubenswrapper[4763]: I1205 12:47:08.043215 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerStarted","Data":"b336b650f4dcc0324cdd6db4334f8a03483c43b2443456c36708ae442924d667"} Dec 05 12:47:08 crc kubenswrapper[4763]: I1205 12:47:08.043243 4763 scope.go:117] "RemoveContainer" containerID="076a320d83ff63935b6e470f29dca4488cd7427d5318bd6be5b9f80791b1e029" Dec 05 12:47:20 crc kubenswrapper[4763]: I1205 12:47:20.069851 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zrkcj"] Dec 05 12:47:20 crc kubenswrapper[4763]: E1205 12:47:20.070976 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa9c56b4-dd2f-4227-902d-0cd07745e365" containerName="extract-content" Dec 05 12:47:20 crc kubenswrapper[4763]: I1205 12:47:20.070996 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa9c56b4-dd2f-4227-902d-0cd07745e365" containerName="extract-content" Dec 05 12:47:20 crc kubenswrapper[4763]: E1205 12:47:20.071027 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa9c56b4-dd2f-4227-902d-0cd07745e365" containerName="registry-server" Dec 05 12:47:20 crc kubenswrapper[4763]: I1205 12:47:20.071034 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa9c56b4-dd2f-4227-902d-0cd07745e365" containerName="registry-server" Dec 05 12:47:20 crc kubenswrapper[4763]: E1205 12:47:20.071071 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa9c56b4-dd2f-4227-902d-0cd07745e365" containerName="extract-utilities" Dec 05 12:47:20 crc kubenswrapper[4763]: I1205 12:47:20.071079 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa9c56b4-dd2f-4227-902d-0cd07745e365" containerName="extract-utilities" Dec 05 12:47:20 crc kubenswrapper[4763]: I1205 12:47:20.071273 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa9c56b4-dd2f-4227-902d-0cd07745e365" containerName="registry-server" Dec 05 12:47:20 crc kubenswrapper[4763]: I1205 12:47:20.072998 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zrkcj" Dec 05 12:47:20 crc kubenswrapper[4763]: I1205 12:47:20.080664 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zrkcj"] Dec 05 12:47:20 crc kubenswrapper[4763]: I1205 12:47:20.202780 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jbcv\" (UniqueName: \"kubernetes.io/projected/f1f0f662-5db8-4656-98ae-10455e2632f2-kube-api-access-5jbcv\") pod \"certified-operators-zrkcj\" (UID: \"f1f0f662-5db8-4656-98ae-10455e2632f2\") " pod="openshift-marketplace/certified-operators-zrkcj" Dec 05 12:47:20 crc kubenswrapper[4763]: I1205 12:47:20.203504 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1f0f662-5db8-4656-98ae-10455e2632f2-utilities\") pod \"certified-operators-zrkcj\" (UID: \"f1f0f662-5db8-4656-98ae-10455e2632f2\") " pod="openshift-marketplace/certified-operators-zrkcj" Dec 05 12:47:20 crc kubenswrapper[4763]: I1205 12:47:20.203607 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1f0f662-5db8-4656-98ae-10455e2632f2-catalog-content\") pod \"certified-operators-zrkcj\" (UID: \"f1f0f662-5db8-4656-98ae-10455e2632f2\") " pod="openshift-marketplace/certified-operators-zrkcj" Dec 05 12:47:20 crc kubenswrapper[4763]: I1205 12:47:20.305728 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1f0f662-5db8-4656-98ae-10455e2632f2-utilities\") pod \"certified-operators-zrkcj\" (UID: \"f1f0f662-5db8-4656-98ae-10455e2632f2\") " pod="openshift-marketplace/certified-operators-zrkcj" Dec 05 12:47:20 crc kubenswrapper[4763]: I1205 12:47:20.305878 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1f0f662-5db8-4656-98ae-10455e2632f2-catalog-content\") pod \"certified-operators-zrkcj\" (UID: \"f1f0f662-5db8-4656-98ae-10455e2632f2\") " pod="openshift-marketplace/certified-operators-zrkcj" Dec 05 12:47:20 crc kubenswrapper[4763]: I1205 12:47:20.305986 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jbcv\" (UniqueName: \"kubernetes.io/projected/f1f0f662-5db8-4656-98ae-10455e2632f2-kube-api-access-5jbcv\") pod \"certified-operators-zrkcj\" (UID: \"f1f0f662-5db8-4656-98ae-10455e2632f2\") " pod="openshift-marketplace/certified-operators-zrkcj" Dec 05 12:47:20 crc kubenswrapper[4763]: I1205 12:47:20.306493 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1f0f662-5db8-4656-98ae-10455e2632f2-utilities\") pod \"certified-operators-zrkcj\" (UID: \"f1f0f662-5db8-4656-98ae-10455e2632f2\") " pod="openshift-marketplace/certified-operators-zrkcj" Dec 05 12:47:20 crc kubenswrapper[4763]: I1205 12:47:20.306555 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1f0f662-5db8-4656-98ae-10455e2632f2-catalog-content\") pod \"certified-operators-zrkcj\" (UID: \"f1f0f662-5db8-4656-98ae-10455e2632f2\") " pod="openshift-marketplace/certified-operators-zrkcj" Dec 05 12:47:20 crc kubenswrapper[4763]: I1205 12:47:20.326522 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5jbcv\" (UniqueName: \"kubernetes.io/projected/f1f0f662-5db8-4656-98ae-10455e2632f2-kube-api-access-5jbcv\") pod \"certified-operators-zrkcj\" (UID: \"f1f0f662-5db8-4656-98ae-10455e2632f2\") " pod="openshift-marketplace/certified-operators-zrkcj" Dec 05 12:47:20 crc kubenswrapper[4763]: I1205 12:47:20.404521 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zrkcj" Dec 05 12:47:20 crc kubenswrapper[4763]: I1205 12:47:20.951298 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zrkcj"] Dec 05 12:47:21 crc kubenswrapper[4763]: I1205 12:47:21.177349 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zrkcj" event={"ID":"f1f0f662-5db8-4656-98ae-10455e2632f2","Type":"ContainerStarted","Data":"f2a43932d097528e196d15294b15b1120a448784d91486b0738995b0ed19b2e3"} Dec 05 12:47:22 crc kubenswrapper[4763]: I1205 12:47:22.189561 4763 generic.go:334] "Generic (PLEG): container finished" podID="f1f0f662-5db8-4656-98ae-10455e2632f2" containerID="16da06e30915b3b7e5f107c323e0400b456e6473d0fd7844822380188a49d2cf" exitCode=0 Dec 05 12:47:22 crc kubenswrapper[4763]: I1205 12:47:22.189732 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zrkcj" event={"ID":"f1f0f662-5db8-4656-98ae-10455e2632f2","Type":"ContainerDied","Data":"16da06e30915b3b7e5f107c323e0400b456e6473d0fd7844822380188a49d2cf"} Dec 05 12:47:24 crc kubenswrapper[4763]: I1205 12:47:24.212791 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zrkcj" event={"ID":"f1f0f662-5db8-4656-98ae-10455e2632f2","Type":"ContainerStarted","Data":"ac8f0d60a9e36133cf1534a3da908fe5c2b4acffc8be268c1f0522242a1157f8"} Dec 05 12:47:25 crc kubenswrapper[4763]: I1205 12:47:25.224847 4763 generic.go:334] "Generic (PLEG): container finished" podID="f1f0f662-5db8-4656-98ae-10455e2632f2" containerID="ac8f0d60a9e36133cf1534a3da908fe5c2b4acffc8be268c1f0522242a1157f8" exitCode=0 Dec 05 12:47:25 crc kubenswrapper[4763]: I1205 12:47:25.224943 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zrkcj" event={"ID":"f1f0f662-5db8-4656-98ae-10455e2632f2","Type":"ContainerDied","Data":"ac8f0d60a9e36133cf1534a3da908fe5c2b4acffc8be268c1f0522242a1157f8"} Dec 05 12:47:26 crc kubenswrapper[4763]: I1205 12:47:26.236880 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zrkcj" event={"ID":"f1f0f662-5db8-4656-98ae-10455e2632f2","Type":"ContainerStarted","Data":"3100f5c38478d5daa599723e34f87c58f3ec4b3ea833c0ca8953c48b72d96156"} Dec 05 12:47:26 crc kubenswrapper[4763]: I1205 12:47:26.268614 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zrkcj" podStartSLOduration=2.7943605270000003 podStartE2EDuration="6.268586638s" podCreationTimestamp="2025-12-05 12:47:20 +0000 UTC" firstStartedPulling="2025-12-05 12:47:22.192431697 +0000 UTC m=+3526.685146420" lastFinishedPulling="2025-12-05 12:47:25.666657788 +0000 UTC m=+3530.159372531" observedRunningTime="2025-12-05 12:47:26.259204306 +0000 UTC m=+3530.751919039" watchObservedRunningTime="2025-12-05 12:47:26.268586638 +0000 UTC m=+3530.761301361" Dec 05 12:47:30 crc kubenswrapper[4763]: I1205 12:47:30.404818 4763 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-zrkcj" Dec 05 12:47:30 crc kubenswrapper[4763]: I1205 12:47:30.405261 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zrkcj" Dec 05 12:47:30 crc kubenswrapper[4763]: I1205 12:47:30.452936 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zrkcj" Dec 05 12:47:31 crc kubenswrapper[4763]: I1205 12:47:31.344936 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zrkcj" Dec 05 12:47:31 crc kubenswrapper[4763]: I1205 12:47:31.408003 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zrkcj"] Dec 05 12:47:33 crc kubenswrapper[4763]: I1205 12:47:33.309556 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zrkcj" podUID="f1f0f662-5db8-4656-98ae-10455e2632f2" containerName="registry-server" containerID="cri-o://3100f5c38478d5daa599723e34f87c58f3ec4b3ea833c0ca8953c48b72d96156" gracePeriod=2 Dec 05 12:47:33 crc kubenswrapper[4763]: I1205 12:47:33.845168 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zrkcj" Dec 05 12:47:33 crc kubenswrapper[4763]: I1205 12:47:33.888110 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1f0f662-5db8-4656-98ae-10455e2632f2-catalog-content\") pod \"f1f0f662-5db8-4656-98ae-10455e2632f2\" (UID: \"f1f0f662-5db8-4656-98ae-10455e2632f2\") " Dec 05 12:47:33 crc kubenswrapper[4763]: I1205 12:47:33.888187 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jbcv\" (UniqueName: \"kubernetes.io/projected/f1f0f662-5db8-4656-98ae-10455e2632f2-kube-api-access-5jbcv\") pod \"f1f0f662-5db8-4656-98ae-10455e2632f2\" (UID: \"f1f0f662-5db8-4656-98ae-10455e2632f2\") " Dec 05 12:47:33 crc kubenswrapper[4763]: I1205 12:47:33.888357 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1f0f662-5db8-4656-98ae-10455e2632f2-utilities\") pod \"f1f0f662-5db8-4656-98ae-10455e2632f2\" (UID: \"f1f0f662-5db8-4656-98ae-10455e2632f2\") " Dec 05 12:47:33 crc kubenswrapper[4763]: I1205 12:47:33.889547 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1f0f662-5db8-4656-98ae-10455e2632f2-utilities" (OuterVolumeSpecName: "utilities") pod "f1f0f662-5db8-4656-98ae-10455e2632f2" (UID: "f1f0f662-5db8-4656-98ae-10455e2632f2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:47:33 crc kubenswrapper[4763]: I1205 12:47:33.899081 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1f0f662-5db8-4656-98ae-10455e2632f2-kube-api-access-5jbcv" (OuterVolumeSpecName: "kube-api-access-5jbcv") pod "f1f0f662-5db8-4656-98ae-10455e2632f2" (UID: "f1f0f662-5db8-4656-98ae-10455e2632f2"). InnerVolumeSpecName "kube-api-access-5jbcv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:47:33 crc kubenswrapper[4763]: I1205 12:47:33.944741 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1f0f662-5db8-4656-98ae-10455e2632f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f1f0f662-5db8-4656-98ae-10455e2632f2" (UID: "f1f0f662-5db8-4656-98ae-10455e2632f2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:47:33 crc kubenswrapper[4763]: I1205 12:47:33.990421 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jbcv\" (UniqueName: \"kubernetes.io/projected/f1f0f662-5db8-4656-98ae-10455e2632f2-kube-api-access-5jbcv\") on node \"crc\" DevicePath \"\"" Dec 05 12:47:33 crc kubenswrapper[4763]: I1205 12:47:33.990463 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1f0f662-5db8-4656-98ae-10455e2632f2-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 12:47:33 crc kubenswrapper[4763]: I1205 12:47:33.990510 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1f0f662-5db8-4656-98ae-10455e2632f2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 12:47:34 crc kubenswrapper[4763]: I1205 12:47:34.320300 4763 generic.go:334] "Generic (PLEG): container finished" podID="f1f0f662-5db8-4656-98ae-10455e2632f2" containerID="3100f5c38478d5daa599723e34f87c58f3ec4b3ea833c0ca8953c48b72d96156" exitCode=0 Dec 05 12:47:34 crc kubenswrapper[4763]: I1205 12:47:34.320354 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zrkcj" event={"ID":"f1f0f662-5db8-4656-98ae-10455e2632f2","Type":"ContainerDied","Data":"3100f5c38478d5daa599723e34f87c58f3ec4b3ea833c0ca8953c48b72d96156"} Dec 05 12:47:34 crc kubenswrapper[4763]: I1205 12:47:34.320399 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zrkcj" event={"ID":"f1f0f662-5db8-4656-98ae-10455e2632f2","Type":"ContainerDied","Data":"f2a43932d097528e196d15294b15b1120a448784d91486b0738995b0ed19b2e3"} Dec 05 12:47:34 crc kubenswrapper[4763]: I1205 12:47:34.320422 4763 scope.go:117] "RemoveContainer" containerID="3100f5c38478d5daa599723e34f87c58f3ec4b3ea833c0ca8953c48b72d96156" Dec 05 12:47:34 crc kubenswrapper[4763]: I1205 12:47:34.320421 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zrkcj" Dec 05 12:47:34 crc kubenswrapper[4763]: I1205 12:47:34.343891 4763 scope.go:117] "RemoveContainer" containerID="ac8f0d60a9e36133cf1534a3da908fe5c2b4acffc8be268c1f0522242a1157f8" Dec 05 12:47:34 crc kubenswrapper[4763]: I1205 12:47:34.367462 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zrkcj"] Dec 05 12:47:34 crc kubenswrapper[4763]: I1205 12:47:34.371277 4763 scope.go:117] "RemoveContainer" containerID="16da06e30915b3b7e5f107c323e0400b456e6473d0fd7844822380188a49d2cf" Dec 05 12:47:34 crc kubenswrapper[4763]: I1205 12:47:34.386906 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zrkcj"] Dec 05 12:47:34 crc kubenswrapper[4763]: I1205 12:47:34.417947 4763 scope.go:117] "RemoveContainer" containerID="3100f5c38478d5daa599723e34f87c58f3ec4b3ea833c0ca8953c48b72d96156" Dec 05 12:47:34 crc kubenswrapper[4763]: E1205 12:47:34.418465 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3100f5c38478d5daa599723e34f87c58f3ec4b3ea833c0ca8953c48b72d96156\": container with ID starting with 3100f5c38478d5daa599723e34f87c58f3ec4b3ea833c0ca8953c48b72d96156 not found: ID does not exist" containerID="3100f5c38478d5daa599723e34f87c58f3ec4b3ea833c0ca8953c48b72d96156" Dec 05 12:47:34 crc kubenswrapper[4763]: I1205 12:47:34.418529 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3100f5c38478d5daa599723e34f87c58f3ec4b3ea833c0ca8953c48b72d96156"} err="failed to get container status \"3100f5c38478d5daa599723e34f87c58f3ec4b3ea833c0ca8953c48b72d96156\": rpc error: code = NotFound desc = could not find container \"3100f5c38478d5daa599723e34f87c58f3ec4b3ea833c0ca8953c48b72d96156\": container with ID starting with 3100f5c38478d5daa599723e34f87c58f3ec4b3ea833c0ca8953c48b72d96156 not found: ID does not exist" Dec 05 12:47:34 crc kubenswrapper[4763]: I1205 12:47:34.418566 4763 scope.go:117] "RemoveContainer" containerID="ac8f0d60a9e36133cf1534a3da908fe5c2b4acffc8be268c1f0522242a1157f8" Dec 05 12:47:34 crc kubenswrapper[4763]: E1205 12:47:34.418901 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac8f0d60a9e36133cf1534a3da908fe5c2b4acffc8be268c1f0522242a1157f8\": container with ID starting with ac8f0d60a9e36133cf1534a3da908fe5c2b4acffc8be268c1f0522242a1157f8 not found: ID does not exist" containerID="ac8f0d60a9e36133cf1534a3da908fe5c2b4acffc8be268c1f0522242a1157f8" Dec 05 12:47:34 crc kubenswrapper[4763]: I1205 12:47:34.418927 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac8f0d60a9e36133cf1534a3da908fe5c2b4acffc8be268c1f0522242a1157f8"} err="failed to get container status \"ac8f0d60a9e36133cf1534a3da908fe5c2b4acffc8be268c1f0522242a1157f8\": rpc error: code = NotFound desc = could not find container \"ac8f0d60a9e36133cf1534a3da908fe5c2b4acffc8be268c1f0522242a1157f8\": container with ID starting with ac8f0d60a9e36133cf1534a3da908fe5c2b4acffc8be268c1f0522242a1157f8 not found: ID does not exist" Dec 05 12:47:34 crc kubenswrapper[4763]: I1205 12:47:34.418942 4763 scope.go:117] "RemoveContainer" containerID="16da06e30915b3b7e5f107c323e0400b456e6473d0fd7844822380188a49d2cf" Dec 05 12:47:34 crc kubenswrapper[4763]: E1205 12:47:34.419149 4763 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"16da06e30915b3b7e5f107c323e0400b456e6473d0fd7844822380188a49d2cf\": container with ID starting with 16da06e30915b3b7e5f107c323e0400b456e6473d0fd7844822380188a49d2cf not found: ID does not exist" containerID="16da06e30915b3b7e5f107c323e0400b456e6473d0fd7844822380188a49d2cf" Dec 05 12:47:34 crc kubenswrapper[4763]: I1205 12:47:34.419172 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16da06e30915b3b7e5f107c323e0400b456e6473d0fd7844822380188a49d2cf"} err="failed to get container status \"16da06e30915b3b7e5f107c323e0400b456e6473d0fd7844822380188a49d2cf\": rpc error: code = NotFound desc = could not find container \"16da06e30915b3b7e5f107c323e0400b456e6473d0fd7844822380188a49d2cf\": container with ID starting with 16da06e30915b3b7e5f107c323e0400b456e6473d0fd7844822380188a49d2cf not found: ID does not exist" Dec 05 12:47:35 crc kubenswrapper[4763]: I1205 12:47:35.797128 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1f0f662-5db8-4656-98ae-10455e2632f2" path="/var/lib/kubelet/pods/f1f0f662-5db8-4656-98ae-10455e2632f2/volumes" Dec 05 12:49:07 crc kubenswrapper[4763]: I1205 12:49:07.544205 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 12:49:07 crc kubenswrapper[4763]: I1205 12:49:07.544902 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 12:49:37 crc kubenswrapper[4763]: I1205 12:49:37.544420 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 12:49:37 crc kubenswrapper[4763]: I1205 12:49:37.545942 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 12:50:07 crc kubenswrapper[4763]: I1205 12:50:07.545164 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 12:50:07 crc kubenswrapper[4763]: I1205 12:50:07.546292 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 12:50:07 crc kubenswrapper[4763]: I1205 12:50:07.546373 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" Dec 05 12:50:07 crc kubenswrapper[4763]: I1205 12:50:07.547699 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b336b650f4dcc0324cdd6db4334f8a03483c43b2443456c36708ae442924d667"} pod="openshift-machine-config-operator/machine-config-daemon-xpgln" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 12:50:07 crc kubenswrapper[4763]: I1205 12:50:07.547907 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" containerID="cri-o://b336b650f4dcc0324cdd6db4334f8a03483c43b2443456c36708ae442924d667" gracePeriod=600 Dec 05 12:50:07 crc kubenswrapper[4763]: E1205 12:50:07.668487 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:50:07 crc kubenswrapper[4763]: I1205 12:50:07.912574 4763 generic.go:334] "Generic (PLEG): container finished" podID="96338136-6831-49d0-9eb9-77d1205c6afb" containerID="b336b650f4dcc0324cdd6db4334f8a03483c43b2443456c36708ae442924d667" exitCode=0 Dec 05 12:50:07 crc kubenswrapper[4763]: I1205 12:50:07.912632 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerDied","Data":"b336b650f4dcc0324cdd6db4334f8a03483c43b2443456c36708ae442924d667"} Dec 05 12:50:07 crc kubenswrapper[4763]: I1205 12:50:07.912695 4763 scope.go:117] "RemoveContainer" containerID="7fc5f0a4563d9fc623fe89d2790ce410750336c0faa3cea1c07e236f6e78e33a" Dec 05 12:50:07 crc kubenswrapper[4763]: I1205 12:50:07.913347 4763 scope.go:117] "RemoveContainer" containerID="b336b650f4dcc0324cdd6db4334f8a03483c43b2443456c36708ae442924d667" Dec 05 12:50:07 crc kubenswrapper[4763]: E1205 12:50:07.913607 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:50:19 crc kubenswrapper[4763]: I1205 12:50:19.784445 4763 scope.go:117] "RemoveContainer" containerID="b336b650f4dcc0324cdd6db4334f8a03483c43b2443456c36708ae442924d667" Dec 05 12:50:19 crc kubenswrapper[4763]: E1205 12:50:19.786368 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:50:34 crc 
kubenswrapper[4763]: I1205 12:50:34.784260 4763 scope.go:117] "RemoveContainer" containerID="b336b650f4dcc0324cdd6db4334f8a03483c43b2443456c36708ae442924d667" Dec 05 12:50:34 crc kubenswrapper[4763]: E1205 12:50:34.785056 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:50:47 crc kubenswrapper[4763]: I1205 12:50:47.783928 4763 scope.go:117] "RemoveContainer" containerID="b336b650f4dcc0324cdd6db4334f8a03483c43b2443456c36708ae442924d667" Dec 05 12:50:47 crc kubenswrapper[4763]: E1205 12:50:47.784728 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:51:00 crc kubenswrapper[4763]: I1205 12:51:00.785114 4763 scope.go:117] "RemoveContainer" containerID="b336b650f4dcc0324cdd6db4334f8a03483c43b2443456c36708ae442924d667" Dec 05 12:51:00 crc kubenswrapper[4763]: E1205 12:51:00.786334 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:51:14 crc kubenswrapper[4763]: I1205 12:51:14.785584 4763 scope.go:117] "RemoveContainer" containerID="b336b650f4dcc0324cdd6db4334f8a03483c43b2443456c36708ae442924d667" Dec 05 12:51:14 crc kubenswrapper[4763]: E1205 12:51:14.786577 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:51:29 crc kubenswrapper[4763]: I1205 12:51:29.785299 4763 scope.go:117] "RemoveContainer" containerID="b336b650f4dcc0324cdd6db4334f8a03483c43b2443456c36708ae442924d667" Dec 05 12:51:29 crc kubenswrapper[4763]: E1205 12:51:29.786639 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:51:42 crc kubenswrapper[4763]: I1205 12:51:42.784399 4763 scope.go:117] "RemoveContainer" containerID="b336b650f4dcc0324cdd6db4334f8a03483c43b2443456c36708ae442924d667" Dec 05 12:51:42 crc 
kubenswrapper[4763]: E1205 12:51:42.785365 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:51:53 crc kubenswrapper[4763]: I1205 12:51:53.784721 4763 scope.go:117] "RemoveContainer" containerID="b336b650f4dcc0324cdd6db4334f8a03483c43b2443456c36708ae442924d667" Dec 05 12:51:53 crc kubenswrapper[4763]: E1205 12:51:53.785802 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:52:07 crc kubenswrapper[4763]: I1205 12:52:07.794152 4763 scope.go:117] "RemoveContainer" containerID="b336b650f4dcc0324cdd6db4334f8a03483c43b2443456c36708ae442924d667" Dec 05 12:52:07 crc kubenswrapper[4763]: E1205 12:52:07.795001 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:52:18 crc kubenswrapper[4763]: I1205 12:52:18.783576 4763 scope.go:117] "RemoveContainer" containerID="b336b650f4dcc0324cdd6db4334f8a03483c43b2443456c36708ae442924d667" Dec 05 12:52:18 crc kubenswrapper[4763]: E1205 12:52:18.784390 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:52:33 crc kubenswrapper[4763]: I1205 12:52:33.784280 4763 scope.go:117] "RemoveContainer" containerID="b336b650f4dcc0324cdd6db4334f8a03483c43b2443456c36708ae442924d667" Dec 05 12:52:33 crc kubenswrapper[4763]: E1205 12:52:33.785445 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:52:46 crc kubenswrapper[4763]: I1205 12:52:46.784612 4763 scope.go:117] "RemoveContainer" containerID="b336b650f4dcc0324cdd6db4334f8a03483c43b2443456c36708ae442924d667" Dec 05 12:52:46 crc kubenswrapper[4763]: E1205 12:52:46.785536 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:52:59 crc kubenswrapper[4763]: I1205 12:52:59.784726 4763 scope.go:117] "RemoveContainer" containerID="b336b650f4dcc0324cdd6db4334f8a03483c43b2443456c36708ae442924d667" Dec 05 12:52:59 crc kubenswrapper[4763]: E1205 12:52:59.785527 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:53:10 crc kubenswrapper[4763]: I1205 12:53:10.784891 4763 scope.go:117] "RemoveContainer" containerID="b336b650f4dcc0324cdd6db4334f8a03483c43b2443456c36708ae442924d667" Dec 05 12:53:10 crc kubenswrapper[4763]: E1205 12:53:10.785641 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:53:22 crc kubenswrapper[4763]: I1205 12:53:22.783958 4763 scope.go:117] "RemoveContainer" containerID="b336b650f4dcc0324cdd6db4334f8a03483c43b2443456c36708ae442924d667" Dec 05 12:53:22 crc kubenswrapper[4763]: E1205 12:53:22.784883 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:53:32 crc kubenswrapper[4763]: I1205 12:53:32.892033 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-69dc8"] Dec 05 12:53:32 crc kubenswrapper[4763]: E1205 12:53:32.893086 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1f0f662-5db8-4656-98ae-10455e2632f2" containerName="extract-utilities" Dec 05 12:53:32 crc kubenswrapper[4763]: I1205 12:53:32.893104 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1f0f662-5db8-4656-98ae-10455e2632f2" containerName="extract-utilities" Dec 05 12:53:32 crc kubenswrapper[4763]: E1205 12:53:32.893144 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1f0f662-5db8-4656-98ae-10455e2632f2" containerName="registry-server" Dec 05 12:53:32 crc kubenswrapper[4763]: I1205 12:53:32.893152 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1f0f662-5db8-4656-98ae-10455e2632f2" containerName="registry-server" Dec 05 12:53:32 crc kubenswrapper[4763]: E1205 12:53:32.893202 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1f0f662-5db8-4656-98ae-10455e2632f2" containerName="extract-content" Dec 05 12:53:32 crc kubenswrapper[4763]: I1205 12:53:32.893211 4763 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f1f0f662-5db8-4656-98ae-10455e2632f2" containerName="extract-content" Dec 05 12:53:32 crc kubenswrapper[4763]: I1205 12:53:32.893472 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1f0f662-5db8-4656-98ae-10455e2632f2" containerName="registry-server" Dec 05 12:53:32 crc kubenswrapper[4763]: I1205 12:53:32.895391 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-69dc8" Dec 05 12:53:32 crc kubenswrapper[4763]: I1205 12:53:32.919834 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-69dc8"] Dec 05 12:53:33 crc kubenswrapper[4763]: I1205 12:53:33.008639 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1d6ca3a-c008-4564-aec2-0c697e55e1df-utilities\") pod \"redhat-operators-69dc8\" (UID: \"b1d6ca3a-c008-4564-aec2-0c697e55e1df\") " pod="openshift-marketplace/redhat-operators-69dc8" Dec 05 12:53:33 crc kubenswrapper[4763]: I1205 12:53:33.008734 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1d6ca3a-c008-4564-aec2-0c697e55e1df-catalog-content\") pod \"redhat-operators-69dc8\" (UID: \"b1d6ca3a-c008-4564-aec2-0c697e55e1df\") " pod="openshift-marketplace/redhat-operators-69dc8" Dec 05 12:53:33 crc kubenswrapper[4763]: I1205 12:53:33.008803 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nphds\" (UniqueName: \"kubernetes.io/projected/b1d6ca3a-c008-4564-aec2-0c697e55e1df-kube-api-access-nphds\") pod \"redhat-operators-69dc8\" (UID: \"b1d6ca3a-c008-4564-aec2-0c697e55e1df\") " pod="openshift-marketplace/redhat-operators-69dc8" Dec 05 12:53:33 crc kubenswrapper[4763]: I1205 12:53:33.111645 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1d6ca3a-c008-4564-aec2-0c697e55e1df-utilities\") pod \"redhat-operators-69dc8\" (UID: \"b1d6ca3a-c008-4564-aec2-0c697e55e1df\") " pod="openshift-marketplace/redhat-operators-69dc8" Dec 05 12:53:33 crc kubenswrapper[4763]: I1205 12:53:33.112221 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1d6ca3a-c008-4564-aec2-0c697e55e1df-catalog-content\") pod \"redhat-operators-69dc8\" (UID: \"b1d6ca3a-c008-4564-aec2-0c697e55e1df\") " pod="openshift-marketplace/redhat-operators-69dc8" Dec 05 12:53:33 crc kubenswrapper[4763]: I1205 12:53:33.112262 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nphds\" (UniqueName: \"kubernetes.io/projected/b1d6ca3a-c008-4564-aec2-0c697e55e1df-kube-api-access-nphds\") pod \"redhat-operators-69dc8\" (UID: \"b1d6ca3a-c008-4564-aec2-0c697e55e1df\") " pod="openshift-marketplace/redhat-operators-69dc8" Dec 05 12:53:33 crc kubenswrapper[4763]: I1205 12:53:33.112573 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1d6ca3a-c008-4564-aec2-0c697e55e1df-utilities\") pod \"redhat-operators-69dc8\" (UID: \"b1d6ca3a-c008-4564-aec2-0c697e55e1df\") " pod="openshift-marketplace/redhat-operators-69dc8" Dec 05 12:53:33 crc kubenswrapper[4763]: I1205 12:53:33.112810 4763 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1d6ca3a-c008-4564-aec2-0c697e55e1df-catalog-content\") pod \"redhat-operators-69dc8\" (UID: \"b1d6ca3a-c008-4564-aec2-0c697e55e1df\") " pod="openshift-marketplace/redhat-operators-69dc8" Dec 05 12:53:33 crc kubenswrapper[4763]: I1205 12:53:33.135315 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nphds\" (UniqueName: \"kubernetes.io/projected/b1d6ca3a-c008-4564-aec2-0c697e55e1df-kube-api-access-nphds\") pod \"redhat-operators-69dc8\" (UID: \"b1d6ca3a-c008-4564-aec2-0c697e55e1df\") " pod="openshift-marketplace/redhat-operators-69dc8" Dec 05 12:53:33 crc kubenswrapper[4763]: I1205 12:53:33.232779 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-69dc8" Dec 05 12:53:33 crc kubenswrapper[4763]: I1205 12:53:33.731991 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-69dc8"] Dec 05 12:53:34 crc kubenswrapper[4763]: I1205 12:53:34.240080 4763 generic.go:334] "Generic (PLEG): container finished" podID="b1d6ca3a-c008-4564-aec2-0c697e55e1df" containerID="24f3597fe70f7a765123a65b9aadb8cc6859ea17aef624eb2ef81bdf789335a8" exitCode=0 Dec 05 12:53:34 crc kubenswrapper[4763]: I1205 12:53:34.240418 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-69dc8" event={"ID":"b1d6ca3a-c008-4564-aec2-0c697e55e1df","Type":"ContainerDied","Data":"24f3597fe70f7a765123a65b9aadb8cc6859ea17aef624eb2ef81bdf789335a8"} Dec 05 12:53:34 crc kubenswrapper[4763]: I1205 12:53:34.240925 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-69dc8" event={"ID":"b1d6ca3a-c008-4564-aec2-0c697e55e1df","Type":"ContainerStarted","Data":"f18c2c7b4095a3b3a76964cfc940ffde4964c8d90bc71da2031b7984a4e66415"} Dec 05 12:53:34 crc kubenswrapper[4763]: I1205 12:53:34.242400 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 12:53:35 crc kubenswrapper[4763]: I1205 12:53:35.258277 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-69dc8" event={"ID":"b1d6ca3a-c008-4564-aec2-0c697e55e1df","Type":"ContainerStarted","Data":"e53ba0d304943e7b20ea656a59f2e1a680bbd9fba970e0f3991a364e1b0a8afd"} Dec 05 12:53:37 crc kubenswrapper[4763]: I1205 12:53:37.784163 4763 scope.go:117] "RemoveContainer" containerID="b336b650f4dcc0324cdd6db4334f8a03483c43b2443456c36708ae442924d667" Dec 05 12:53:37 crc kubenswrapper[4763]: E1205 12:53:37.785099 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:53:40 crc kubenswrapper[4763]: I1205 12:53:40.337121 4763 generic.go:334] "Generic (PLEG): container finished" podID="b1d6ca3a-c008-4564-aec2-0c697e55e1df" containerID="e53ba0d304943e7b20ea656a59f2e1a680bbd9fba970e0f3991a364e1b0a8afd" exitCode=0 Dec 05 12:53:40 crc kubenswrapper[4763]: I1205 12:53:40.337171 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-69dc8" 
event={"ID":"b1d6ca3a-c008-4564-aec2-0c697e55e1df","Type":"ContainerDied","Data":"e53ba0d304943e7b20ea656a59f2e1a680bbd9fba970e0f3991a364e1b0a8afd"} Dec 05 12:53:41 crc kubenswrapper[4763]: I1205 12:53:41.349850 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-69dc8" event={"ID":"b1d6ca3a-c008-4564-aec2-0c697e55e1df","Type":"ContainerStarted","Data":"667bab3bb6be10fcd7d4ce14895e399994b0a6868e97633f81af72844f36f00b"} Dec 05 12:53:41 crc kubenswrapper[4763]: I1205 12:53:41.370848 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-69dc8" podStartSLOduration=2.827694907 podStartE2EDuration="9.370826757s" podCreationTimestamp="2025-12-05 12:53:32 +0000 UTC" firstStartedPulling="2025-12-05 12:53:34.242164956 +0000 UTC m=+3898.734879679" lastFinishedPulling="2025-12-05 12:53:40.785296806 +0000 UTC m=+3905.278011529" observedRunningTime="2025-12-05 12:53:41.364924248 +0000 UTC m=+3905.857638981" watchObservedRunningTime="2025-12-05 12:53:41.370826757 +0000 UTC m=+3905.863541480" Dec 05 12:53:43 crc kubenswrapper[4763]: I1205 12:53:43.233530 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-69dc8" Dec 05 12:53:43 crc kubenswrapper[4763]: I1205 12:53:43.233897 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-69dc8" Dec 05 12:53:44 crc kubenswrapper[4763]: I1205 12:53:44.289354 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-69dc8" podUID="b1d6ca3a-c008-4564-aec2-0c697e55e1df" containerName="registry-server" probeResult="failure" output=< Dec 05 12:53:44 crc kubenswrapper[4763]: timeout: failed to connect service ":50051" within 1s Dec 05 12:53:44 crc kubenswrapper[4763]: > Dec 05 12:53:48 crc kubenswrapper[4763]: I1205 12:53:48.784197 4763 scope.go:117] "RemoveContainer" containerID="b336b650f4dcc0324cdd6db4334f8a03483c43b2443456c36708ae442924d667" Dec 05 12:53:48 crc kubenswrapper[4763]: E1205 12:53:48.785658 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:53:53 crc kubenswrapper[4763]: I1205 12:53:53.303314 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-69dc8" Dec 05 12:53:53 crc kubenswrapper[4763]: I1205 12:53:53.359945 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-69dc8" Dec 05 12:53:53 crc kubenswrapper[4763]: I1205 12:53:53.546446 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-69dc8"] Dec 05 12:53:54 crc kubenswrapper[4763]: I1205 12:53:54.515261 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-69dc8" podUID="b1d6ca3a-c008-4564-aec2-0c697e55e1df" containerName="registry-server" containerID="cri-o://667bab3bb6be10fcd7d4ce14895e399994b0a6868e97633f81af72844f36f00b" gracePeriod=2 Dec 05 12:53:55 crc kubenswrapper[4763]: I1205 12:53:55.089182 4763 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-69dc8" Dec 05 12:53:55 crc kubenswrapper[4763]: I1205 12:53:55.198905 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1d6ca3a-c008-4564-aec2-0c697e55e1df-utilities\") pod \"b1d6ca3a-c008-4564-aec2-0c697e55e1df\" (UID: \"b1d6ca3a-c008-4564-aec2-0c697e55e1df\") " Dec 05 12:53:55 crc kubenswrapper[4763]: I1205 12:53:55.199040 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1d6ca3a-c008-4564-aec2-0c697e55e1df-catalog-content\") pod \"b1d6ca3a-c008-4564-aec2-0c697e55e1df\" (UID: \"b1d6ca3a-c008-4564-aec2-0c697e55e1df\") " Dec 05 12:53:55 crc kubenswrapper[4763]: I1205 12:53:55.199090 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nphds\" (UniqueName: \"kubernetes.io/projected/b1d6ca3a-c008-4564-aec2-0c697e55e1df-kube-api-access-nphds\") pod \"b1d6ca3a-c008-4564-aec2-0c697e55e1df\" (UID: \"b1d6ca3a-c008-4564-aec2-0c697e55e1df\") " Dec 05 12:53:55 crc kubenswrapper[4763]: I1205 12:53:55.199915 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1d6ca3a-c008-4564-aec2-0c697e55e1df-utilities" (OuterVolumeSpecName: "utilities") pod "b1d6ca3a-c008-4564-aec2-0c697e55e1df" (UID: "b1d6ca3a-c008-4564-aec2-0c697e55e1df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:53:55 crc kubenswrapper[4763]: I1205 12:53:55.221465 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1d6ca3a-c008-4564-aec2-0c697e55e1df-kube-api-access-nphds" (OuterVolumeSpecName: "kube-api-access-nphds") pod "b1d6ca3a-c008-4564-aec2-0c697e55e1df" (UID: "b1d6ca3a-c008-4564-aec2-0c697e55e1df"). InnerVolumeSpecName "kube-api-access-nphds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:53:55 crc kubenswrapper[4763]: I1205 12:53:55.302046 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1d6ca3a-c008-4564-aec2-0c697e55e1df-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 12:53:55 crc kubenswrapper[4763]: I1205 12:53:55.302506 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nphds\" (UniqueName: \"kubernetes.io/projected/b1d6ca3a-c008-4564-aec2-0c697e55e1df-kube-api-access-nphds\") on node \"crc\" DevicePath \"\"" Dec 05 12:53:55 crc kubenswrapper[4763]: I1205 12:53:55.366213 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1d6ca3a-c008-4564-aec2-0c697e55e1df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b1d6ca3a-c008-4564-aec2-0c697e55e1df" (UID: "b1d6ca3a-c008-4564-aec2-0c697e55e1df"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:53:55 crc kubenswrapper[4763]: I1205 12:53:55.405562 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1d6ca3a-c008-4564-aec2-0c697e55e1df-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 12:53:55 crc kubenswrapper[4763]: I1205 12:53:55.529115 4763 generic.go:334] "Generic (PLEG): container finished" podID="b1d6ca3a-c008-4564-aec2-0c697e55e1df" containerID="667bab3bb6be10fcd7d4ce14895e399994b0a6868e97633f81af72844f36f00b" exitCode=0 Dec 05 12:53:55 crc kubenswrapper[4763]: I1205 12:53:55.529243 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-69dc8" Dec 05 12:53:55 crc kubenswrapper[4763]: I1205 12:53:55.529268 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-69dc8" event={"ID":"b1d6ca3a-c008-4564-aec2-0c697e55e1df","Type":"ContainerDied","Data":"667bab3bb6be10fcd7d4ce14895e399994b0a6868e97633f81af72844f36f00b"} Dec 05 12:53:55 crc kubenswrapper[4763]: I1205 12:53:55.530491 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-69dc8" event={"ID":"b1d6ca3a-c008-4564-aec2-0c697e55e1df","Type":"ContainerDied","Data":"f18c2c7b4095a3b3a76964cfc940ffde4964c8d90bc71da2031b7984a4e66415"} Dec 05 12:53:55 crc kubenswrapper[4763]: I1205 12:53:55.530542 4763 scope.go:117] "RemoveContainer" containerID="667bab3bb6be10fcd7d4ce14895e399994b0a6868e97633f81af72844f36f00b" Dec 05 12:53:55 crc kubenswrapper[4763]: I1205 12:53:55.565542 4763 scope.go:117] "RemoveContainer" containerID="e53ba0d304943e7b20ea656a59f2e1a680bbd9fba970e0f3991a364e1b0a8afd" Dec 05 12:53:55 crc kubenswrapper[4763]: I1205 12:53:55.583775 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-69dc8"] Dec 05 12:53:55 crc kubenswrapper[4763]: I1205 12:53:55.597542 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-69dc8"] Dec 05 12:53:55 crc kubenswrapper[4763]: I1205 12:53:55.612210 4763 scope.go:117] "RemoveContainer" containerID="24f3597fe70f7a765123a65b9aadb8cc6859ea17aef624eb2ef81bdf789335a8" Dec 05 12:53:55 crc kubenswrapper[4763]: I1205 12:53:55.653418 4763 scope.go:117] "RemoveContainer" containerID="667bab3bb6be10fcd7d4ce14895e399994b0a6868e97633f81af72844f36f00b" Dec 05 12:53:55 crc kubenswrapper[4763]: E1205 12:53:55.654235 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"667bab3bb6be10fcd7d4ce14895e399994b0a6868e97633f81af72844f36f00b\": container with ID starting with 667bab3bb6be10fcd7d4ce14895e399994b0a6868e97633f81af72844f36f00b not found: ID does not exist" containerID="667bab3bb6be10fcd7d4ce14895e399994b0a6868e97633f81af72844f36f00b" Dec 05 12:53:55 crc kubenswrapper[4763]: I1205 12:53:55.654295 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"667bab3bb6be10fcd7d4ce14895e399994b0a6868e97633f81af72844f36f00b"} err="failed to get container status \"667bab3bb6be10fcd7d4ce14895e399994b0a6868e97633f81af72844f36f00b\": rpc error: code = NotFound desc = could not find container \"667bab3bb6be10fcd7d4ce14895e399994b0a6868e97633f81af72844f36f00b\": container with ID starting with 667bab3bb6be10fcd7d4ce14895e399994b0a6868e97633f81af72844f36f00b not found: ID does not exist" Dec 05 12:53:55 crc 
kubenswrapper[4763]: I1205 12:53:55.654329 4763 scope.go:117] "RemoveContainer" containerID="e53ba0d304943e7b20ea656a59f2e1a680bbd9fba970e0f3991a364e1b0a8afd" Dec 05 12:53:55 crc kubenswrapper[4763]: E1205 12:53:55.654987 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e53ba0d304943e7b20ea656a59f2e1a680bbd9fba970e0f3991a364e1b0a8afd\": container with ID starting with e53ba0d304943e7b20ea656a59f2e1a680bbd9fba970e0f3991a364e1b0a8afd not found: ID does not exist" containerID="e53ba0d304943e7b20ea656a59f2e1a680bbd9fba970e0f3991a364e1b0a8afd" Dec 05 12:53:55 crc kubenswrapper[4763]: I1205 12:53:55.655033 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e53ba0d304943e7b20ea656a59f2e1a680bbd9fba970e0f3991a364e1b0a8afd"} err="failed to get container status \"e53ba0d304943e7b20ea656a59f2e1a680bbd9fba970e0f3991a364e1b0a8afd\": rpc error: code = NotFound desc = could not find container \"e53ba0d304943e7b20ea656a59f2e1a680bbd9fba970e0f3991a364e1b0a8afd\": container with ID starting with e53ba0d304943e7b20ea656a59f2e1a680bbd9fba970e0f3991a364e1b0a8afd not found: ID does not exist" Dec 05 12:53:55 crc kubenswrapper[4763]: I1205 12:53:55.655089 4763 scope.go:117] "RemoveContainer" containerID="24f3597fe70f7a765123a65b9aadb8cc6859ea17aef624eb2ef81bdf789335a8" Dec 05 12:53:55 crc kubenswrapper[4763]: E1205 12:53:55.655644 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24f3597fe70f7a765123a65b9aadb8cc6859ea17aef624eb2ef81bdf789335a8\": container with ID starting with 24f3597fe70f7a765123a65b9aadb8cc6859ea17aef624eb2ef81bdf789335a8 not found: ID does not exist" containerID="24f3597fe70f7a765123a65b9aadb8cc6859ea17aef624eb2ef81bdf789335a8" Dec 05 12:53:55 crc kubenswrapper[4763]: I1205 12:53:55.655691 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24f3597fe70f7a765123a65b9aadb8cc6859ea17aef624eb2ef81bdf789335a8"} err="failed to get container status \"24f3597fe70f7a765123a65b9aadb8cc6859ea17aef624eb2ef81bdf789335a8\": rpc error: code = NotFound desc = could not find container \"24f3597fe70f7a765123a65b9aadb8cc6859ea17aef624eb2ef81bdf789335a8\": container with ID starting with 24f3597fe70f7a765123a65b9aadb8cc6859ea17aef624eb2ef81bdf789335a8 not found: ID does not exist" Dec 05 12:53:55 crc kubenswrapper[4763]: I1205 12:53:55.798245 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1d6ca3a-c008-4564-aec2-0c697e55e1df" path="/var/lib/kubelet/pods/b1d6ca3a-c008-4564-aec2-0c697e55e1df/volumes" Dec 05 12:53:59 crc kubenswrapper[4763]: I1205 12:53:59.785010 4763 scope.go:117] "RemoveContainer" containerID="b336b650f4dcc0324cdd6db4334f8a03483c43b2443456c36708ae442924d667" Dec 05 12:53:59 crc kubenswrapper[4763]: E1205 12:53:59.786214 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 12:54:12 crc kubenswrapper[4763]: I1205 12:54:12.783965 4763 scope.go:117] "RemoveContainer" containerID="b336b650f4dcc0324cdd6db4334f8a03483c43b2443456c36708ae442924d667" 
Dec 05 12:54:12 crc kubenswrapper[4763]: E1205 12:54:12.784846 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb"
Dec 05 12:54:24 crc kubenswrapper[4763]: I1205 12:54:24.784484 4763 scope.go:117] "RemoveContainer" containerID="b336b650f4dcc0324cdd6db4334f8a03483c43b2443456c36708ae442924d667"
Dec 05 12:54:24 crc kubenswrapper[4763]: E1205 12:54:24.785531 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb"
Dec 05 12:54:39 crc kubenswrapper[4763]: I1205 12:54:39.783923 4763 scope.go:117] "RemoveContainer" containerID="b336b650f4dcc0324cdd6db4334f8a03483c43b2443456c36708ae442924d667"
Dec 05 12:54:39 crc kubenswrapper[4763]: E1205 12:54:39.784687 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb"
Dec 05 12:54:53 crc kubenswrapper[4763]: I1205 12:54:53.784695 4763 scope.go:117] "RemoveContainer" containerID="b336b650f4dcc0324cdd6db4334f8a03483c43b2443456c36708ae442924d667"
Dec 05 12:54:53 crc kubenswrapper[4763]: E1205 12:54:53.785651 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb"
Dec 05 12:55:04 crc kubenswrapper[4763]: I1205 12:55:04.784407 4763 scope.go:117] "RemoveContainer" containerID="b336b650f4dcc0324cdd6db4334f8a03483c43b2443456c36708ae442924d667"
Dec 05 12:55:04 crc kubenswrapper[4763]: E1205 12:55:04.785234 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb"
Dec 05 12:55:18 crc kubenswrapper[4763]: I1205 12:55:18.784948 4763 scope.go:117] "RemoveContainer" containerID="b336b650f4dcc0324cdd6db4334f8a03483c43b2443456c36708ae442924d667"
Dec 05 12:55:19 crc kubenswrapper[4763]: I1205 12:55:19.469286 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerStarted","Data":"619f3eb04487f360e70557a78183a4459c77bf53694a220d2941a47d3d7faea8"}
Dec 05 12:56:59 crc kubenswrapper[4763]: I1205 12:56:59.510394 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jcgz4"]
Dec 05 12:56:59 crc kubenswrapper[4763]: E1205 12:56:59.511750 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1d6ca3a-c008-4564-aec2-0c697e55e1df" containerName="extract-utilities"
Dec 05 12:56:59 crc kubenswrapper[4763]: I1205 12:56:59.511795 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1d6ca3a-c008-4564-aec2-0c697e55e1df" containerName="extract-utilities"
Dec 05 12:56:59 crc kubenswrapper[4763]: E1205 12:56:59.511819 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1d6ca3a-c008-4564-aec2-0c697e55e1df" containerName="registry-server"
Dec 05 12:56:59 crc kubenswrapper[4763]: I1205 12:56:59.511830 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1d6ca3a-c008-4564-aec2-0c697e55e1df" containerName="registry-server"
Dec 05 12:56:59 crc kubenswrapper[4763]: E1205 12:56:59.511850 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1d6ca3a-c008-4564-aec2-0c697e55e1df" containerName="extract-content"
Dec 05 12:56:59 crc kubenswrapper[4763]: I1205 12:56:59.511862 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1d6ca3a-c008-4564-aec2-0c697e55e1df" containerName="extract-content"
Dec 05 12:56:59 crc kubenswrapper[4763]: I1205 12:56:59.512214 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1d6ca3a-c008-4564-aec2-0c697e55e1df" containerName="registry-server"
Dec 05 12:56:59 crc kubenswrapper[4763]: I1205 12:56:59.517927 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jcgz4"
Dec 05 12:56:59 crc kubenswrapper[4763]: I1205 12:56:59.523611 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jcgz4"]
Dec 05 12:56:59 crc kubenswrapper[4763]: I1205 12:56:59.591970 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ce26da1-6899-4082-a4ae-dc8491b70736-catalog-content\") pod \"community-operators-jcgz4\" (UID: \"2ce26da1-6899-4082-a4ae-dc8491b70736\") " pod="openshift-marketplace/community-operators-jcgz4"
Dec 05 12:56:59 crc kubenswrapper[4763]: I1205 12:56:59.592056 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgt8x\" (UniqueName: \"kubernetes.io/projected/2ce26da1-6899-4082-a4ae-dc8491b70736-kube-api-access-xgt8x\") pod \"community-operators-jcgz4\" (UID: \"2ce26da1-6899-4082-a4ae-dc8491b70736\") " pod="openshift-marketplace/community-operators-jcgz4"
Dec 05 12:56:59 crc kubenswrapper[4763]: I1205 12:56:59.592101 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ce26da1-6899-4082-a4ae-dc8491b70736-utilities\") pod \"community-operators-jcgz4\" (UID: \"2ce26da1-6899-4082-a4ae-dc8491b70736\") " pod="openshift-marketplace/community-operators-jcgz4"
Dec 05 12:56:59 crc kubenswrapper[4763]: I1205 12:56:59.694058 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ce26da1-6899-4082-a4ae-dc8491b70736-catalog-content\") pod \"community-operators-jcgz4\" (UID: \"2ce26da1-6899-4082-a4ae-dc8491b70736\") " pod="openshift-marketplace/community-operators-jcgz4"
Dec 05 12:56:59 crc kubenswrapper[4763]: I1205 12:56:59.694122 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgt8x\" (UniqueName: \"kubernetes.io/projected/2ce26da1-6899-4082-a4ae-dc8491b70736-kube-api-access-xgt8x\") pod \"community-operators-jcgz4\" (UID: \"2ce26da1-6899-4082-a4ae-dc8491b70736\") " pod="openshift-marketplace/community-operators-jcgz4"
Dec 05 12:56:59 crc kubenswrapper[4763]: I1205 12:56:59.694154 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ce26da1-6899-4082-a4ae-dc8491b70736-utilities\") pod \"community-operators-jcgz4\" (UID: \"2ce26da1-6899-4082-a4ae-dc8491b70736\") " pod="openshift-marketplace/community-operators-jcgz4"
Dec 05 12:56:59 crc kubenswrapper[4763]: I1205 12:56:59.694784 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ce26da1-6899-4082-a4ae-dc8491b70736-catalog-content\") pod \"community-operators-jcgz4\" (UID: \"2ce26da1-6899-4082-a4ae-dc8491b70736\") " pod="openshift-marketplace/community-operators-jcgz4"
Dec 05 12:56:59 crc kubenswrapper[4763]: I1205 12:56:59.694854 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ce26da1-6899-4082-a4ae-dc8491b70736-utilities\") pod \"community-operators-jcgz4\" (UID: \"2ce26da1-6899-4082-a4ae-dc8491b70736\") " pod="openshift-marketplace/community-operators-jcgz4"
Dec 05 12:56:59 crc kubenswrapper[4763]: I1205 12:56:59.714971 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgt8x\" (UniqueName: \"kubernetes.io/projected/2ce26da1-6899-4082-a4ae-dc8491b70736-kube-api-access-xgt8x\") pod \"community-operators-jcgz4\" (UID: \"2ce26da1-6899-4082-a4ae-dc8491b70736\") " pod="openshift-marketplace/community-operators-jcgz4"
Dec 05 12:56:59 crc kubenswrapper[4763]: I1205 12:56:59.849687 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jcgz4"
Dec 05 12:57:00 crc kubenswrapper[4763]: I1205 12:57:00.365875 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jcgz4"]
Dec 05 12:57:00 crc kubenswrapper[4763]: I1205 12:57:00.550106 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jcgz4" event={"ID":"2ce26da1-6899-4082-a4ae-dc8491b70736","Type":"ContainerStarted","Data":"ea8790f9d48c47edad5827073667358d02fde9ce748ee52c7a68f5535abfea15"}
Dec 05 12:57:01 crc kubenswrapper[4763]: I1205 12:57:01.560471 4763 generic.go:334] "Generic (PLEG): container finished" podID="2ce26da1-6899-4082-a4ae-dc8491b70736" containerID="f29bc82758239beabc2639f2988fe52a3faedc1d2cca8284f01c121cf0dacc8d" exitCode=0
Dec 05 12:57:01 crc kubenswrapper[4763]: I1205 12:57:01.560562 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jcgz4" event={"ID":"2ce26da1-6899-4082-a4ae-dc8491b70736","Type":"ContainerDied","Data":"f29bc82758239beabc2639f2988fe52a3faedc1d2cca8284f01c121cf0dacc8d"}
Dec 05 12:57:04 crc kubenswrapper[4763]: I1205 12:57:04.593511 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jcgz4" event={"ID":"2ce26da1-6899-4082-a4ae-dc8491b70736","Type":"ContainerStarted","Data":"183aa23b2a33002502aa0b1bd8ba7741c1bfc305b22188afb6caa1cc427ec6c0"}
Dec 05 12:57:05 crc kubenswrapper[4763]: I1205 12:57:05.605756 4763 generic.go:334] "Generic (PLEG): container finished" podID="2ce26da1-6899-4082-a4ae-dc8491b70736" containerID="183aa23b2a33002502aa0b1bd8ba7741c1bfc305b22188afb6caa1cc427ec6c0" exitCode=0
Dec 05 12:57:05 crc kubenswrapper[4763]: I1205 12:57:05.605821 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jcgz4" event={"ID":"2ce26da1-6899-4082-a4ae-dc8491b70736","Type":"ContainerDied","Data":"183aa23b2a33002502aa0b1bd8ba7741c1bfc305b22188afb6caa1cc427ec6c0"}
Dec 05 12:57:07 crc kubenswrapper[4763]: I1205 12:57:07.635136 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jcgz4" event={"ID":"2ce26da1-6899-4082-a4ae-dc8491b70736","Type":"ContainerStarted","Data":"10522c7eea5b2f7a77d3c72729cec3fca03b434e41db116dee188364d725b3aa"}
Dec 05 12:57:07 crc kubenswrapper[4763]: I1205 12:57:07.663581 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jcgz4" podStartSLOduration=4.739211146 podStartE2EDuration="8.663565645s" podCreationTimestamp="2025-12-05 12:56:59 +0000 UTC" firstStartedPulling="2025-12-05 12:57:02.571824541 +0000 UTC m=+4107.064539264" lastFinishedPulling="2025-12-05 12:57:06.49617904 +0000 UTC m=+4110.988893763" observedRunningTime="2025-12-05 12:57:07.660283307 +0000 UTC m=+4112.152998020" watchObservedRunningTime="2025-12-05 12:57:07.663565645 +0000 UTC m=+4112.156280378"
Dec 05 12:57:09 crc kubenswrapper[4763]: I1205 12:57:09.850103 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jcgz4"
Dec 05 12:57:09 crc kubenswrapper[4763]: I1205 12:57:09.850441 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jcgz4"
Dec 05 12:57:09 crc kubenswrapper[4763]: I1205 12:57:09.902046 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jcgz4"
Dec 05 12:57:15 crc kubenswrapper[4763]: I1205 12:57:15.774890 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-r79rv"]
Dec 05 12:57:15 crc kubenswrapper[4763]: I1205 12:57:15.778079 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r79rv"
Dec 05 12:57:15 crc kubenswrapper[4763]: I1205 12:57:15.794752 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r79rv"]
Dec 05 12:57:15 crc kubenswrapper[4763]: I1205 12:57:15.824982 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ead2a34-5bdd-4ec4-961a-009356417a8c-catalog-content\") pod \"redhat-marketplace-r79rv\" (UID: \"0ead2a34-5bdd-4ec4-961a-009356417a8c\") " pod="openshift-marketplace/redhat-marketplace-r79rv"
Dec 05 12:57:15 crc kubenswrapper[4763]: I1205 12:57:15.825455 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ead2a34-5bdd-4ec4-961a-009356417a8c-utilities\") pod \"redhat-marketplace-r79rv\" (UID: \"0ead2a34-5bdd-4ec4-961a-009356417a8c\") " pod="openshift-marketplace/redhat-marketplace-r79rv"
Dec 05 12:57:15 crc kubenswrapper[4763]: I1205 12:57:15.825698 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f786d\" (UniqueName: \"kubernetes.io/projected/0ead2a34-5bdd-4ec4-961a-009356417a8c-kube-api-access-f786d\") pod \"redhat-marketplace-r79rv\" (UID: \"0ead2a34-5bdd-4ec4-961a-009356417a8c\") " pod="openshift-marketplace/redhat-marketplace-r79rv"
Dec 05 12:57:15 crc kubenswrapper[4763]: I1205 12:57:15.928342 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ead2a34-5bdd-4ec4-961a-009356417a8c-catalog-content\") pod \"redhat-marketplace-r79rv\" (UID: \"0ead2a34-5bdd-4ec4-961a-009356417a8c\") " pod="openshift-marketplace/redhat-marketplace-r79rv"
Dec 05 12:57:15 crc kubenswrapper[4763]: I1205 12:57:15.928392 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ead2a34-5bdd-4ec4-961a-009356417a8c-utilities\") pod \"redhat-marketplace-r79rv\" (UID: \"0ead2a34-5bdd-4ec4-961a-009356417a8c\") " pod="openshift-marketplace/redhat-marketplace-r79rv"
Dec 05 12:57:15 crc kubenswrapper[4763]: I1205 12:57:15.928423 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f786d\" (UniqueName: \"kubernetes.io/projected/0ead2a34-5bdd-4ec4-961a-009356417a8c-kube-api-access-f786d\") pod \"redhat-marketplace-r79rv\" (UID: \"0ead2a34-5bdd-4ec4-961a-009356417a8c\") " pod="openshift-marketplace/redhat-marketplace-r79rv"
Dec 05 12:57:15 crc kubenswrapper[4763]: I1205 12:57:15.928940 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ead2a34-5bdd-4ec4-961a-009356417a8c-catalog-content\") pod \"redhat-marketplace-r79rv\" (UID: \"0ead2a34-5bdd-4ec4-961a-009356417a8c\") " pod="openshift-marketplace/redhat-marketplace-r79rv"
Dec 05 12:57:15 crc kubenswrapper[4763]: I1205 12:57:15.929014 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ead2a34-5bdd-4ec4-961a-009356417a8c-utilities\") pod \"redhat-marketplace-r79rv\" (UID: \"0ead2a34-5bdd-4ec4-961a-009356417a8c\") " pod="openshift-marketplace/redhat-marketplace-r79rv"
Dec 05 12:57:15 crc kubenswrapper[4763]: I1205 12:57:15.947903 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f786d\" (UniqueName: \"kubernetes.io/projected/0ead2a34-5bdd-4ec4-961a-009356417a8c-kube-api-access-f786d\") pod \"redhat-marketplace-r79rv\" (UID: \"0ead2a34-5bdd-4ec4-961a-009356417a8c\") " pod="openshift-marketplace/redhat-marketplace-r79rv"
Dec 05 12:57:16 crc kubenswrapper[4763]: I1205 12:57:16.112153 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r79rv"
Dec 05 12:57:17 crc kubenswrapper[4763]: I1205 12:57:17.369692 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r79rv"]
Dec 05 12:57:17 crc kubenswrapper[4763]: I1205 12:57:17.754330 4763 generic.go:334] "Generic (PLEG): container finished" podID="0ead2a34-5bdd-4ec4-961a-009356417a8c" containerID="8d5eb4aea868f393f8b4fadfac153d299bb17b5dc772db78185b04a27c4b907e" exitCode=0
Dec 05 12:57:17 crc kubenswrapper[4763]: I1205 12:57:17.754378 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r79rv" event={"ID":"0ead2a34-5bdd-4ec4-961a-009356417a8c","Type":"ContainerDied","Data":"8d5eb4aea868f393f8b4fadfac153d299bb17b5dc772db78185b04a27c4b907e"}
Dec 05 12:57:17 crc kubenswrapper[4763]: I1205 12:57:17.754406 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r79rv" event={"ID":"0ead2a34-5bdd-4ec4-961a-009356417a8c","Type":"ContainerStarted","Data":"e3731bb2cc40f5c90ec41539a841802686c7e360a1889d922d8d9feca116f29d"}
Dec 05 12:57:19 crc kubenswrapper[4763]: I1205 12:57:19.776922 4763 generic.go:334] "Generic (PLEG): container finished" podID="0ead2a34-5bdd-4ec4-961a-009356417a8c" containerID="210c45b463240c04154c6ba47e355fea6f5f7a7a49f4825a7fa36918df1b345e" exitCode=0
Dec 05 12:57:19 crc kubenswrapper[4763]: I1205 12:57:19.777022 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r79rv" event={"ID":"0ead2a34-5bdd-4ec4-961a-009356417a8c","Type":"ContainerDied","Data":"210c45b463240c04154c6ba47e355fea6f5f7a7a49f4825a7fa36918df1b345e"}
Dec 05 12:57:19 crc kubenswrapper[4763]: I1205 12:57:19.904153 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jcgz4"
Dec 05 12:57:20 crc kubenswrapper[4763]: I1205 12:57:20.788411 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r79rv" event={"ID":"0ead2a34-5bdd-4ec4-961a-009356417a8c","Type":"ContainerStarted","Data":"aa0c9184dc5ece2ad3e43d8803b86e1662f83b70a645f125ff9e2a4f4085b804"}
Dec 05 12:57:20 crc kubenswrapper[4763]: I1205 12:57:20.817540 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-r79rv" podStartSLOduration=3.4055072920000002 podStartE2EDuration="5.817516094s" podCreationTimestamp="2025-12-05 12:57:15 +0000 UTC" firstStartedPulling="2025-12-05 12:57:17.756608049 +0000 UTC m=+4122.249322772" lastFinishedPulling="2025-12-05 12:57:20.168616831 +0000 UTC m=+4124.661331574" observedRunningTime="2025-12-05 12:57:20.815329245 +0000 UTC m=+4125.308043978" watchObservedRunningTime="2025-12-05 12:57:20.817516094 +0000 UTC m=+4125.310230817"
Dec 05 12:57:22 crc kubenswrapper[4763]: I1205 12:57:22.175468 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jcgz4"]
Dec 05 12:57:22 crc kubenswrapper[4763]: I1205 12:57:22.176251 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jcgz4" podUID="2ce26da1-6899-4082-a4ae-dc8491b70736" containerName="registry-server" containerID="cri-o://10522c7eea5b2f7a77d3c72729cec3fca03b434e41db116dee188364d725b3aa" gracePeriod=2
Dec 05 12:57:22 crc kubenswrapper[4763]: I1205 12:57:22.771210 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jcgz4"
Dec 05 12:57:22 crc kubenswrapper[4763]: I1205 12:57:22.823476 4763 generic.go:334] "Generic (PLEG): container finished" podID="2ce26da1-6899-4082-a4ae-dc8491b70736" containerID="10522c7eea5b2f7a77d3c72729cec3fca03b434e41db116dee188364d725b3aa" exitCode=0
Dec 05 12:57:22 crc kubenswrapper[4763]: I1205 12:57:22.823657 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jcgz4" event={"ID":"2ce26da1-6899-4082-a4ae-dc8491b70736","Type":"ContainerDied","Data":"10522c7eea5b2f7a77d3c72729cec3fca03b434e41db116dee188364d725b3aa"}
Dec 05 12:57:22 crc kubenswrapper[4763]: I1205 12:57:22.823676 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jcgz4"
Dec 05 12:57:22 crc kubenswrapper[4763]: I1205 12:57:22.823853 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jcgz4" event={"ID":"2ce26da1-6899-4082-a4ae-dc8491b70736","Type":"ContainerDied","Data":"ea8790f9d48c47edad5827073667358d02fde9ce748ee52c7a68f5535abfea15"}
Dec 05 12:57:22 crc kubenswrapper[4763]: I1205 12:57:22.823878 4763 scope.go:117] "RemoveContainer" containerID="10522c7eea5b2f7a77d3c72729cec3fca03b434e41db116dee188364d725b3aa"
Dec 05 12:57:22 crc kubenswrapper[4763]: I1205 12:57:22.855399 4763 scope.go:117] "RemoveContainer" containerID="183aa23b2a33002502aa0b1bd8ba7741c1bfc305b22188afb6caa1cc427ec6c0"
Dec 05 12:57:22 crc kubenswrapper[4763]: I1205 12:57:22.898385 4763 scope.go:117] "RemoveContainer" containerID="f29bc82758239beabc2639f2988fe52a3faedc1d2cca8284f01c121cf0dacc8d"
Dec 05 12:57:22 crc kubenswrapper[4763]: I1205 12:57:22.915417 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ce26da1-6899-4082-a4ae-dc8491b70736-catalog-content\") pod \"2ce26da1-6899-4082-a4ae-dc8491b70736\" (UID: \"2ce26da1-6899-4082-a4ae-dc8491b70736\") "
Dec 05 12:57:22 crc kubenswrapper[4763]: I1205 12:57:22.915630 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgt8x\" (UniqueName: \"kubernetes.io/projected/2ce26da1-6899-4082-a4ae-dc8491b70736-kube-api-access-xgt8x\") pod \"2ce26da1-6899-4082-a4ae-dc8491b70736\" (UID: \"2ce26da1-6899-4082-a4ae-dc8491b70736\") "
Dec 05 12:57:22 crc kubenswrapper[4763]: I1205 12:57:22.915673 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ce26da1-6899-4082-a4ae-dc8491b70736-utilities\") pod \"2ce26da1-6899-4082-a4ae-dc8491b70736\" (UID: \"2ce26da1-6899-4082-a4ae-dc8491b70736\") "
Dec 05 12:57:22 crc kubenswrapper[4763]: I1205 12:57:22.916473 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ce26da1-6899-4082-a4ae-dc8491b70736-utilities" (OuterVolumeSpecName: "utilities") pod "2ce26da1-6899-4082-a4ae-dc8491b70736" (UID: "2ce26da1-6899-4082-a4ae-dc8491b70736"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 12:57:22 crc kubenswrapper[4763]: I1205 12:57:22.923385 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ce26da1-6899-4082-a4ae-dc8491b70736-kube-api-access-xgt8x" (OuterVolumeSpecName: "kube-api-access-xgt8x") pod "2ce26da1-6899-4082-a4ae-dc8491b70736" (UID: "2ce26da1-6899-4082-a4ae-dc8491b70736"). InnerVolumeSpecName "kube-api-access-xgt8x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 12:57:22 crc kubenswrapper[4763]: I1205 12:57:22.928728 4763 scope.go:117] "RemoveContainer" containerID="10522c7eea5b2f7a77d3c72729cec3fca03b434e41db116dee188364d725b3aa"
Dec 05 12:57:22 crc kubenswrapper[4763]: E1205 12:57:22.929195 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10522c7eea5b2f7a77d3c72729cec3fca03b434e41db116dee188364d725b3aa\": container with ID starting with 10522c7eea5b2f7a77d3c72729cec3fca03b434e41db116dee188364d725b3aa not found: ID does not exist" containerID="10522c7eea5b2f7a77d3c72729cec3fca03b434e41db116dee188364d725b3aa"
Dec 05 12:57:22 crc kubenswrapper[4763]: I1205 12:57:22.929250 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10522c7eea5b2f7a77d3c72729cec3fca03b434e41db116dee188364d725b3aa"} err="failed to get container status \"10522c7eea5b2f7a77d3c72729cec3fca03b434e41db116dee188364d725b3aa\": rpc error: code = NotFound desc = could not find container \"10522c7eea5b2f7a77d3c72729cec3fca03b434e41db116dee188364d725b3aa\": container with ID starting with 10522c7eea5b2f7a77d3c72729cec3fca03b434e41db116dee188364d725b3aa not found: ID does not exist"
Dec 05 12:57:22 crc kubenswrapper[4763]: I1205 12:57:22.929285 4763 scope.go:117] "RemoveContainer" containerID="183aa23b2a33002502aa0b1bd8ba7741c1bfc305b22188afb6caa1cc427ec6c0"
Dec 05 12:57:22 crc kubenswrapper[4763]: E1205 12:57:22.929853 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"183aa23b2a33002502aa0b1bd8ba7741c1bfc305b22188afb6caa1cc427ec6c0\": container with ID starting with 183aa23b2a33002502aa0b1bd8ba7741c1bfc305b22188afb6caa1cc427ec6c0 not found: ID does not exist" containerID="183aa23b2a33002502aa0b1bd8ba7741c1bfc305b22188afb6caa1cc427ec6c0"
Dec 05 12:57:22 crc kubenswrapper[4763]: I1205 12:57:22.929888 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"183aa23b2a33002502aa0b1bd8ba7741c1bfc305b22188afb6caa1cc427ec6c0"} err="failed to get container status \"183aa23b2a33002502aa0b1bd8ba7741c1bfc305b22188afb6caa1cc427ec6c0\": rpc error: code = NotFound desc = could not find container \"183aa23b2a33002502aa0b1bd8ba7741c1bfc305b22188afb6caa1cc427ec6c0\": container with ID starting with 183aa23b2a33002502aa0b1bd8ba7741c1bfc305b22188afb6caa1cc427ec6c0 not found: ID does not exist"
Dec 05 12:57:22 crc kubenswrapper[4763]: I1205 12:57:22.929908 4763 scope.go:117] "RemoveContainer" containerID="f29bc82758239beabc2639f2988fe52a3faedc1d2cca8284f01c121cf0dacc8d"
Dec 05 12:57:22 crc kubenswrapper[4763]: E1205 12:57:22.930386 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f29bc82758239beabc2639f2988fe52a3faedc1d2cca8284f01c121cf0dacc8d\": container with ID starting with f29bc82758239beabc2639f2988fe52a3faedc1d2cca8284f01c121cf0dacc8d not found: ID does not exist" containerID="f29bc82758239beabc2639f2988fe52a3faedc1d2cca8284f01c121cf0dacc8d"
Dec 05 12:57:22 crc kubenswrapper[4763]: I1205 12:57:22.930441 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f29bc82758239beabc2639f2988fe52a3faedc1d2cca8284f01c121cf0dacc8d"} err="failed to get container status \"f29bc82758239beabc2639f2988fe52a3faedc1d2cca8284f01c121cf0dacc8d\": rpc error: code = NotFound desc = could not find container \"f29bc82758239beabc2639f2988fe52a3faedc1d2cca8284f01c121cf0dacc8d\": container with ID starting with f29bc82758239beabc2639f2988fe52a3faedc1d2cca8284f01c121cf0dacc8d not found: ID does not exist"
Dec 05 12:57:22 crc kubenswrapper[4763]: I1205 12:57:22.972371 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ce26da1-6899-4082-a4ae-dc8491b70736-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ce26da1-6899-4082-a4ae-dc8491b70736" (UID: "2ce26da1-6899-4082-a4ae-dc8491b70736"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 12:57:23 crc kubenswrapper[4763]: I1205 12:57:23.018087 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgt8x\" (UniqueName: \"kubernetes.io/projected/2ce26da1-6899-4082-a4ae-dc8491b70736-kube-api-access-xgt8x\") on node \"crc\" DevicePath \"\""
Dec 05 12:57:23 crc kubenswrapper[4763]: I1205 12:57:23.018132 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ce26da1-6899-4082-a4ae-dc8491b70736-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 12:57:23 crc kubenswrapper[4763]: I1205 12:57:23.018141 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ce26da1-6899-4082-a4ae-dc8491b70736-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 12:57:23 crc kubenswrapper[4763]: I1205 12:57:23.162410 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jcgz4"]
Dec 05 12:57:23 crc kubenswrapper[4763]: I1205 12:57:23.171033 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jcgz4"]
Dec 05 12:57:23 crc kubenswrapper[4763]: I1205 12:57:23.795735 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ce26da1-6899-4082-a4ae-dc8491b70736" path="/var/lib/kubelet/pods/2ce26da1-6899-4082-a4ae-dc8491b70736/volumes"
Dec 05 12:57:26 crc kubenswrapper[4763]: I1205 12:57:26.113124 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-r79rv"
Dec 05 12:57:26 crc kubenswrapper[4763]: I1205 12:57:26.113186 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-r79rv"
Dec 05 12:57:26 crc kubenswrapper[4763]: I1205 12:57:26.160862 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-r79rv"
Dec 05 12:57:26 crc kubenswrapper[4763]: I1205 12:57:26.958939 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-r79rv"
Dec 05 12:57:27 crc kubenswrapper[4763]: I1205 12:57:27.571483 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r79rv"]
Dec 05 12:57:28 crc kubenswrapper[4763]: I1205 12:57:28.885102 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-r79rv" podUID="0ead2a34-5bdd-4ec4-961a-009356417a8c" containerName="registry-server" containerID="cri-o://aa0c9184dc5ece2ad3e43d8803b86e1662f83b70a645f125ff9e2a4f4085b804" gracePeriod=2
Dec 05 12:57:29 crc kubenswrapper[4763]: I1205 12:57:29.484353 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r79rv"
Dec 05 12:57:29 crc kubenswrapper[4763]: I1205 12:57:29.658429 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ead2a34-5bdd-4ec4-961a-009356417a8c-catalog-content\") pod \"0ead2a34-5bdd-4ec4-961a-009356417a8c\" (UID: \"0ead2a34-5bdd-4ec4-961a-009356417a8c\") "
Dec 05 12:57:29 crc kubenswrapper[4763]: I1205 12:57:29.658574 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f786d\" (UniqueName: \"kubernetes.io/projected/0ead2a34-5bdd-4ec4-961a-009356417a8c-kube-api-access-f786d\") pod \"0ead2a34-5bdd-4ec4-961a-009356417a8c\" (UID: \"0ead2a34-5bdd-4ec4-961a-009356417a8c\") "
Dec 05 12:57:29 crc kubenswrapper[4763]: I1205 12:57:29.658641 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ead2a34-5bdd-4ec4-961a-009356417a8c-utilities\") pod \"0ead2a34-5bdd-4ec4-961a-009356417a8c\" (UID: \"0ead2a34-5bdd-4ec4-961a-009356417a8c\") "
Dec 05 12:57:29 crc kubenswrapper[4763]: I1205 12:57:29.660324 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ead2a34-5bdd-4ec4-961a-009356417a8c-utilities" (OuterVolumeSpecName: "utilities") pod "0ead2a34-5bdd-4ec4-961a-009356417a8c" (UID: "0ead2a34-5bdd-4ec4-961a-009356417a8c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 12:57:29 crc kubenswrapper[4763]: I1205 12:57:29.664167 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ead2a34-5bdd-4ec4-961a-009356417a8c-kube-api-access-f786d" (OuterVolumeSpecName: "kube-api-access-f786d") pod "0ead2a34-5bdd-4ec4-961a-009356417a8c" (UID: "0ead2a34-5bdd-4ec4-961a-009356417a8c"). InnerVolumeSpecName "kube-api-access-f786d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 12:57:29 crc kubenswrapper[4763]: I1205 12:57:29.677771 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ead2a34-5bdd-4ec4-961a-009356417a8c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0ead2a34-5bdd-4ec4-961a-009356417a8c" (UID: "0ead2a34-5bdd-4ec4-961a-009356417a8c"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:57:29 crc kubenswrapper[4763]: I1205 12:57:29.760912 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ead2a34-5bdd-4ec4-961a-009356417a8c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 12:57:29 crc kubenswrapper[4763]: I1205 12:57:29.760962 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f786d\" (UniqueName: \"kubernetes.io/projected/0ead2a34-5bdd-4ec4-961a-009356417a8c-kube-api-access-f786d\") on node \"crc\" DevicePath \"\"" Dec 05 12:57:29 crc kubenswrapper[4763]: I1205 12:57:29.760975 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ead2a34-5bdd-4ec4-961a-009356417a8c-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 12:57:29 crc kubenswrapper[4763]: I1205 12:57:29.896508 4763 generic.go:334] "Generic (PLEG): container finished" podID="0ead2a34-5bdd-4ec4-961a-009356417a8c" containerID="aa0c9184dc5ece2ad3e43d8803b86e1662f83b70a645f125ff9e2a4f4085b804" exitCode=0 Dec 05 12:57:29 crc kubenswrapper[4763]: I1205 12:57:29.896565 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r79rv" event={"ID":"0ead2a34-5bdd-4ec4-961a-009356417a8c","Type":"ContainerDied","Data":"aa0c9184dc5ece2ad3e43d8803b86e1662f83b70a645f125ff9e2a4f4085b804"} Dec 05 12:57:29 crc kubenswrapper[4763]: I1205 12:57:29.896591 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r79rv" Dec 05 12:57:29 crc kubenswrapper[4763]: I1205 12:57:29.896616 4763 scope.go:117] "RemoveContainer" containerID="aa0c9184dc5ece2ad3e43d8803b86e1662f83b70a645f125ff9e2a4f4085b804" Dec 05 12:57:29 crc kubenswrapper[4763]: I1205 12:57:29.896601 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r79rv" event={"ID":"0ead2a34-5bdd-4ec4-961a-009356417a8c","Type":"ContainerDied","Data":"e3731bb2cc40f5c90ec41539a841802686c7e360a1889d922d8d9feca116f29d"} Dec 05 12:57:29 crc kubenswrapper[4763]: I1205 12:57:29.933841 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r79rv"] Dec 05 12:57:29 crc kubenswrapper[4763]: I1205 12:57:29.938710 4763 scope.go:117] "RemoveContainer" containerID="210c45b463240c04154c6ba47e355fea6f5f7a7a49f4825a7fa36918df1b345e" Dec 05 12:57:29 crc kubenswrapper[4763]: I1205 12:57:29.946247 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-r79rv"] Dec 05 12:57:29 crc kubenswrapper[4763]: I1205 12:57:29.970847 4763 scope.go:117] "RemoveContainer" containerID="8d5eb4aea868f393f8b4fadfac153d299bb17b5dc772db78185b04a27c4b907e" Dec 05 12:57:30 crc kubenswrapper[4763]: I1205 12:57:30.529500 4763 scope.go:117] "RemoveContainer" containerID="aa0c9184dc5ece2ad3e43d8803b86e1662f83b70a645f125ff9e2a4f4085b804" Dec 05 12:57:30 crc kubenswrapper[4763]: E1205 12:57:30.530230 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa0c9184dc5ece2ad3e43d8803b86e1662f83b70a645f125ff9e2a4f4085b804\": container with ID starting with aa0c9184dc5ece2ad3e43d8803b86e1662f83b70a645f125ff9e2a4f4085b804 not found: ID does not exist" containerID="aa0c9184dc5ece2ad3e43d8803b86e1662f83b70a645f125ff9e2a4f4085b804" Dec 05 12:57:30 crc kubenswrapper[4763]: I1205 12:57:30.530284 4763 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa0c9184dc5ece2ad3e43d8803b86e1662f83b70a645f125ff9e2a4f4085b804"} err="failed to get container status \"aa0c9184dc5ece2ad3e43d8803b86e1662f83b70a645f125ff9e2a4f4085b804\": rpc error: code = NotFound desc = could not find container \"aa0c9184dc5ece2ad3e43d8803b86e1662f83b70a645f125ff9e2a4f4085b804\": container with ID starting with aa0c9184dc5ece2ad3e43d8803b86e1662f83b70a645f125ff9e2a4f4085b804 not found: ID does not exist" Dec 05 12:57:30 crc kubenswrapper[4763]: I1205 12:57:30.530322 4763 scope.go:117] "RemoveContainer" containerID="210c45b463240c04154c6ba47e355fea6f5f7a7a49f4825a7fa36918df1b345e" Dec 05 12:57:30 crc kubenswrapper[4763]: E1205 12:57:30.530703 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"210c45b463240c04154c6ba47e355fea6f5f7a7a49f4825a7fa36918df1b345e\": container with ID starting with 210c45b463240c04154c6ba47e355fea6f5f7a7a49f4825a7fa36918df1b345e not found: ID does not exist" containerID="210c45b463240c04154c6ba47e355fea6f5f7a7a49f4825a7fa36918df1b345e" Dec 05 12:57:30 crc kubenswrapper[4763]: I1205 12:57:30.530744 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"210c45b463240c04154c6ba47e355fea6f5f7a7a49f4825a7fa36918df1b345e"} err="failed to get container status \"210c45b463240c04154c6ba47e355fea6f5f7a7a49f4825a7fa36918df1b345e\": rpc error: code = NotFound desc = could not find container \"210c45b463240c04154c6ba47e355fea6f5f7a7a49f4825a7fa36918df1b345e\": container with ID starting with 210c45b463240c04154c6ba47e355fea6f5f7a7a49f4825a7fa36918df1b345e not found: ID does not exist" Dec 05 12:57:30 crc kubenswrapper[4763]: I1205 12:57:30.530831 4763 scope.go:117] "RemoveContainer" containerID="8d5eb4aea868f393f8b4fadfac153d299bb17b5dc772db78185b04a27c4b907e" Dec 05 12:57:30 crc kubenswrapper[4763]: E1205 12:57:30.531169 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d5eb4aea868f393f8b4fadfac153d299bb17b5dc772db78185b04a27c4b907e\": container with ID starting with 8d5eb4aea868f393f8b4fadfac153d299bb17b5dc772db78185b04a27c4b907e not found: ID does not exist" containerID="8d5eb4aea868f393f8b4fadfac153d299bb17b5dc772db78185b04a27c4b907e" Dec 05 12:57:30 crc kubenswrapper[4763]: I1205 12:57:30.531209 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d5eb4aea868f393f8b4fadfac153d299bb17b5dc772db78185b04a27c4b907e"} err="failed to get container status \"8d5eb4aea868f393f8b4fadfac153d299bb17b5dc772db78185b04a27c4b907e\": rpc error: code = NotFound desc = could not find container \"8d5eb4aea868f393f8b4fadfac153d299bb17b5dc772db78185b04a27c4b907e\": container with ID starting with 8d5eb4aea868f393f8b4fadfac153d299bb17b5dc772db78185b04a27c4b907e not found: ID does not exist" Dec 05 12:57:31 crc kubenswrapper[4763]: I1205 12:57:31.800397 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ead2a34-5bdd-4ec4-961a-009356417a8c" path="/var/lib/kubelet/pods/0ead2a34-5bdd-4ec4-961a-009356417a8c/volumes" Dec 05 12:57:37 crc kubenswrapper[4763]: I1205 12:57:37.544325 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 12:57:37 crc kubenswrapper[4763]: I1205 12:57:37.544972 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 12:57:51 crc kubenswrapper[4763]: I1205 12:57:51.988461 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-q9pr7"] Dec 05 12:57:51 crc kubenswrapper[4763]: E1205 12:57:51.989741 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ead2a34-5bdd-4ec4-961a-009356417a8c" containerName="extract-utilities" Dec 05 12:57:51 crc kubenswrapper[4763]: I1205 12:57:51.989789 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ead2a34-5bdd-4ec4-961a-009356417a8c" containerName="extract-utilities" Dec 05 12:57:51 crc kubenswrapper[4763]: E1205 12:57:51.989825 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ead2a34-5bdd-4ec4-961a-009356417a8c" containerName="extract-content" Dec 05 12:57:51 crc kubenswrapper[4763]: I1205 12:57:51.989835 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ead2a34-5bdd-4ec4-961a-009356417a8c" containerName="extract-content" Dec 05 12:57:51 crc kubenswrapper[4763]: E1205 12:57:51.989861 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ce26da1-6899-4082-a4ae-dc8491b70736" containerName="extract-utilities" Dec 05 12:57:51 crc kubenswrapper[4763]: I1205 12:57:51.989871 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ce26da1-6899-4082-a4ae-dc8491b70736" containerName="extract-utilities" Dec 05 12:57:51 crc kubenswrapper[4763]: E1205 12:57:51.989904 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ce26da1-6899-4082-a4ae-dc8491b70736" containerName="registry-server" Dec 05 12:57:51 crc kubenswrapper[4763]: I1205 12:57:51.989915 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ce26da1-6899-4082-a4ae-dc8491b70736" containerName="registry-server" Dec 05 12:57:51 crc kubenswrapper[4763]: E1205 12:57:51.989930 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ce26da1-6899-4082-a4ae-dc8491b70736" containerName="extract-content" Dec 05 12:57:51 crc kubenswrapper[4763]: I1205 12:57:51.989939 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ce26da1-6899-4082-a4ae-dc8491b70736" containerName="extract-content" Dec 05 12:57:51 crc kubenswrapper[4763]: E1205 12:57:51.989963 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ead2a34-5bdd-4ec4-961a-009356417a8c" containerName="registry-server" Dec 05 12:57:51 crc kubenswrapper[4763]: I1205 12:57:51.989973 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ead2a34-5bdd-4ec4-961a-009356417a8c" containerName="registry-server" Dec 05 12:57:51 crc kubenswrapper[4763]: I1205 12:57:51.990272 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ce26da1-6899-4082-a4ae-dc8491b70736" containerName="registry-server" Dec 05 12:57:51 crc kubenswrapper[4763]: I1205 12:57:51.990298 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ead2a34-5bdd-4ec4-961a-009356417a8c" containerName="registry-server" Dec 05 12:57:51 crc kubenswrapper[4763]: I1205 12:57:51.992607 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q9pr7" Dec 05 12:57:52 crc kubenswrapper[4763]: I1205 12:57:52.004506 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q9pr7"] Dec 05 12:57:52 crc kubenswrapper[4763]: I1205 12:57:52.122594 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw4zm\" (UniqueName: \"kubernetes.io/projected/ab269999-672c-4d37-8eb7-613306895437-kube-api-access-zw4zm\") pod \"certified-operators-q9pr7\" (UID: \"ab269999-672c-4d37-8eb7-613306895437\") " pod="openshift-marketplace/certified-operators-q9pr7" Dec 05 12:57:52 crc kubenswrapper[4763]: I1205 12:57:52.122667 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab269999-672c-4d37-8eb7-613306895437-catalog-content\") pod \"certified-operators-q9pr7\" (UID: \"ab269999-672c-4d37-8eb7-613306895437\") " pod="openshift-marketplace/certified-operators-q9pr7" Dec 05 12:57:52 crc kubenswrapper[4763]: I1205 12:57:52.122690 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab269999-672c-4d37-8eb7-613306895437-utilities\") pod \"certified-operators-q9pr7\" (UID: \"ab269999-672c-4d37-8eb7-613306895437\") " pod="openshift-marketplace/certified-operators-q9pr7" Dec 05 12:57:52 crc kubenswrapper[4763]: I1205 12:57:52.225184 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw4zm\" (UniqueName: \"kubernetes.io/projected/ab269999-672c-4d37-8eb7-613306895437-kube-api-access-zw4zm\") pod \"certified-operators-q9pr7\" (UID: \"ab269999-672c-4d37-8eb7-613306895437\") " pod="openshift-marketplace/certified-operators-q9pr7" Dec 05 12:57:52 crc kubenswrapper[4763]: I1205 12:57:52.225247 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab269999-672c-4d37-8eb7-613306895437-catalog-content\") pod \"certified-operators-q9pr7\" (UID: \"ab269999-672c-4d37-8eb7-613306895437\") " pod="openshift-marketplace/certified-operators-q9pr7" Dec 05 12:57:52 crc kubenswrapper[4763]: I1205 12:57:52.225274 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab269999-672c-4d37-8eb7-613306895437-utilities\") pod \"certified-operators-q9pr7\" (UID: \"ab269999-672c-4d37-8eb7-613306895437\") " pod="openshift-marketplace/certified-operators-q9pr7" Dec 05 12:57:52 crc kubenswrapper[4763]: I1205 12:57:52.226009 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab269999-672c-4d37-8eb7-613306895437-utilities\") pod \"certified-operators-q9pr7\" (UID: \"ab269999-672c-4d37-8eb7-613306895437\") " pod="openshift-marketplace/certified-operators-q9pr7" Dec 05 12:57:52 crc kubenswrapper[4763]: I1205 12:57:52.226001 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab269999-672c-4d37-8eb7-613306895437-catalog-content\") pod \"certified-operators-q9pr7\" (UID: \"ab269999-672c-4d37-8eb7-613306895437\") " pod="openshift-marketplace/certified-operators-q9pr7" Dec 05 12:57:52 crc kubenswrapper[4763]: I1205 12:57:52.249581 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-zw4zm\" (UniqueName: \"kubernetes.io/projected/ab269999-672c-4d37-8eb7-613306895437-kube-api-access-zw4zm\") pod \"certified-operators-q9pr7\" (UID: \"ab269999-672c-4d37-8eb7-613306895437\") " pod="openshift-marketplace/certified-operators-q9pr7" Dec 05 12:57:52 crc kubenswrapper[4763]: I1205 12:57:52.324348 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q9pr7" Dec 05 12:57:52 crc kubenswrapper[4763]: I1205 12:57:52.914060 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q9pr7"] Dec 05 12:57:54 crc kubenswrapper[4763]: I1205 12:57:54.168981 4763 generic.go:334] "Generic (PLEG): container finished" podID="ab269999-672c-4d37-8eb7-613306895437" containerID="87a5f170f245fe6c5bc19d5a4c5eda1bd743802bd97cb413e033e5e5d15c4177" exitCode=0 Dec 05 12:57:54 crc kubenswrapper[4763]: I1205 12:57:54.169029 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q9pr7" event={"ID":"ab269999-672c-4d37-8eb7-613306895437","Type":"ContainerDied","Data":"87a5f170f245fe6c5bc19d5a4c5eda1bd743802bd97cb413e033e5e5d15c4177"} Dec 05 12:57:54 crc kubenswrapper[4763]: I1205 12:57:54.170261 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q9pr7" event={"ID":"ab269999-672c-4d37-8eb7-613306895437","Type":"ContainerStarted","Data":"d773169d33bd098c7c122650c0c27d0ebd57c90d6880d9823df814d1ebaa986c"} Dec 05 12:57:56 crc kubenswrapper[4763]: I1205 12:57:56.201674 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q9pr7" event={"ID":"ab269999-672c-4d37-8eb7-613306895437","Type":"ContainerStarted","Data":"18f3622a29c422efa84ca7c4698e7d64f794bf75028431f24854b75b9646807d"} Dec 05 12:57:57 crc kubenswrapper[4763]: I1205 12:57:57.218015 4763 generic.go:334] "Generic (PLEG): container finished" podID="ab269999-672c-4d37-8eb7-613306895437" containerID="18f3622a29c422efa84ca7c4698e7d64f794bf75028431f24854b75b9646807d" exitCode=0 Dec 05 12:57:57 crc kubenswrapper[4763]: I1205 12:57:57.218161 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q9pr7" event={"ID":"ab269999-672c-4d37-8eb7-613306895437","Type":"ContainerDied","Data":"18f3622a29c422efa84ca7c4698e7d64f794bf75028431f24854b75b9646807d"} Dec 05 12:58:03 crc kubenswrapper[4763]: I1205 12:58:03.287188 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q9pr7" event={"ID":"ab269999-672c-4d37-8eb7-613306895437","Type":"ContainerStarted","Data":"1ac7d5d5cf9e0464af6781685e4eaef3530a5cf9953154660bed2b45d9989e2f"} Dec 05 12:58:03 crc kubenswrapper[4763]: I1205 12:58:03.322053 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-q9pr7" podStartSLOduration=4.781363825 podStartE2EDuration="12.322025386s" podCreationTimestamp="2025-12-05 12:57:51 +0000 UTC" firstStartedPulling="2025-12-05 12:57:54.171456006 +0000 UTC m=+4158.664170729" lastFinishedPulling="2025-12-05 12:58:01.712117567 +0000 UTC m=+4166.204832290" observedRunningTime="2025-12-05 12:58:03.313725543 +0000 UTC m=+4167.806440286" watchObservedRunningTime="2025-12-05 12:58:03.322025386 +0000 UTC m=+4167.814740149" Dec 05 12:58:07 crc kubenswrapper[4763]: I1205 12:58:07.544031 4763 patch_prober.go:28] interesting 
pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 12:58:07 crc kubenswrapper[4763]: I1205 12:58:07.544744 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 12:58:12 crc kubenswrapper[4763]: I1205 12:58:12.324878 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-q9pr7" Dec 05 12:58:12 crc kubenswrapper[4763]: I1205 12:58:12.325662 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-q9pr7" Dec 05 12:58:12 crc kubenswrapper[4763]: I1205 12:58:12.408818 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-q9pr7" Dec 05 12:58:12 crc kubenswrapper[4763]: I1205 12:58:12.472502 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-q9pr7" Dec 05 12:58:12 crc kubenswrapper[4763]: I1205 12:58:12.666624 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q9pr7"] Dec 05 12:58:14 crc kubenswrapper[4763]: I1205 12:58:14.423741 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-q9pr7" podUID="ab269999-672c-4d37-8eb7-613306895437" containerName="registry-server" containerID="cri-o://1ac7d5d5cf9e0464af6781685e4eaef3530a5cf9953154660bed2b45d9989e2f" gracePeriod=2 Dec 05 12:58:15 crc kubenswrapper[4763]: I1205 12:58:15.038311 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q9pr7" Dec 05 12:58:15 crc kubenswrapper[4763]: I1205 12:58:15.139794 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab269999-672c-4d37-8eb7-613306895437-utilities\") pod \"ab269999-672c-4d37-8eb7-613306895437\" (UID: \"ab269999-672c-4d37-8eb7-613306895437\") " Dec 05 12:58:15 crc kubenswrapper[4763]: I1205 12:58:15.139849 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab269999-672c-4d37-8eb7-613306895437-catalog-content\") pod \"ab269999-672c-4d37-8eb7-613306895437\" (UID: \"ab269999-672c-4d37-8eb7-613306895437\") " Dec 05 12:58:15 crc kubenswrapper[4763]: I1205 12:58:15.140117 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zw4zm\" (UniqueName: \"kubernetes.io/projected/ab269999-672c-4d37-8eb7-613306895437-kube-api-access-zw4zm\") pod \"ab269999-672c-4d37-8eb7-613306895437\" (UID: \"ab269999-672c-4d37-8eb7-613306895437\") " Dec 05 12:58:15 crc kubenswrapper[4763]: I1205 12:58:15.141043 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab269999-672c-4d37-8eb7-613306895437-utilities" (OuterVolumeSpecName: "utilities") pod "ab269999-672c-4d37-8eb7-613306895437" (UID: "ab269999-672c-4d37-8eb7-613306895437"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:58:15 crc kubenswrapper[4763]: I1205 12:58:15.145862 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab269999-672c-4d37-8eb7-613306895437-kube-api-access-zw4zm" (OuterVolumeSpecName: "kube-api-access-zw4zm") pod "ab269999-672c-4d37-8eb7-613306895437" (UID: "ab269999-672c-4d37-8eb7-613306895437"). InnerVolumeSpecName "kube-api-access-zw4zm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:58:15 crc kubenswrapper[4763]: I1205 12:58:15.189419 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab269999-672c-4d37-8eb7-613306895437-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ab269999-672c-4d37-8eb7-613306895437" (UID: "ab269999-672c-4d37-8eb7-613306895437"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:58:15 crc kubenswrapper[4763]: I1205 12:58:15.242516 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zw4zm\" (UniqueName: \"kubernetes.io/projected/ab269999-672c-4d37-8eb7-613306895437-kube-api-access-zw4zm\") on node \"crc\" DevicePath \"\"" Dec 05 12:58:15 crc kubenswrapper[4763]: I1205 12:58:15.242549 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab269999-672c-4d37-8eb7-613306895437-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 12:58:15 crc kubenswrapper[4763]: I1205 12:58:15.242557 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab269999-672c-4d37-8eb7-613306895437-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 12:58:15 crc kubenswrapper[4763]: I1205 12:58:15.445044 4763 generic.go:334] "Generic (PLEG): container finished" podID="ab269999-672c-4d37-8eb7-613306895437" containerID="1ac7d5d5cf9e0464af6781685e4eaef3530a5cf9953154660bed2b45d9989e2f" exitCode=0 Dec 05 12:58:15 crc kubenswrapper[4763]: I1205 12:58:15.445085 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q9pr7" event={"ID":"ab269999-672c-4d37-8eb7-613306895437","Type":"ContainerDied","Data":"1ac7d5d5cf9e0464af6781685e4eaef3530a5cf9953154660bed2b45d9989e2f"} Dec 05 12:58:15 crc kubenswrapper[4763]: I1205 12:58:15.445112 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q9pr7" event={"ID":"ab269999-672c-4d37-8eb7-613306895437","Type":"ContainerDied","Data":"d773169d33bd098c7c122650c0c27d0ebd57c90d6880d9823df814d1ebaa986c"} Dec 05 12:58:15 crc kubenswrapper[4763]: I1205 12:58:15.445168 4763 scope.go:117] "RemoveContainer" containerID="1ac7d5d5cf9e0464af6781685e4eaef3530a5cf9953154660bed2b45d9989e2f" Dec 05 12:58:15 crc kubenswrapper[4763]: I1205 12:58:15.445192 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q9pr7" Dec 05 12:58:15 crc kubenswrapper[4763]: I1205 12:58:15.474980 4763 scope.go:117] "RemoveContainer" containerID="18f3622a29c422efa84ca7c4698e7d64f794bf75028431f24854b75b9646807d" Dec 05 12:58:15 crc kubenswrapper[4763]: I1205 12:58:15.500121 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q9pr7"] Dec 05 12:58:15 crc kubenswrapper[4763]: I1205 12:58:15.517330 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-q9pr7"] Dec 05 12:58:15 crc kubenswrapper[4763]: I1205 12:58:15.518072 4763 scope.go:117] "RemoveContainer" containerID="87a5f170f245fe6c5bc19d5a4c5eda1bd743802bd97cb413e033e5e5d15c4177" Dec 05 12:58:15 crc kubenswrapper[4763]: I1205 12:58:15.588306 4763 scope.go:117] "RemoveContainer" containerID="1ac7d5d5cf9e0464af6781685e4eaef3530a5cf9953154660bed2b45d9989e2f" Dec 05 12:58:15 crc kubenswrapper[4763]: E1205 12:58:15.588800 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ac7d5d5cf9e0464af6781685e4eaef3530a5cf9953154660bed2b45d9989e2f\": container with ID starting with 1ac7d5d5cf9e0464af6781685e4eaef3530a5cf9953154660bed2b45d9989e2f not found: ID does not exist" containerID="1ac7d5d5cf9e0464af6781685e4eaef3530a5cf9953154660bed2b45d9989e2f" Dec 05 12:58:15 crc kubenswrapper[4763]: I1205 12:58:15.588868 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ac7d5d5cf9e0464af6781685e4eaef3530a5cf9953154660bed2b45d9989e2f"} err="failed to get container status \"1ac7d5d5cf9e0464af6781685e4eaef3530a5cf9953154660bed2b45d9989e2f\": rpc error: code = NotFound desc = could not find container \"1ac7d5d5cf9e0464af6781685e4eaef3530a5cf9953154660bed2b45d9989e2f\": container with ID starting with 1ac7d5d5cf9e0464af6781685e4eaef3530a5cf9953154660bed2b45d9989e2f not found: ID does not exist" Dec 05 12:58:15 crc kubenswrapper[4763]: I1205 12:58:15.588895 4763 scope.go:117] "RemoveContainer" containerID="18f3622a29c422efa84ca7c4698e7d64f794bf75028431f24854b75b9646807d" Dec 05 12:58:15 crc kubenswrapper[4763]: E1205 12:58:15.589291 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18f3622a29c422efa84ca7c4698e7d64f794bf75028431f24854b75b9646807d\": container with ID starting with 18f3622a29c422efa84ca7c4698e7d64f794bf75028431f24854b75b9646807d not found: ID does not exist" containerID="18f3622a29c422efa84ca7c4698e7d64f794bf75028431f24854b75b9646807d" Dec 05 12:58:15 crc kubenswrapper[4763]: I1205 12:58:15.589333 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18f3622a29c422efa84ca7c4698e7d64f794bf75028431f24854b75b9646807d"} err="failed to get container status \"18f3622a29c422efa84ca7c4698e7d64f794bf75028431f24854b75b9646807d\": rpc error: code = NotFound desc = could not find container \"18f3622a29c422efa84ca7c4698e7d64f794bf75028431f24854b75b9646807d\": container with ID starting with 18f3622a29c422efa84ca7c4698e7d64f794bf75028431f24854b75b9646807d not found: ID does not exist" Dec 05 12:58:15 crc kubenswrapper[4763]: I1205 12:58:15.589358 4763 scope.go:117] "RemoveContainer" containerID="87a5f170f245fe6c5bc19d5a4c5eda1bd743802bd97cb413e033e5e5d15c4177" Dec 05 12:58:15 crc kubenswrapper[4763]: E1205 12:58:15.589690 4763 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"87a5f170f245fe6c5bc19d5a4c5eda1bd743802bd97cb413e033e5e5d15c4177\": container with ID starting with 87a5f170f245fe6c5bc19d5a4c5eda1bd743802bd97cb413e033e5e5d15c4177 not found: ID does not exist" containerID="87a5f170f245fe6c5bc19d5a4c5eda1bd743802bd97cb413e033e5e5d15c4177" Dec 05 12:58:15 crc kubenswrapper[4763]: I1205 12:58:15.589714 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87a5f170f245fe6c5bc19d5a4c5eda1bd743802bd97cb413e033e5e5d15c4177"} err="failed to get container status \"87a5f170f245fe6c5bc19d5a4c5eda1bd743802bd97cb413e033e5e5d15c4177\": rpc error: code = NotFound desc = could not find container \"87a5f170f245fe6c5bc19d5a4c5eda1bd743802bd97cb413e033e5e5d15c4177\": container with ID starting with 87a5f170f245fe6c5bc19d5a4c5eda1bd743802bd97cb413e033e5e5d15c4177 not found: ID does not exist" Dec 05 12:58:15 crc kubenswrapper[4763]: I1205 12:58:15.801958 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab269999-672c-4d37-8eb7-613306895437" path="/var/lib/kubelet/pods/ab269999-672c-4d37-8eb7-613306895437/volumes" Dec 05 12:58:37 crc kubenswrapper[4763]: I1205 12:58:37.544290 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 12:58:37 crc kubenswrapper[4763]: I1205 12:58:37.545023 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 12:58:37 crc kubenswrapper[4763]: I1205 12:58:37.545096 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" Dec 05 12:58:37 crc kubenswrapper[4763]: I1205 12:58:37.546212 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"619f3eb04487f360e70557a78183a4459c77bf53694a220d2941a47d3d7faea8"} pod="openshift-machine-config-operator/machine-config-daemon-xpgln" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 12:58:37 crc kubenswrapper[4763]: I1205 12:58:37.546316 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" containerID="cri-o://619f3eb04487f360e70557a78183a4459c77bf53694a220d2941a47d3d7faea8" gracePeriod=600 Dec 05 12:58:37 crc kubenswrapper[4763]: I1205 12:58:37.705294 4763 generic.go:334] "Generic (PLEG): container finished" podID="96338136-6831-49d0-9eb9-77d1205c6afb" containerID="619f3eb04487f360e70557a78183a4459c77bf53694a220d2941a47d3d7faea8" exitCode=0 Dec 05 12:58:37 crc kubenswrapper[4763]: I1205 12:58:37.705642 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" 
event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerDied","Data":"619f3eb04487f360e70557a78183a4459c77bf53694a220d2941a47d3d7faea8"} Dec 05 12:58:37 crc kubenswrapper[4763]: I1205 12:58:37.705677 4763 scope.go:117] "RemoveContainer" containerID="b336b650f4dcc0324cdd6db4334f8a03483c43b2443456c36708ae442924d667" Dec 05 12:58:38 crc kubenswrapper[4763]: I1205 12:58:38.722116 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerStarted","Data":"62e35b6bccff0ea60092714c9fbbca340c8f353206db0f49a2b69c7f3a1925d4"} Dec 05 13:00:00 crc kubenswrapper[4763]: I1205 13:00:00.193402 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415660-cfb8x"] Dec 05 13:00:00 crc kubenswrapper[4763]: E1205 13:00:00.194460 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab269999-672c-4d37-8eb7-613306895437" containerName="extract-utilities" Dec 05 13:00:00 crc kubenswrapper[4763]: I1205 13:00:00.194479 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab269999-672c-4d37-8eb7-613306895437" containerName="extract-utilities" Dec 05 13:00:00 crc kubenswrapper[4763]: E1205 13:00:00.194502 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab269999-672c-4d37-8eb7-613306895437" containerName="extract-content" Dec 05 13:00:00 crc kubenswrapper[4763]: I1205 13:00:00.194511 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab269999-672c-4d37-8eb7-613306895437" containerName="extract-content" Dec 05 13:00:00 crc kubenswrapper[4763]: E1205 13:00:00.194522 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab269999-672c-4d37-8eb7-613306895437" containerName="registry-server" Dec 05 13:00:00 crc kubenswrapper[4763]: I1205 13:00:00.194531 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab269999-672c-4d37-8eb7-613306895437" containerName="registry-server" Dec 05 13:00:00 crc kubenswrapper[4763]: I1205 13:00:00.194801 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab269999-672c-4d37-8eb7-613306895437" containerName="registry-server" Dec 05 13:00:00 crc kubenswrapper[4763]: I1205 13:00:00.195643 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415660-cfb8x" Dec 05 13:00:00 crc kubenswrapper[4763]: I1205 13:00:00.197527 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 13:00:00 crc kubenswrapper[4763]: I1205 13:00:00.197920 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 13:00:00 crc kubenswrapper[4763]: I1205 13:00:00.206560 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415660-cfb8x"] Dec 05 13:00:00 crc kubenswrapper[4763]: I1205 13:00:00.288618 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2e74603e-9d66-4cc7-aa0f-489fe472d37e-secret-volume\") pod \"collect-profiles-29415660-cfb8x\" (UID: \"2e74603e-9d66-4cc7-aa0f-489fe472d37e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415660-cfb8x" Dec 05 13:00:00 crc kubenswrapper[4763]: I1205 13:00:00.288905 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nbfl\" (UniqueName: \"kubernetes.io/projected/2e74603e-9d66-4cc7-aa0f-489fe472d37e-kube-api-access-2nbfl\") pod \"collect-profiles-29415660-cfb8x\" (UID: \"2e74603e-9d66-4cc7-aa0f-489fe472d37e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415660-cfb8x" Dec 05 13:00:00 crc kubenswrapper[4763]: I1205 13:00:00.289062 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2e74603e-9d66-4cc7-aa0f-489fe472d37e-config-volume\") pod \"collect-profiles-29415660-cfb8x\" (UID: \"2e74603e-9d66-4cc7-aa0f-489fe472d37e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415660-cfb8x" Dec 05 13:00:00 crc kubenswrapper[4763]: I1205 13:00:00.391470 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nbfl\" (UniqueName: \"kubernetes.io/projected/2e74603e-9d66-4cc7-aa0f-489fe472d37e-kube-api-access-2nbfl\") pod \"collect-profiles-29415660-cfb8x\" (UID: \"2e74603e-9d66-4cc7-aa0f-489fe472d37e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415660-cfb8x" Dec 05 13:00:00 crc kubenswrapper[4763]: I1205 13:00:00.391660 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2e74603e-9d66-4cc7-aa0f-489fe472d37e-config-volume\") pod \"collect-profiles-29415660-cfb8x\" (UID: \"2e74603e-9d66-4cc7-aa0f-489fe472d37e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415660-cfb8x" Dec 05 13:00:00 crc kubenswrapper[4763]: I1205 13:00:00.391832 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2e74603e-9d66-4cc7-aa0f-489fe472d37e-secret-volume\") pod \"collect-profiles-29415660-cfb8x\" (UID: \"2e74603e-9d66-4cc7-aa0f-489fe472d37e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415660-cfb8x" Dec 05 13:00:00 crc kubenswrapper[4763]: I1205 13:00:00.393062 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2e74603e-9d66-4cc7-aa0f-489fe472d37e-config-volume\") pod 
\"collect-profiles-29415660-cfb8x\" (UID: \"2e74603e-9d66-4cc7-aa0f-489fe472d37e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415660-cfb8x" Dec 05 13:00:00 crc kubenswrapper[4763]: I1205 13:00:00.399532 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2e74603e-9d66-4cc7-aa0f-489fe472d37e-secret-volume\") pod \"collect-profiles-29415660-cfb8x\" (UID: \"2e74603e-9d66-4cc7-aa0f-489fe472d37e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415660-cfb8x" Dec 05 13:00:00 crc kubenswrapper[4763]: I1205 13:00:00.412984 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nbfl\" (UniqueName: \"kubernetes.io/projected/2e74603e-9d66-4cc7-aa0f-489fe472d37e-kube-api-access-2nbfl\") pod \"collect-profiles-29415660-cfb8x\" (UID: \"2e74603e-9d66-4cc7-aa0f-489fe472d37e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415660-cfb8x" Dec 05 13:00:00 crc kubenswrapper[4763]: I1205 13:00:00.520917 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415660-cfb8x" Dec 05 13:00:04 crc kubenswrapper[4763]: I1205 13:00:00.998471 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415660-cfb8x"] Dec 05 13:00:04 crc kubenswrapper[4763]: I1205 13:00:01.604725 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415660-cfb8x" event={"ID":"2e74603e-9d66-4cc7-aa0f-489fe472d37e","Type":"ContainerStarted","Data":"115c8ed9ab929b41183e0de21d0ddceddd74316b6db85bd52c9fa6fa2ba57659"} Dec 05 13:00:04 crc kubenswrapper[4763]: I1205 13:00:03.625053 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415660-cfb8x" event={"ID":"2e74603e-9d66-4cc7-aa0f-489fe472d37e","Type":"ContainerStarted","Data":"bc82a81c86b108b3c13864123a76713b0249af6d8eb505daeecc62ec3288f9a9"} Dec 05 13:00:04 crc kubenswrapper[4763]: I1205 13:00:03.645145 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29415660-cfb8x" podStartSLOduration=3.645127152 podStartE2EDuration="3.645127152s" podCreationTimestamp="2025-12-05 13:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:00:03.642670256 +0000 UTC m=+4288.135384979" watchObservedRunningTime="2025-12-05 13:00:03.645127152 +0000 UTC m=+4288.137841875" Dec 05 13:00:04 crc kubenswrapper[4763]: I1205 13:00:04.634679 4763 generic.go:334] "Generic (PLEG): container finished" podID="2e74603e-9d66-4cc7-aa0f-489fe472d37e" containerID="bc82a81c86b108b3c13864123a76713b0249af6d8eb505daeecc62ec3288f9a9" exitCode=0 Dec 05 13:00:04 crc kubenswrapper[4763]: I1205 13:00:04.634837 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415660-cfb8x" event={"ID":"2e74603e-9d66-4cc7-aa0f-489fe472d37e","Type":"ContainerDied","Data":"bc82a81c86b108b3c13864123a76713b0249af6d8eb505daeecc62ec3288f9a9"} Dec 05 13:00:06 crc kubenswrapper[4763]: I1205 13:00:06.095604 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415660-cfb8x" Dec 05 13:00:06 crc kubenswrapper[4763]: I1205 13:00:06.201392 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2e74603e-9d66-4cc7-aa0f-489fe472d37e-config-volume\") pod \"2e74603e-9d66-4cc7-aa0f-489fe472d37e\" (UID: \"2e74603e-9d66-4cc7-aa0f-489fe472d37e\") " Dec 05 13:00:06 crc kubenswrapper[4763]: I1205 13:00:06.201952 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2e74603e-9d66-4cc7-aa0f-489fe472d37e-secret-volume\") pod \"2e74603e-9d66-4cc7-aa0f-489fe472d37e\" (UID: \"2e74603e-9d66-4cc7-aa0f-489fe472d37e\") " Dec 05 13:00:06 crc kubenswrapper[4763]: I1205 13:00:06.202027 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e74603e-9d66-4cc7-aa0f-489fe472d37e-config-volume" (OuterVolumeSpecName: "config-volume") pod "2e74603e-9d66-4cc7-aa0f-489fe472d37e" (UID: "2e74603e-9d66-4cc7-aa0f-489fe472d37e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:00:06 crc kubenswrapper[4763]: I1205 13:00:06.202050 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nbfl\" (UniqueName: \"kubernetes.io/projected/2e74603e-9d66-4cc7-aa0f-489fe472d37e-kube-api-access-2nbfl\") pod \"2e74603e-9d66-4cc7-aa0f-489fe472d37e\" (UID: \"2e74603e-9d66-4cc7-aa0f-489fe472d37e\") " Dec 05 13:00:06 crc kubenswrapper[4763]: I1205 13:00:06.202540 4763 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2e74603e-9d66-4cc7-aa0f-489fe472d37e-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 13:00:06 crc kubenswrapper[4763]: I1205 13:00:06.209349 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e74603e-9d66-4cc7-aa0f-489fe472d37e-kube-api-access-2nbfl" (OuterVolumeSpecName: "kube-api-access-2nbfl") pod "2e74603e-9d66-4cc7-aa0f-489fe472d37e" (UID: "2e74603e-9d66-4cc7-aa0f-489fe472d37e"). InnerVolumeSpecName "kube-api-access-2nbfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:00:06 crc kubenswrapper[4763]: I1205 13:00:06.214833 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e74603e-9d66-4cc7-aa0f-489fe472d37e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2e74603e-9d66-4cc7-aa0f-489fe472d37e" (UID: "2e74603e-9d66-4cc7-aa0f-489fe472d37e"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:00:06 crc kubenswrapper[4763]: I1205 13:00:06.304535 4763 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2e74603e-9d66-4cc7-aa0f-489fe472d37e-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 13:00:06 crc kubenswrapper[4763]: I1205 13:00:06.304573 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2nbfl\" (UniqueName: \"kubernetes.io/projected/2e74603e-9d66-4cc7-aa0f-489fe472d37e-kube-api-access-2nbfl\") on node \"crc\" DevicePath \"\"" Dec 05 13:00:06 crc kubenswrapper[4763]: I1205 13:00:06.660691 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415660-cfb8x" event={"ID":"2e74603e-9d66-4cc7-aa0f-489fe472d37e","Type":"ContainerDied","Data":"115c8ed9ab929b41183e0de21d0ddceddd74316b6db85bd52c9fa6fa2ba57659"} Dec 05 13:00:06 crc kubenswrapper[4763]: I1205 13:00:06.660739 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="115c8ed9ab929b41183e0de21d0ddceddd74316b6db85bd52c9fa6fa2ba57659" Dec 05 13:00:06 crc kubenswrapper[4763]: I1205 13:00:06.660793 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415660-cfb8x" Dec 05 13:00:06 crc kubenswrapper[4763]: I1205 13:00:06.725107 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415615-rh6tb"] Dec 05 13:00:06 crc kubenswrapper[4763]: I1205 13:00:06.737971 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415615-rh6tb"] Dec 05 13:00:07 crc kubenswrapper[4763]: I1205 13:00:07.799562 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b1906b4-d122-45a8-9527-42266fa59f7c" path="/var/lib/kubelet/pods/9b1906b4-d122-45a8-9527-42266fa59f7c/volumes" Dec 05 13:00:37 crc kubenswrapper[4763]: I1205 13:00:37.543869 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 13:00:37 crc kubenswrapper[4763]: I1205 13:00:37.544618 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 13:00:53 crc kubenswrapper[4763]: I1205 13:00:53.303360 4763 scope.go:117] "RemoveContainer" containerID="d11b14a9e0f61e87c20f232dc92de7635106a4b26d9029763740a89ddaa98086" Dec 05 13:01:00 crc kubenswrapper[4763]: I1205 13:01:00.171438 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29415661-qdvsb"] Dec 05 13:01:00 crc kubenswrapper[4763]: E1205 13:01:00.172668 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e74603e-9d66-4cc7-aa0f-489fe472d37e" containerName="collect-profiles" Dec 05 13:01:00 crc kubenswrapper[4763]: I1205 13:01:00.172686 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e74603e-9d66-4cc7-aa0f-489fe472d37e" containerName="collect-profiles" Dec 05 13:01:00 crc kubenswrapper[4763]: I1205 
13:01:00.172935 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e74603e-9d66-4cc7-aa0f-489fe472d37e" containerName="collect-profiles" Dec 05 13:01:00 crc kubenswrapper[4763]: I1205 13:01:00.174737 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29415661-qdvsb" Dec 05 13:01:00 crc kubenswrapper[4763]: I1205 13:01:00.187520 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29415661-qdvsb"] Dec 05 13:01:00 crc kubenswrapper[4763]: I1205 13:01:00.286993 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c1231bc-00fe-4fb3-9fb7-7121743e17c9-config-data\") pod \"keystone-cron-29415661-qdvsb\" (UID: \"9c1231bc-00fe-4fb3-9fb7-7121743e17c9\") " pod="openstack/keystone-cron-29415661-qdvsb" Dec 05 13:01:00 crc kubenswrapper[4763]: I1205 13:01:00.287057 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9c1231bc-00fe-4fb3-9fb7-7121743e17c9-fernet-keys\") pod \"keystone-cron-29415661-qdvsb\" (UID: \"9c1231bc-00fe-4fb3-9fb7-7121743e17c9\") " pod="openstack/keystone-cron-29415661-qdvsb" Dec 05 13:01:00 crc kubenswrapper[4763]: I1205 13:01:00.287091 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c1231bc-00fe-4fb3-9fb7-7121743e17c9-combined-ca-bundle\") pod \"keystone-cron-29415661-qdvsb\" (UID: \"9c1231bc-00fe-4fb3-9fb7-7121743e17c9\") " pod="openstack/keystone-cron-29415661-qdvsb" Dec 05 13:01:00 crc kubenswrapper[4763]: I1205 13:01:00.288073 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvlgw\" (UniqueName: \"kubernetes.io/projected/9c1231bc-00fe-4fb3-9fb7-7121743e17c9-kube-api-access-cvlgw\") pod \"keystone-cron-29415661-qdvsb\" (UID: \"9c1231bc-00fe-4fb3-9fb7-7121743e17c9\") " pod="openstack/keystone-cron-29415661-qdvsb" Dec 05 13:01:00 crc kubenswrapper[4763]: I1205 13:01:00.389985 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvlgw\" (UniqueName: \"kubernetes.io/projected/9c1231bc-00fe-4fb3-9fb7-7121743e17c9-kube-api-access-cvlgw\") pod \"keystone-cron-29415661-qdvsb\" (UID: \"9c1231bc-00fe-4fb3-9fb7-7121743e17c9\") " pod="openstack/keystone-cron-29415661-qdvsb" Dec 05 13:01:00 crc kubenswrapper[4763]: I1205 13:01:00.390053 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c1231bc-00fe-4fb3-9fb7-7121743e17c9-config-data\") pod \"keystone-cron-29415661-qdvsb\" (UID: \"9c1231bc-00fe-4fb3-9fb7-7121743e17c9\") " pod="openstack/keystone-cron-29415661-qdvsb" Dec 05 13:01:00 crc kubenswrapper[4763]: I1205 13:01:00.390094 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9c1231bc-00fe-4fb3-9fb7-7121743e17c9-fernet-keys\") pod \"keystone-cron-29415661-qdvsb\" (UID: \"9c1231bc-00fe-4fb3-9fb7-7121743e17c9\") " pod="openstack/keystone-cron-29415661-qdvsb" Dec 05 13:01:00 crc kubenswrapper[4763]: I1205 13:01:00.390123 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9c1231bc-00fe-4fb3-9fb7-7121743e17c9-combined-ca-bundle\") pod \"keystone-cron-29415661-qdvsb\" (UID: \"9c1231bc-00fe-4fb3-9fb7-7121743e17c9\") " pod="openstack/keystone-cron-29415661-qdvsb" Dec 05 13:01:00 crc kubenswrapper[4763]: I1205 13:01:00.396620 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9c1231bc-00fe-4fb3-9fb7-7121743e17c9-fernet-keys\") pod \"keystone-cron-29415661-qdvsb\" (UID: \"9c1231bc-00fe-4fb3-9fb7-7121743e17c9\") " pod="openstack/keystone-cron-29415661-qdvsb" Dec 05 13:01:00 crc kubenswrapper[4763]: I1205 13:01:00.398523 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c1231bc-00fe-4fb3-9fb7-7121743e17c9-config-data\") pod \"keystone-cron-29415661-qdvsb\" (UID: \"9c1231bc-00fe-4fb3-9fb7-7121743e17c9\") " pod="openstack/keystone-cron-29415661-qdvsb" Dec 05 13:01:00 crc kubenswrapper[4763]: I1205 13:01:00.410655 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c1231bc-00fe-4fb3-9fb7-7121743e17c9-combined-ca-bundle\") pod \"keystone-cron-29415661-qdvsb\" (UID: \"9c1231bc-00fe-4fb3-9fb7-7121743e17c9\") " pod="openstack/keystone-cron-29415661-qdvsb" Dec 05 13:01:00 crc kubenswrapper[4763]: I1205 13:01:00.430959 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvlgw\" (UniqueName: \"kubernetes.io/projected/9c1231bc-00fe-4fb3-9fb7-7121743e17c9-kube-api-access-cvlgw\") pod \"keystone-cron-29415661-qdvsb\" (UID: \"9c1231bc-00fe-4fb3-9fb7-7121743e17c9\") " pod="openstack/keystone-cron-29415661-qdvsb" Dec 05 13:01:00 crc kubenswrapper[4763]: I1205 13:01:00.497713 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29415661-qdvsb" Dec 05 13:01:00 crc kubenswrapper[4763]: I1205 13:01:00.950511 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29415661-qdvsb"] Dec 05 13:01:00 crc kubenswrapper[4763]: W1205 13:01:00.959688 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c1231bc_00fe_4fb3_9fb7_7121743e17c9.slice/crio-8893211bf7b3fa3bea6ba4bc46637baa16e5add54f1fc271d4e0dfbe08632272 WatchSource:0}: Error finding container 8893211bf7b3fa3bea6ba4bc46637baa16e5add54f1fc271d4e0dfbe08632272: Status 404 returned error can't find the container with id 8893211bf7b3fa3bea6ba4bc46637baa16e5add54f1fc271d4e0dfbe08632272 Dec 05 13:01:01 crc kubenswrapper[4763]: I1205 13:01:01.225684 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29415661-qdvsb" event={"ID":"9c1231bc-00fe-4fb3-9fb7-7121743e17c9","Type":"ContainerStarted","Data":"d0801574008811481ae7373a0af7a8e6892e6c61d50ae41c1abf8c222270c1f6"} Dec 05 13:01:01 crc kubenswrapper[4763]: I1205 13:01:01.226031 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29415661-qdvsb" event={"ID":"9c1231bc-00fe-4fb3-9fb7-7121743e17c9","Type":"ContainerStarted","Data":"8893211bf7b3fa3bea6ba4bc46637baa16e5add54f1fc271d4e0dfbe08632272"} Dec 05 13:01:01 crc kubenswrapper[4763]: I1205 13:01:01.253843 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29415661-qdvsb" podStartSLOduration=1.253818748 podStartE2EDuration="1.253818748s" podCreationTimestamp="2025-12-05 13:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:01:01.24610924 +0000 UTC m=+4345.738823993" watchObservedRunningTime="2025-12-05 13:01:01.253818748 +0000 UTC m=+4345.746533491" Dec 05 13:01:04 crc kubenswrapper[4763]: I1205 13:01:04.261558 4763 generic.go:334] "Generic (PLEG): container finished" podID="9c1231bc-00fe-4fb3-9fb7-7121743e17c9" containerID="d0801574008811481ae7373a0af7a8e6892e6c61d50ae41c1abf8c222270c1f6" exitCode=0 Dec 05 13:01:04 crc kubenswrapper[4763]: I1205 13:01:04.261635 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29415661-qdvsb" event={"ID":"9c1231bc-00fe-4fb3-9fb7-7121743e17c9","Type":"ContainerDied","Data":"d0801574008811481ae7373a0af7a8e6892e6c61d50ae41c1abf8c222270c1f6"} Dec 05 13:01:05 crc kubenswrapper[4763]: I1205 13:01:05.618183 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29415661-qdvsb" Dec 05 13:01:05 crc kubenswrapper[4763]: I1205 13:01:05.801309 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvlgw\" (UniqueName: \"kubernetes.io/projected/9c1231bc-00fe-4fb3-9fb7-7121743e17c9-kube-api-access-cvlgw\") pod \"9c1231bc-00fe-4fb3-9fb7-7121743e17c9\" (UID: \"9c1231bc-00fe-4fb3-9fb7-7121743e17c9\") " Dec 05 13:01:05 crc kubenswrapper[4763]: I1205 13:01:05.801965 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9c1231bc-00fe-4fb3-9fb7-7121743e17c9-fernet-keys\") pod \"9c1231bc-00fe-4fb3-9fb7-7121743e17c9\" (UID: \"9c1231bc-00fe-4fb3-9fb7-7121743e17c9\") " Dec 05 13:01:05 crc kubenswrapper[4763]: I1205 13:01:05.802081 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c1231bc-00fe-4fb3-9fb7-7121743e17c9-combined-ca-bundle\") pod \"9c1231bc-00fe-4fb3-9fb7-7121743e17c9\" (UID: \"9c1231bc-00fe-4fb3-9fb7-7121743e17c9\") " Dec 05 13:01:05 crc kubenswrapper[4763]: I1205 13:01:05.802155 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c1231bc-00fe-4fb3-9fb7-7121743e17c9-config-data\") pod \"9c1231bc-00fe-4fb3-9fb7-7121743e17c9\" (UID: \"9c1231bc-00fe-4fb3-9fb7-7121743e17c9\") " Dec 05 13:01:05 crc kubenswrapper[4763]: I1205 13:01:05.807237 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c1231bc-00fe-4fb3-9fb7-7121743e17c9-kube-api-access-cvlgw" (OuterVolumeSpecName: "kube-api-access-cvlgw") pod "9c1231bc-00fe-4fb3-9fb7-7121743e17c9" (UID: "9c1231bc-00fe-4fb3-9fb7-7121743e17c9"). InnerVolumeSpecName "kube-api-access-cvlgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:01:05 crc kubenswrapper[4763]: I1205 13:01:05.808601 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c1231bc-00fe-4fb3-9fb7-7121743e17c9-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9c1231bc-00fe-4fb3-9fb7-7121743e17c9" (UID: "9c1231bc-00fe-4fb3-9fb7-7121743e17c9"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:01:05 crc kubenswrapper[4763]: I1205 13:01:05.834283 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c1231bc-00fe-4fb3-9fb7-7121743e17c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c1231bc-00fe-4fb3-9fb7-7121743e17c9" (UID: "9c1231bc-00fe-4fb3-9fb7-7121743e17c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:01:05 crc kubenswrapper[4763]: I1205 13:01:05.865726 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c1231bc-00fe-4fb3-9fb7-7121743e17c9-config-data" (OuterVolumeSpecName: "config-data") pod "9c1231bc-00fe-4fb3-9fb7-7121743e17c9" (UID: "9c1231bc-00fe-4fb3-9fb7-7121743e17c9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:01:05 crc kubenswrapper[4763]: I1205 13:01:05.905159 4763 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9c1231bc-00fe-4fb3-9fb7-7121743e17c9-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 05 13:01:05 crc kubenswrapper[4763]: I1205 13:01:05.905198 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c1231bc-00fe-4fb3-9fb7-7121743e17c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 13:01:05 crc kubenswrapper[4763]: I1205 13:01:05.905211 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c1231bc-00fe-4fb3-9fb7-7121743e17c9-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 13:01:05 crc kubenswrapper[4763]: I1205 13:01:05.905221 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvlgw\" (UniqueName: \"kubernetes.io/projected/9c1231bc-00fe-4fb3-9fb7-7121743e17c9-kube-api-access-cvlgw\") on node \"crc\" DevicePath \"\"" Dec 05 13:01:06 crc kubenswrapper[4763]: I1205 13:01:06.285672 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29415661-qdvsb" event={"ID":"9c1231bc-00fe-4fb3-9fb7-7121743e17c9","Type":"ContainerDied","Data":"8893211bf7b3fa3bea6ba4bc46637baa16e5add54f1fc271d4e0dfbe08632272"} Dec 05 13:01:06 crc kubenswrapper[4763]: I1205 13:01:06.285718 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8893211bf7b3fa3bea6ba4bc46637baa16e5add54f1fc271d4e0dfbe08632272" Dec 05 13:01:06 crc kubenswrapper[4763]: I1205 13:01:06.285728 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29415661-qdvsb" Dec 05 13:01:07 crc kubenswrapper[4763]: I1205 13:01:07.544936 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 13:01:07 crc kubenswrapper[4763]: I1205 13:01:07.545312 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 13:01:37 crc kubenswrapper[4763]: I1205 13:01:37.543872 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 13:01:37 crc kubenswrapper[4763]: I1205 13:01:37.544496 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 13:01:37 crc kubenswrapper[4763]: I1205 13:01:37.544545 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" Dec 05 
13:01:37 crc kubenswrapper[4763]: I1205 13:01:37.545376 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"62e35b6bccff0ea60092714c9fbbca340c8f353206db0f49a2b69c7f3a1925d4"} pod="openshift-machine-config-operator/machine-config-daemon-xpgln" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 13:01:37 crc kubenswrapper[4763]: I1205 13:01:37.545437 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" containerID="cri-o://62e35b6bccff0ea60092714c9fbbca340c8f353206db0f49a2b69c7f3a1925d4" gracePeriod=600 Dec 05 13:01:37 crc kubenswrapper[4763]: E1205 13:01:37.674269 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:01:38 crc kubenswrapper[4763]: I1205 13:01:38.599820 4763 generic.go:334] "Generic (PLEG): container finished" podID="96338136-6831-49d0-9eb9-77d1205c6afb" containerID="62e35b6bccff0ea60092714c9fbbca340c8f353206db0f49a2b69c7f3a1925d4" exitCode=0 Dec 05 13:01:38 crc kubenswrapper[4763]: I1205 13:01:38.599942 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerDied","Data":"62e35b6bccff0ea60092714c9fbbca340c8f353206db0f49a2b69c7f3a1925d4"} Dec 05 13:01:38 crc kubenswrapper[4763]: I1205 13:01:38.600188 4763 scope.go:117] "RemoveContainer" containerID="619f3eb04487f360e70557a78183a4459c77bf53694a220d2941a47d3d7faea8" Dec 05 13:01:38 crc kubenswrapper[4763]: I1205 13:01:38.600659 4763 scope.go:117] "RemoveContainer" containerID="62e35b6bccff0ea60092714c9fbbca340c8f353206db0f49a2b69c7f3a1925d4" Dec 05 13:01:38 crc kubenswrapper[4763]: E1205 13:01:38.601052 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:01:53 crc kubenswrapper[4763]: I1205 13:01:53.784513 4763 scope.go:117] "RemoveContainer" containerID="62e35b6bccff0ea60092714c9fbbca340c8f353206db0f49a2b69c7f3a1925d4" Dec 05 13:01:53 crc kubenswrapper[4763]: E1205 13:01:53.785190 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:02:04 crc kubenswrapper[4763]: I1205 13:02:04.785268 4763 scope.go:117] "RemoveContainer" 
containerID="62e35b6bccff0ea60092714c9fbbca340c8f353206db0f49a2b69c7f3a1925d4" Dec 05 13:02:04 crc kubenswrapper[4763]: E1205 13:02:04.786323 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:02:16 crc kubenswrapper[4763]: I1205 13:02:16.784414 4763 scope.go:117] "RemoveContainer" containerID="62e35b6bccff0ea60092714c9fbbca340c8f353206db0f49a2b69c7f3a1925d4" Dec 05 13:02:16 crc kubenswrapper[4763]: E1205 13:02:16.785201 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:02:30 crc kubenswrapper[4763]: I1205 13:02:30.783872 4763 scope.go:117] "RemoveContainer" containerID="62e35b6bccff0ea60092714c9fbbca340c8f353206db0f49a2b69c7f3a1925d4" Dec 05 13:02:30 crc kubenswrapper[4763]: E1205 13:02:30.784681 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:02:44 crc kubenswrapper[4763]: I1205 13:02:44.784585 4763 scope.go:117] "RemoveContainer" containerID="62e35b6bccff0ea60092714c9fbbca340c8f353206db0f49a2b69c7f3a1925d4" Dec 05 13:02:44 crc kubenswrapper[4763]: E1205 13:02:44.785291 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:02:55 crc kubenswrapper[4763]: I1205 13:02:55.790285 4763 scope.go:117] "RemoveContainer" containerID="62e35b6bccff0ea60092714c9fbbca340c8f353206db0f49a2b69c7f3a1925d4" Dec 05 13:02:55 crc kubenswrapper[4763]: E1205 13:02:55.791053 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:03:10 crc kubenswrapper[4763]: I1205 13:03:10.784585 4763 scope.go:117] "RemoveContainer" containerID="62e35b6bccff0ea60092714c9fbbca340c8f353206db0f49a2b69c7f3a1925d4" Dec 05 13:03:10 crc kubenswrapper[4763]: E1205 13:03:10.785793 4763 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:03:23 crc kubenswrapper[4763]: I1205 13:03:23.784420 4763 scope.go:117] "RemoveContainer" containerID="62e35b6bccff0ea60092714c9fbbca340c8f353206db0f49a2b69c7f3a1925d4" Dec 05 13:03:23 crc kubenswrapper[4763]: E1205 13:03:23.785274 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:03:35 crc kubenswrapper[4763]: I1205 13:03:35.784674 4763 scope.go:117] "RemoveContainer" containerID="62e35b6bccff0ea60092714c9fbbca340c8f353206db0f49a2b69c7f3a1925d4" Dec 05 13:03:35 crc kubenswrapper[4763]: E1205 13:03:35.786167 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:03:47 crc kubenswrapper[4763]: I1205 13:03:47.785243 4763 scope.go:117] "RemoveContainer" containerID="62e35b6bccff0ea60092714c9fbbca340c8f353206db0f49a2b69c7f3a1925d4" Dec 05 13:03:47 crc kubenswrapper[4763]: E1205 13:03:47.786215 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:04:00 crc kubenswrapper[4763]: I1205 13:04:00.784302 4763 scope.go:117] "RemoveContainer" containerID="62e35b6bccff0ea60092714c9fbbca340c8f353206db0f49a2b69c7f3a1925d4" Dec 05 13:04:00 crc kubenswrapper[4763]: E1205 13:04:00.785091 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:04:11 crc kubenswrapper[4763]: I1205 13:04:11.784442 4763 scope.go:117] "RemoveContainer" containerID="62e35b6bccff0ea60092714c9fbbca340c8f353206db0f49a2b69c7f3a1925d4" Dec 05 13:04:11 crc kubenswrapper[4763]: E1205 13:04:11.785361 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:04:21 crc kubenswrapper[4763]: I1205 13:04:21.996006 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9xncq"] Dec 05 13:04:21 crc kubenswrapper[4763]: E1205 13:04:21.999779 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c1231bc-00fe-4fb3-9fb7-7121743e17c9" containerName="keystone-cron" Dec 05 13:04:21 crc kubenswrapper[4763]: I1205 13:04:21.999811 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c1231bc-00fe-4fb3-9fb7-7121743e17c9" containerName="keystone-cron" Dec 05 13:04:22 crc kubenswrapper[4763]: I1205 13:04:22.000051 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c1231bc-00fe-4fb3-9fb7-7121743e17c9" containerName="keystone-cron" Dec 05 13:04:22 crc kubenswrapper[4763]: I1205 13:04:22.002003 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9xncq" Dec 05 13:04:22 crc kubenswrapper[4763]: I1205 13:04:22.053154 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9xncq"] Dec 05 13:04:22 crc kubenswrapper[4763]: I1205 13:04:22.161887 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a144ffe-c3e4-4a24-8e3a-3aa866ed4280-catalog-content\") pod \"redhat-operators-9xncq\" (UID: \"2a144ffe-c3e4-4a24-8e3a-3aa866ed4280\") " pod="openshift-marketplace/redhat-operators-9xncq" Dec 05 13:04:22 crc kubenswrapper[4763]: I1205 13:04:22.162024 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thvnh\" (UniqueName: \"kubernetes.io/projected/2a144ffe-c3e4-4a24-8e3a-3aa866ed4280-kube-api-access-thvnh\") pod \"redhat-operators-9xncq\" (UID: \"2a144ffe-c3e4-4a24-8e3a-3aa866ed4280\") " pod="openshift-marketplace/redhat-operators-9xncq" Dec 05 13:04:22 crc kubenswrapper[4763]: I1205 13:04:22.162088 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a144ffe-c3e4-4a24-8e3a-3aa866ed4280-utilities\") pod \"redhat-operators-9xncq\" (UID: \"2a144ffe-c3e4-4a24-8e3a-3aa866ed4280\") " pod="openshift-marketplace/redhat-operators-9xncq" Dec 05 13:04:22 crc kubenswrapper[4763]: I1205 13:04:22.263913 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thvnh\" (UniqueName: \"kubernetes.io/projected/2a144ffe-c3e4-4a24-8e3a-3aa866ed4280-kube-api-access-thvnh\") pod \"redhat-operators-9xncq\" (UID: \"2a144ffe-c3e4-4a24-8e3a-3aa866ed4280\") " pod="openshift-marketplace/redhat-operators-9xncq" Dec 05 13:04:22 crc kubenswrapper[4763]: I1205 13:04:22.264231 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a144ffe-c3e4-4a24-8e3a-3aa866ed4280-utilities\") pod \"redhat-operators-9xncq\" (UID: \"2a144ffe-c3e4-4a24-8e3a-3aa866ed4280\") " pod="openshift-marketplace/redhat-operators-9xncq" Dec 05 13:04:22 crc kubenswrapper[4763]: I1205 13:04:22.264284 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2a144ffe-c3e4-4a24-8e3a-3aa866ed4280-catalog-content\") pod \"redhat-operators-9xncq\" (UID: \"2a144ffe-c3e4-4a24-8e3a-3aa866ed4280\") " pod="openshift-marketplace/redhat-operators-9xncq" Dec 05 13:04:22 crc kubenswrapper[4763]: I1205 13:04:22.264736 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a144ffe-c3e4-4a24-8e3a-3aa866ed4280-catalog-content\") pod \"redhat-operators-9xncq\" (UID: \"2a144ffe-c3e4-4a24-8e3a-3aa866ed4280\") " pod="openshift-marketplace/redhat-operators-9xncq" Dec 05 13:04:22 crc kubenswrapper[4763]: I1205 13:04:22.264985 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a144ffe-c3e4-4a24-8e3a-3aa866ed4280-utilities\") pod \"redhat-operators-9xncq\" (UID: \"2a144ffe-c3e4-4a24-8e3a-3aa866ed4280\") " pod="openshift-marketplace/redhat-operators-9xncq" Dec 05 13:04:22 crc kubenswrapper[4763]: I1205 13:04:22.287154 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thvnh\" (UniqueName: \"kubernetes.io/projected/2a144ffe-c3e4-4a24-8e3a-3aa866ed4280-kube-api-access-thvnh\") pod \"redhat-operators-9xncq\" (UID: \"2a144ffe-c3e4-4a24-8e3a-3aa866ed4280\") " pod="openshift-marketplace/redhat-operators-9xncq" Dec 05 13:04:22 crc kubenswrapper[4763]: I1205 13:04:22.328251 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9xncq" Dec 05 13:04:22 crc kubenswrapper[4763]: I1205 13:04:22.818512 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9xncq"] Dec 05 13:04:23 crc kubenswrapper[4763]: I1205 13:04:23.781870 4763 generic.go:334] "Generic (PLEG): container finished" podID="2a144ffe-c3e4-4a24-8e3a-3aa866ed4280" containerID="ef38374d38322fccfd4c7a1316202ddb2a4b71a4feee12644e682f8d8cf1d777" exitCode=0 Dec 05 13:04:23 crc kubenswrapper[4763]: I1205 13:04:23.782200 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9xncq" event={"ID":"2a144ffe-c3e4-4a24-8e3a-3aa866ed4280","Type":"ContainerDied","Data":"ef38374d38322fccfd4c7a1316202ddb2a4b71a4feee12644e682f8d8cf1d777"} Dec 05 13:04:23 crc kubenswrapper[4763]: I1205 13:04:23.782238 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9xncq" event={"ID":"2a144ffe-c3e4-4a24-8e3a-3aa866ed4280","Type":"ContainerStarted","Data":"055fac2aa8a7c04a7b044ccf7e2ddbba4ce42811fc1a34625cfd8e3995364c68"} Dec 05 13:04:23 crc kubenswrapper[4763]: I1205 13:04:23.784611 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 13:04:24 crc kubenswrapper[4763]: I1205 13:04:24.784559 4763 scope.go:117] "RemoveContainer" containerID="62e35b6bccff0ea60092714c9fbbca340c8f353206db0f49a2b69c7f3a1925d4" Dec 05 13:04:24 crc kubenswrapper[4763]: E1205 13:04:24.785554 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:04:26 crc kubenswrapper[4763]: I1205 13:04:26.817316 4763 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9xncq" event={"ID":"2a144ffe-c3e4-4a24-8e3a-3aa866ed4280","Type":"ContainerStarted","Data":"4a68eda61c94ef8852897d41c1968c19180a15fe970a7d06c2686649b481f406"} Dec 05 13:04:28 crc kubenswrapper[4763]: I1205 13:04:28.841501 4763 generic.go:334] "Generic (PLEG): container finished" podID="2a144ffe-c3e4-4a24-8e3a-3aa866ed4280" containerID="4a68eda61c94ef8852897d41c1968c19180a15fe970a7d06c2686649b481f406" exitCode=0 Dec 05 13:04:28 crc kubenswrapper[4763]: I1205 13:04:28.841567 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9xncq" event={"ID":"2a144ffe-c3e4-4a24-8e3a-3aa866ed4280","Type":"ContainerDied","Data":"4a68eda61c94ef8852897d41c1968c19180a15fe970a7d06c2686649b481f406"} Dec 05 13:04:31 crc kubenswrapper[4763]: I1205 13:04:31.881227 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9xncq" event={"ID":"2a144ffe-c3e4-4a24-8e3a-3aa866ed4280","Type":"ContainerStarted","Data":"8abe724b6ba264ab0aa01129fef8b7a613ff8cabb46bfa70a39a44c45b968953"} Dec 05 13:04:31 crc kubenswrapper[4763]: I1205 13:04:31.917954 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9xncq" podStartSLOduration=4.036613694 podStartE2EDuration="10.917918318s" podCreationTimestamp="2025-12-05 13:04:21 +0000 UTC" firstStartedPulling="2025-12-05 13:04:23.784335997 +0000 UTC m=+4548.277050720" lastFinishedPulling="2025-12-05 13:04:30.665640621 +0000 UTC m=+4555.158355344" observedRunningTime="2025-12-05 13:04:31.900425917 +0000 UTC m=+4556.393140680" watchObservedRunningTime="2025-12-05 13:04:31.917918318 +0000 UTC m=+4556.410633091" Dec 05 13:04:32 crc kubenswrapper[4763]: I1205 13:04:32.329197 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9xncq" Dec 05 13:04:32 crc kubenswrapper[4763]: I1205 13:04:32.329259 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9xncq" Dec 05 13:04:33 crc kubenswrapper[4763]: I1205 13:04:33.375039 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9xncq" podUID="2a144ffe-c3e4-4a24-8e3a-3aa866ed4280" containerName="registry-server" probeResult="failure" output=< Dec 05 13:04:33 crc kubenswrapper[4763]: timeout: failed to connect service ":50051" within 1s Dec 05 13:04:33 crc kubenswrapper[4763]: > Dec 05 13:04:35 crc kubenswrapper[4763]: I1205 13:04:35.809926 4763 scope.go:117] "RemoveContainer" containerID="62e35b6bccff0ea60092714c9fbbca340c8f353206db0f49a2b69c7f3a1925d4" Dec 05 13:04:35 crc kubenswrapper[4763]: E1205 13:04:35.811849 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:04:42 crc kubenswrapper[4763]: I1205 13:04:42.389009 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9xncq" Dec 05 13:04:42 crc kubenswrapper[4763]: I1205 13:04:42.437855 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-9xncq" Dec 05 13:04:42 crc kubenswrapper[4763]: I1205 13:04:42.623647 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9xncq"] Dec 05 13:04:44 crc kubenswrapper[4763]: I1205 13:04:44.000529 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9xncq" podUID="2a144ffe-c3e4-4a24-8e3a-3aa866ed4280" containerName="registry-server" containerID="cri-o://8abe724b6ba264ab0aa01129fef8b7a613ff8cabb46bfa70a39a44c45b968953" gracePeriod=2 Dec 05 13:04:44 crc kubenswrapper[4763]: I1205 13:04:44.539500 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9xncq" Dec 05 13:04:44 crc kubenswrapper[4763]: I1205 13:04:44.689497 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thvnh\" (UniqueName: \"kubernetes.io/projected/2a144ffe-c3e4-4a24-8e3a-3aa866ed4280-kube-api-access-thvnh\") pod \"2a144ffe-c3e4-4a24-8e3a-3aa866ed4280\" (UID: \"2a144ffe-c3e4-4a24-8e3a-3aa866ed4280\") " Dec 05 13:04:44 crc kubenswrapper[4763]: I1205 13:04:44.689558 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a144ffe-c3e4-4a24-8e3a-3aa866ed4280-utilities\") pod \"2a144ffe-c3e4-4a24-8e3a-3aa866ed4280\" (UID: \"2a144ffe-c3e4-4a24-8e3a-3aa866ed4280\") " Dec 05 13:04:44 crc kubenswrapper[4763]: I1205 13:04:44.689594 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a144ffe-c3e4-4a24-8e3a-3aa866ed4280-catalog-content\") pod \"2a144ffe-c3e4-4a24-8e3a-3aa866ed4280\" (UID: \"2a144ffe-c3e4-4a24-8e3a-3aa866ed4280\") " Dec 05 13:04:44 crc kubenswrapper[4763]: I1205 13:04:44.691716 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a144ffe-c3e4-4a24-8e3a-3aa866ed4280-utilities" (OuterVolumeSpecName: "utilities") pod "2a144ffe-c3e4-4a24-8e3a-3aa866ed4280" (UID: "2a144ffe-c3e4-4a24-8e3a-3aa866ed4280"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:04:44 crc kubenswrapper[4763]: I1205 13:04:44.697043 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a144ffe-c3e4-4a24-8e3a-3aa866ed4280-kube-api-access-thvnh" (OuterVolumeSpecName: "kube-api-access-thvnh") pod "2a144ffe-c3e4-4a24-8e3a-3aa866ed4280" (UID: "2a144ffe-c3e4-4a24-8e3a-3aa866ed4280"). InnerVolumeSpecName "kube-api-access-thvnh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:04:44 crc kubenswrapper[4763]: I1205 13:04:44.791822 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thvnh\" (UniqueName: \"kubernetes.io/projected/2a144ffe-c3e4-4a24-8e3a-3aa866ed4280-kube-api-access-thvnh\") on node \"crc\" DevicePath \"\"" Dec 05 13:04:44 crc kubenswrapper[4763]: I1205 13:04:44.791854 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a144ffe-c3e4-4a24-8e3a-3aa866ed4280-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 13:04:44 crc kubenswrapper[4763]: I1205 13:04:44.805691 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a144ffe-c3e4-4a24-8e3a-3aa866ed4280-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2a144ffe-c3e4-4a24-8e3a-3aa866ed4280" (UID: "2a144ffe-c3e4-4a24-8e3a-3aa866ed4280"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:04:44 crc kubenswrapper[4763]: I1205 13:04:44.894203 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a144ffe-c3e4-4a24-8e3a-3aa866ed4280-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 13:04:45 crc kubenswrapper[4763]: I1205 13:04:45.012780 4763 generic.go:334] "Generic (PLEG): container finished" podID="2a144ffe-c3e4-4a24-8e3a-3aa866ed4280" containerID="8abe724b6ba264ab0aa01129fef8b7a613ff8cabb46bfa70a39a44c45b968953" exitCode=0 Dec 05 13:04:45 crc kubenswrapper[4763]: I1205 13:04:45.012841 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9xncq" event={"ID":"2a144ffe-c3e4-4a24-8e3a-3aa866ed4280","Type":"ContainerDied","Data":"8abe724b6ba264ab0aa01129fef8b7a613ff8cabb46bfa70a39a44c45b968953"} Dec 05 13:04:45 crc kubenswrapper[4763]: I1205 13:04:45.013746 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9xncq" event={"ID":"2a144ffe-c3e4-4a24-8e3a-3aa866ed4280","Type":"ContainerDied","Data":"055fac2aa8a7c04a7b044ccf7e2ddbba4ce42811fc1a34625cfd8e3995364c68"} Dec 05 13:04:45 crc kubenswrapper[4763]: I1205 13:04:45.012881 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9xncq" Dec 05 13:04:45 crc kubenswrapper[4763]: I1205 13:04:45.013793 4763 scope.go:117] "RemoveContainer" containerID="8abe724b6ba264ab0aa01129fef8b7a613ff8cabb46bfa70a39a44c45b968953" Dec 05 13:04:45 crc kubenswrapper[4763]: I1205 13:04:45.056095 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9xncq"] Dec 05 13:04:45 crc kubenswrapper[4763]: I1205 13:04:45.060415 4763 scope.go:117] "RemoveContainer" containerID="4a68eda61c94ef8852897d41c1968c19180a15fe970a7d06c2686649b481f406" Dec 05 13:04:45 crc kubenswrapper[4763]: I1205 13:04:45.067669 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9xncq"] Dec 05 13:04:45 crc kubenswrapper[4763]: I1205 13:04:45.089035 4763 scope.go:117] "RemoveContainer" containerID="ef38374d38322fccfd4c7a1316202ddb2a4b71a4feee12644e682f8d8cf1d777" Dec 05 13:04:45 crc kubenswrapper[4763]: I1205 13:04:45.135672 4763 scope.go:117] "RemoveContainer" containerID="8abe724b6ba264ab0aa01129fef8b7a613ff8cabb46bfa70a39a44c45b968953" Dec 05 13:04:45 crc kubenswrapper[4763]: E1205 13:04:45.136448 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8abe724b6ba264ab0aa01129fef8b7a613ff8cabb46bfa70a39a44c45b968953\": container with ID starting with 8abe724b6ba264ab0aa01129fef8b7a613ff8cabb46bfa70a39a44c45b968953 not found: ID does not exist" containerID="8abe724b6ba264ab0aa01129fef8b7a613ff8cabb46bfa70a39a44c45b968953" Dec 05 13:04:45 crc kubenswrapper[4763]: I1205 13:04:45.136522 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8abe724b6ba264ab0aa01129fef8b7a613ff8cabb46bfa70a39a44c45b968953"} err="failed to get container status \"8abe724b6ba264ab0aa01129fef8b7a613ff8cabb46bfa70a39a44c45b968953\": rpc error: code = NotFound desc = could not find container \"8abe724b6ba264ab0aa01129fef8b7a613ff8cabb46bfa70a39a44c45b968953\": container with ID starting with 8abe724b6ba264ab0aa01129fef8b7a613ff8cabb46bfa70a39a44c45b968953 not found: ID does not exist" Dec 05 13:04:45 crc kubenswrapper[4763]: I1205 13:04:45.136579 4763 scope.go:117] "RemoveContainer" containerID="4a68eda61c94ef8852897d41c1968c19180a15fe970a7d06c2686649b481f406" Dec 05 13:04:45 crc kubenswrapper[4763]: E1205 13:04:45.137201 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a68eda61c94ef8852897d41c1968c19180a15fe970a7d06c2686649b481f406\": container with ID starting with 4a68eda61c94ef8852897d41c1968c19180a15fe970a7d06c2686649b481f406 not found: ID does not exist" containerID="4a68eda61c94ef8852897d41c1968c19180a15fe970a7d06c2686649b481f406" Dec 05 13:04:45 crc kubenswrapper[4763]: I1205 13:04:45.137255 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a68eda61c94ef8852897d41c1968c19180a15fe970a7d06c2686649b481f406"} err="failed to get container status \"4a68eda61c94ef8852897d41c1968c19180a15fe970a7d06c2686649b481f406\": rpc error: code = NotFound desc = could not find container \"4a68eda61c94ef8852897d41c1968c19180a15fe970a7d06c2686649b481f406\": container with ID starting with 4a68eda61c94ef8852897d41c1968c19180a15fe970a7d06c2686649b481f406 not found: ID does not exist" Dec 05 13:04:45 crc kubenswrapper[4763]: I1205 13:04:45.137286 4763 scope.go:117] "RemoveContainer" 
containerID="ef38374d38322fccfd4c7a1316202ddb2a4b71a4feee12644e682f8d8cf1d777" Dec 05 13:04:45 crc kubenswrapper[4763]: E1205 13:04:45.137751 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef38374d38322fccfd4c7a1316202ddb2a4b71a4feee12644e682f8d8cf1d777\": container with ID starting with ef38374d38322fccfd4c7a1316202ddb2a4b71a4feee12644e682f8d8cf1d777 not found: ID does not exist" containerID="ef38374d38322fccfd4c7a1316202ddb2a4b71a4feee12644e682f8d8cf1d777" Dec 05 13:04:45 crc kubenswrapper[4763]: I1205 13:04:45.137801 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef38374d38322fccfd4c7a1316202ddb2a4b71a4feee12644e682f8d8cf1d777"} err="failed to get container status \"ef38374d38322fccfd4c7a1316202ddb2a4b71a4feee12644e682f8d8cf1d777\": rpc error: code = NotFound desc = could not find container \"ef38374d38322fccfd4c7a1316202ddb2a4b71a4feee12644e682f8d8cf1d777\": container with ID starting with ef38374d38322fccfd4c7a1316202ddb2a4b71a4feee12644e682f8d8cf1d777 not found: ID does not exist" Dec 05 13:04:45 crc kubenswrapper[4763]: I1205 13:04:45.800419 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a144ffe-c3e4-4a24-8e3a-3aa866ed4280" path="/var/lib/kubelet/pods/2a144ffe-c3e4-4a24-8e3a-3aa866ed4280/volumes" Dec 05 13:04:49 crc kubenswrapper[4763]: I1205 13:04:49.784093 4763 scope.go:117] "RemoveContainer" containerID="62e35b6bccff0ea60092714c9fbbca340c8f353206db0f49a2b69c7f3a1925d4" Dec 05 13:04:49 crc kubenswrapper[4763]: E1205 13:04:49.784923 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:05:02 crc kubenswrapper[4763]: I1205 13:05:02.788179 4763 scope.go:117] "RemoveContainer" containerID="62e35b6bccff0ea60092714c9fbbca340c8f353206db0f49a2b69c7f3a1925d4" Dec 05 13:05:02 crc kubenswrapper[4763]: E1205 13:05:02.790927 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:05:17 crc kubenswrapper[4763]: I1205 13:05:17.784800 4763 scope.go:117] "RemoveContainer" containerID="62e35b6bccff0ea60092714c9fbbca340c8f353206db0f49a2b69c7f3a1925d4" Dec 05 13:05:17 crc kubenswrapper[4763]: E1205 13:05:17.785630 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:05:30 crc kubenswrapper[4763]: I1205 13:05:30.784596 4763 scope.go:117] "RemoveContainer" 
containerID="62e35b6bccff0ea60092714c9fbbca340c8f353206db0f49a2b69c7f3a1925d4" Dec 05 13:05:30 crc kubenswrapper[4763]: E1205 13:05:30.785613 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:05:41 crc kubenswrapper[4763]: I1205 13:05:41.785011 4763 scope.go:117] "RemoveContainer" containerID="62e35b6bccff0ea60092714c9fbbca340c8f353206db0f49a2b69c7f3a1925d4" Dec 05 13:05:41 crc kubenswrapper[4763]: E1205 13:05:41.786033 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:05:55 crc kubenswrapper[4763]: I1205 13:05:55.798046 4763 scope.go:117] "RemoveContainer" containerID="62e35b6bccff0ea60092714c9fbbca340c8f353206db0f49a2b69c7f3a1925d4" Dec 05 13:05:55 crc kubenswrapper[4763]: E1205 13:05:55.799117 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:06:06 crc kubenswrapper[4763]: I1205 13:06:06.784197 4763 scope.go:117] "RemoveContainer" containerID="62e35b6bccff0ea60092714c9fbbca340c8f353206db0f49a2b69c7f3a1925d4" Dec 05 13:06:06 crc kubenswrapper[4763]: E1205 13:06:06.785041 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:06:21 crc kubenswrapper[4763]: I1205 13:06:21.784136 4763 scope.go:117] "RemoveContainer" containerID="62e35b6bccff0ea60092714c9fbbca340c8f353206db0f49a2b69c7f3a1925d4" Dec 05 13:06:21 crc kubenswrapper[4763]: E1205 13:06:21.784859 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:06:35 crc kubenswrapper[4763]: I1205 13:06:35.796019 4763 scope.go:117] "RemoveContainer" containerID="62e35b6bccff0ea60092714c9fbbca340c8f353206db0f49a2b69c7f3a1925d4" Dec 05 13:06:35 crc kubenswrapper[4763]: E1205 13:06:35.797120 4763 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:06:47 crc kubenswrapper[4763]: I1205 13:06:47.784044 4763 scope.go:117] "RemoveContainer" containerID="62e35b6bccff0ea60092714c9fbbca340c8f353206db0f49a2b69c7f3a1925d4" Dec 05 13:06:49 crc kubenswrapper[4763]: I1205 13:06:49.336812 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerStarted","Data":"6cb111387aaca0b2a39b308e8c28b211d65ff042187e0815f4be96631e9ef1e9"} Dec 05 13:08:00 crc kubenswrapper[4763]: I1205 13:08:00.142495 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zwf58"] Dec 05 13:08:00 crc kubenswrapper[4763]: E1205 13:08:00.143681 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a144ffe-c3e4-4a24-8e3a-3aa866ed4280" containerName="registry-server" Dec 05 13:08:00 crc kubenswrapper[4763]: I1205 13:08:00.143699 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a144ffe-c3e4-4a24-8e3a-3aa866ed4280" containerName="registry-server" Dec 05 13:08:00 crc kubenswrapper[4763]: E1205 13:08:00.143713 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a144ffe-c3e4-4a24-8e3a-3aa866ed4280" containerName="extract-content" Dec 05 13:08:00 crc kubenswrapper[4763]: I1205 13:08:00.143722 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a144ffe-c3e4-4a24-8e3a-3aa866ed4280" containerName="extract-content" Dec 05 13:08:00 crc kubenswrapper[4763]: E1205 13:08:00.143826 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a144ffe-c3e4-4a24-8e3a-3aa866ed4280" containerName="extract-utilities" Dec 05 13:08:00 crc kubenswrapper[4763]: I1205 13:08:00.143839 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a144ffe-c3e4-4a24-8e3a-3aa866ed4280" containerName="extract-utilities" Dec 05 13:08:00 crc kubenswrapper[4763]: I1205 13:08:00.145265 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a144ffe-c3e4-4a24-8e3a-3aa866ed4280" containerName="registry-server" Dec 05 13:08:00 crc kubenswrapper[4763]: I1205 13:08:00.147038 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zwf58" Dec 05 13:08:00 crc kubenswrapper[4763]: I1205 13:08:00.157571 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zwf58"] Dec 05 13:08:00 crc kubenswrapper[4763]: I1205 13:08:00.311151 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1320eb18-a39a-4787-b069-51de6fc90985-catalog-content\") pod \"community-operators-zwf58\" (UID: \"1320eb18-a39a-4787-b069-51de6fc90985\") " pod="openshift-marketplace/community-operators-zwf58" Dec 05 13:08:00 crc kubenswrapper[4763]: I1205 13:08:00.311235 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1320eb18-a39a-4787-b069-51de6fc90985-utilities\") pod \"community-operators-zwf58\" (UID: \"1320eb18-a39a-4787-b069-51de6fc90985\") " pod="openshift-marketplace/community-operators-zwf58" Dec 05 13:08:00 crc kubenswrapper[4763]: I1205 13:08:00.311300 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsdvv\" (UniqueName: \"kubernetes.io/projected/1320eb18-a39a-4787-b069-51de6fc90985-kube-api-access-gsdvv\") pod \"community-operators-zwf58\" (UID: \"1320eb18-a39a-4787-b069-51de6fc90985\") " pod="openshift-marketplace/community-operators-zwf58" Dec 05 13:08:00 crc kubenswrapper[4763]: I1205 13:08:00.414705 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1320eb18-a39a-4787-b069-51de6fc90985-utilities\") pod \"community-operators-zwf58\" (UID: \"1320eb18-a39a-4787-b069-51de6fc90985\") " pod="openshift-marketplace/community-operators-zwf58" Dec 05 13:08:00 crc kubenswrapper[4763]: I1205 13:08:00.414809 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsdvv\" (UniqueName: \"kubernetes.io/projected/1320eb18-a39a-4787-b069-51de6fc90985-kube-api-access-gsdvv\") pod \"community-operators-zwf58\" (UID: \"1320eb18-a39a-4787-b069-51de6fc90985\") " pod="openshift-marketplace/community-operators-zwf58" Dec 05 13:08:00 crc kubenswrapper[4763]: I1205 13:08:00.414943 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1320eb18-a39a-4787-b069-51de6fc90985-catalog-content\") pod \"community-operators-zwf58\" (UID: \"1320eb18-a39a-4787-b069-51de6fc90985\") " pod="openshift-marketplace/community-operators-zwf58" Dec 05 13:08:00 crc kubenswrapper[4763]: I1205 13:08:00.415299 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1320eb18-a39a-4787-b069-51de6fc90985-utilities\") pod \"community-operators-zwf58\" (UID: \"1320eb18-a39a-4787-b069-51de6fc90985\") " pod="openshift-marketplace/community-operators-zwf58" Dec 05 13:08:00 crc kubenswrapper[4763]: I1205 13:08:00.415344 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1320eb18-a39a-4787-b069-51de6fc90985-catalog-content\") pod \"community-operators-zwf58\" (UID: \"1320eb18-a39a-4787-b069-51de6fc90985\") " pod="openshift-marketplace/community-operators-zwf58" Dec 05 13:08:00 crc kubenswrapper[4763]: I1205 13:08:00.436195 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gsdvv\" (UniqueName: \"kubernetes.io/projected/1320eb18-a39a-4787-b069-51de6fc90985-kube-api-access-gsdvv\") pod \"community-operators-zwf58\" (UID: \"1320eb18-a39a-4787-b069-51de6fc90985\") " pod="openshift-marketplace/community-operators-zwf58" Dec 05 13:08:00 crc kubenswrapper[4763]: I1205 13:08:00.474712 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zwf58" Dec 05 13:08:01 crc kubenswrapper[4763]: I1205 13:08:01.035298 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zwf58"] Dec 05 13:08:01 crc kubenswrapper[4763]: I1205 13:08:01.064347 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwf58" event={"ID":"1320eb18-a39a-4787-b069-51de6fc90985","Type":"ContainerStarted","Data":"e0fe562b03a5542504c7de9c46c6209ad606addf1d226b74ad0c6d88b8875fc8"} Dec 05 13:08:02 crc kubenswrapper[4763]: I1205 13:08:02.076119 4763 generic.go:334] "Generic (PLEG): container finished" podID="1320eb18-a39a-4787-b069-51de6fc90985" containerID="6a912e09b4c3a47c5209fe09e514d25259aba1bd2ac476aa2fb49b87ef15eb26" exitCode=0 Dec 05 13:08:02 crc kubenswrapper[4763]: I1205 13:08:02.076193 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwf58" event={"ID":"1320eb18-a39a-4787-b069-51de6fc90985","Type":"ContainerDied","Data":"6a912e09b4c3a47c5209fe09e514d25259aba1bd2ac476aa2fb49b87ef15eb26"} Dec 05 13:08:05 crc kubenswrapper[4763]: I1205 13:08:05.107896 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwf58" event={"ID":"1320eb18-a39a-4787-b069-51de6fc90985","Type":"ContainerStarted","Data":"1ef7d316086c9933a07dffe05d294d345ddf4ed75c7fe227a656a8f63355cad2"} Dec 05 13:08:05 crc kubenswrapper[4763]: E1205 13:08:05.250359 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1320eb18_a39a_4787_b069_51de6fc90985.slice/crio-1ef7d316086c9933a07dffe05d294d345ddf4ed75c7fe227a656a8f63355cad2.scope\": RecentStats: unable to find data in memory cache]" Dec 05 13:08:06 crc kubenswrapper[4763]: I1205 13:08:06.118089 4763 generic.go:334] "Generic (PLEG): container finished" podID="1320eb18-a39a-4787-b069-51de6fc90985" containerID="1ef7d316086c9933a07dffe05d294d345ddf4ed75c7fe227a656a8f63355cad2" exitCode=0 Dec 05 13:08:06 crc kubenswrapper[4763]: I1205 13:08:06.118132 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwf58" event={"ID":"1320eb18-a39a-4787-b069-51de6fc90985","Type":"ContainerDied","Data":"1ef7d316086c9933a07dffe05d294d345ddf4ed75c7fe227a656a8f63355cad2"} Dec 05 13:08:06 crc kubenswrapper[4763]: I1205 13:08:06.521671 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9b72d"] Dec 05 13:08:06 crc kubenswrapper[4763]: I1205 13:08:06.525032 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9b72d"
Dec 05 13:08:06 crc kubenswrapper[4763]: I1205 13:08:06.533292 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gfn4\" (UniqueName: \"kubernetes.io/projected/1e362e2b-ed3a-4ca6-a986-28fdc927c226-kube-api-access-7gfn4\") pod \"redhat-marketplace-9b72d\" (UID: \"1e362e2b-ed3a-4ca6-a986-28fdc927c226\") " pod="openshift-marketplace/redhat-marketplace-9b72d"
Dec 05 13:08:06 crc kubenswrapper[4763]: I1205 13:08:06.533346 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e362e2b-ed3a-4ca6-a986-28fdc927c226-catalog-content\") pod \"redhat-marketplace-9b72d\" (UID: \"1e362e2b-ed3a-4ca6-a986-28fdc927c226\") " pod="openshift-marketplace/redhat-marketplace-9b72d"
Dec 05 13:08:06 crc kubenswrapper[4763]: I1205 13:08:06.533381 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e362e2b-ed3a-4ca6-a986-28fdc927c226-utilities\") pod \"redhat-marketplace-9b72d\" (UID: \"1e362e2b-ed3a-4ca6-a986-28fdc927c226\") " pod="openshift-marketplace/redhat-marketplace-9b72d"
Dec 05 13:08:06 crc kubenswrapper[4763]: I1205 13:08:06.547283 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9b72d"]
Dec 05 13:08:06 crc kubenswrapper[4763]: I1205 13:08:06.635135 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gfn4\" (UniqueName: \"kubernetes.io/projected/1e362e2b-ed3a-4ca6-a986-28fdc927c226-kube-api-access-7gfn4\") pod \"redhat-marketplace-9b72d\" (UID: \"1e362e2b-ed3a-4ca6-a986-28fdc927c226\") " pod="openshift-marketplace/redhat-marketplace-9b72d"
Dec 05 13:08:06 crc kubenswrapper[4763]: I1205 13:08:06.635186 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e362e2b-ed3a-4ca6-a986-28fdc927c226-catalog-content\") pod \"redhat-marketplace-9b72d\" (UID: \"1e362e2b-ed3a-4ca6-a986-28fdc927c226\") " pod="openshift-marketplace/redhat-marketplace-9b72d"
Dec 05 13:08:06 crc kubenswrapper[4763]: I1205 13:08:06.635217 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e362e2b-ed3a-4ca6-a986-28fdc927c226-utilities\") pod \"redhat-marketplace-9b72d\" (UID: \"1e362e2b-ed3a-4ca6-a986-28fdc927c226\") " pod="openshift-marketplace/redhat-marketplace-9b72d"
Dec 05 13:08:06 crc kubenswrapper[4763]: I1205 13:08:06.635865 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e362e2b-ed3a-4ca6-a986-28fdc927c226-utilities\") pod \"redhat-marketplace-9b72d\" (UID: \"1e362e2b-ed3a-4ca6-a986-28fdc927c226\") " pod="openshift-marketplace/redhat-marketplace-9b72d"
Dec 05 13:08:06 crc kubenswrapper[4763]: I1205 13:08:06.635952 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e362e2b-ed3a-4ca6-a986-28fdc927c226-catalog-content\") pod \"redhat-marketplace-9b72d\" (UID: \"1e362e2b-ed3a-4ca6-a986-28fdc927c226\") " pod="openshift-marketplace/redhat-marketplace-9b72d"
Dec 05 13:08:06 crc kubenswrapper[4763]: I1205 13:08:06.668746 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gfn4\" (UniqueName: \"kubernetes.io/projected/1e362e2b-ed3a-4ca6-a986-28fdc927c226-kube-api-access-7gfn4\") pod \"redhat-marketplace-9b72d\" (UID: \"1e362e2b-ed3a-4ca6-a986-28fdc927c226\") " pod="openshift-marketplace/redhat-marketplace-9b72d"
Dec 05 13:08:06 crc kubenswrapper[4763]: I1205 13:08:06.848831 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9b72d"
Dec 05 13:08:07 crc kubenswrapper[4763]: I1205 13:08:07.150877 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwf58" event={"ID":"1320eb18-a39a-4787-b069-51de6fc90985","Type":"ContainerStarted","Data":"a1bed21a3512dd3903c8c77945187a432cbef37041fd0f2b84e2b9f10c5cc400"}
Dec 05 13:08:07 crc kubenswrapper[4763]: I1205 13:08:07.180668 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zwf58" podStartSLOduration=2.654274818 podStartE2EDuration="7.180648123s" podCreationTimestamp="2025-12-05 13:08:00 +0000 UTC" firstStartedPulling="2025-12-05 13:08:02.077775206 +0000 UTC m=+4766.570489929" lastFinishedPulling="2025-12-05 13:08:06.604148511 +0000 UTC m=+4771.096863234" observedRunningTime="2025-12-05 13:08:07.172324049 +0000 UTC m=+4771.665038782" watchObservedRunningTime="2025-12-05 13:08:07.180648123 +0000 UTC m=+4771.673362846"
Dec 05 13:08:07 crc kubenswrapper[4763]: I1205 13:08:07.433573 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9b72d"]
Dec 05 13:08:08 crc kubenswrapper[4763]: I1205 13:08:08.166782 4763 generic.go:334] "Generic (PLEG): container finished" podID="1e362e2b-ed3a-4ca6-a986-28fdc927c226" containerID="1ecf1e1f005ae513ac44d45f05fb3e5d895a997694aa9926ad3a6b18dfe18bb8" exitCode=0
Dec 05 13:08:08 crc kubenswrapper[4763]: I1205 13:08:08.166875 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9b72d" event={"ID":"1e362e2b-ed3a-4ca6-a986-28fdc927c226","Type":"ContainerDied","Data":"1ecf1e1f005ae513ac44d45f05fb3e5d895a997694aa9926ad3a6b18dfe18bb8"}
Dec 05 13:08:08 crc kubenswrapper[4763]: I1205 13:08:08.168309 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9b72d" event={"ID":"1e362e2b-ed3a-4ca6-a986-28fdc927c226","Type":"ContainerStarted","Data":"1b2849c7e159402f6e0080753734d48abbf85038f509493898cceb0b2d18a3d8"}
Dec 05 13:08:09 crc kubenswrapper[4763]: I1205 13:08:09.182731 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9b72d" event={"ID":"1e362e2b-ed3a-4ca6-a986-28fdc927c226","Type":"ContainerStarted","Data":"eff69f09c35a9f8797e47035a038a450df343d813925a3861fb6285a44e21a1e"}
Dec 05 13:08:10 crc kubenswrapper[4763]: I1205 13:08:10.195515 4763 generic.go:334] "Generic (PLEG): container finished" podID="1e362e2b-ed3a-4ca6-a986-28fdc927c226" containerID="eff69f09c35a9f8797e47035a038a450df343d813925a3861fb6285a44e21a1e" exitCode=0
Dec 05 13:08:10 crc kubenswrapper[4763]: I1205 13:08:10.195558 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9b72d" event={"ID":"1e362e2b-ed3a-4ca6-a986-28fdc927c226","Type":"ContainerDied","Data":"eff69f09c35a9f8797e47035a038a450df343d813925a3861fb6285a44e21a1e"}
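The two numbers in the pod_startup_latency_tracker entry above are consistent with podStartE2EDuration being watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration being that value minus the image-pull window (lastFinishedPulling minus firstStartedPulling). A minimal, self-contained Go sketch that recomputes both from the logged timestamps; the layout string and variable names are ours, not kubelet's:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Layout matching the "2025-12-05 13:08:00 +0000 UTC" form in the log;
	// the fractional-second digits are optional.
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}

	created := parse("2025-12-05 13:08:00 +0000 UTC")
	firstPull := parse("2025-12-05 13:08:02.077775206 +0000 UTC")
	lastPull := parse("2025-12-05 13:08:06.604148511 +0000 UTC")
	observed := parse("2025-12-05 13:08:07.180648123 +0000 UTC") // watchObservedRunningTime

	e2e := observed.Sub(created)       // 7.180648123s, as logged
	pull := lastPull.Sub(firstPull)    // time spent pulling images
	slo := e2e - pull                  // 2.654274818s, as logged

	fmt.Printf("podStartE2EDuration=%s podStartSLOduration=%s\n", e2e, slo)
}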
pod="openshift-marketplace/community-operators-zwf58" Dec 05 13:08:10 crc kubenswrapper[4763]: I1205 13:08:10.475863 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zwf58" Dec 05 13:08:10 crc kubenswrapper[4763]: I1205 13:08:10.545395 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zwf58" Dec 05 13:08:11 crc kubenswrapper[4763]: I1205 13:08:11.253999 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zwf58" Dec 05 13:08:12 crc kubenswrapper[4763]: I1205 13:08:12.215358 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9b72d" event={"ID":"1e362e2b-ed3a-4ca6-a986-28fdc927c226","Type":"ContainerStarted","Data":"6d0f11c91c55f347fe7bfee60a26b63b91f48c5a05df5b0cdf95af85ce84fbcf"} Dec 05 13:08:12 crc kubenswrapper[4763]: I1205 13:08:12.238582 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9b72d" podStartSLOduration=3.350222876 podStartE2EDuration="6.238564349s" podCreationTimestamp="2025-12-05 13:08:06 +0000 UTC" firstStartedPulling="2025-12-05 13:08:08.168227434 +0000 UTC m=+4772.660942157" lastFinishedPulling="2025-12-05 13:08:11.056568907 +0000 UTC m=+4775.549283630" observedRunningTime="2025-12-05 13:08:12.232528396 +0000 UTC m=+4776.725243119" watchObservedRunningTime="2025-12-05 13:08:12.238564349 +0000 UTC m=+4776.731279072" Dec 05 13:08:12 crc kubenswrapper[4763]: I1205 13:08:12.913472 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zwf58"] Dec 05 13:08:13 crc kubenswrapper[4763]: I1205 13:08:13.226784 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zwf58" podUID="1320eb18-a39a-4787-b069-51de6fc90985" containerName="registry-server" containerID="cri-o://a1bed21a3512dd3903c8c77945187a432cbef37041fd0f2b84e2b9f10c5cc400" gracePeriod=2 Dec 05 13:08:13 crc kubenswrapper[4763]: I1205 13:08:13.758869 4763 util.go:48] "No ready sandbox for pod can be found. 
Dec 05 13:08:13 crc kubenswrapper[4763]: I1205 13:08:13.758869 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zwf58"
Dec 05 13:08:13 crc kubenswrapper[4763]: I1205 13:08:13.874435 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsdvv\" (UniqueName: \"kubernetes.io/projected/1320eb18-a39a-4787-b069-51de6fc90985-kube-api-access-gsdvv\") pod \"1320eb18-a39a-4787-b069-51de6fc90985\" (UID: \"1320eb18-a39a-4787-b069-51de6fc90985\") "
Dec 05 13:08:13 crc kubenswrapper[4763]: I1205 13:08:13.874715 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1320eb18-a39a-4787-b069-51de6fc90985-catalog-content\") pod \"1320eb18-a39a-4787-b069-51de6fc90985\" (UID: \"1320eb18-a39a-4787-b069-51de6fc90985\") "
Dec 05 13:08:13 crc kubenswrapper[4763]: I1205 13:08:13.874795 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1320eb18-a39a-4787-b069-51de6fc90985-utilities\") pod \"1320eb18-a39a-4787-b069-51de6fc90985\" (UID: \"1320eb18-a39a-4787-b069-51de6fc90985\") "
Dec 05 13:08:13 crc kubenswrapper[4763]: I1205 13:08:13.875592 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1320eb18-a39a-4787-b069-51de6fc90985-utilities" (OuterVolumeSpecName: "utilities") pod "1320eb18-a39a-4787-b069-51de6fc90985" (UID: "1320eb18-a39a-4787-b069-51de6fc90985"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 13:08:13 crc kubenswrapper[4763]: I1205 13:08:13.876798 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1320eb18-a39a-4787-b069-51de6fc90985-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 13:08:13 crc kubenswrapper[4763]: I1205 13:08:13.881645 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1320eb18-a39a-4787-b069-51de6fc90985-kube-api-access-gsdvv" (OuterVolumeSpecName: "kube-api-access-gsdvv") pod "1320eb18-a39a-4787-b069-51de6fc90985" (UID: "1320eb18-a39a-4787-b069-51de6fc90985"). InnerVolumeSpecName "kube-api-access-gsdvv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 13:08:13 crc kubenswrapper[4763]: I1205 13:08:13.933876 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1320eb18-a39a-4787-b069-51de6fc90985-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1320eb18-a39a-4787-b069-51de6fc90985" (UID: "1320eb18-a39a-4787-b069-51de6fc90985"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 13:08:13 crc kubenswrapper[4763]: I1205 13:08:13.978776 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1320eb18-a39a-4787-b069-51de6fc90985-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 13:08:13 crc kubenswrapper[4763]: I1205 13:08:13.979060 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsdvv\" (UniqueName: \"kubernetes.io/projected/1320eb18-a39a-4787-b069-51de6fc90985-kube-api-access-gsdvv\") on node \"crc\" DevicePath \"\""
Dec 05 13:08:14 crc kubenswrapper[4763]: I1205 13:08:14.237781 4763 generic.go:334] "Generic (PLEG): container finished" podID="1320eb18-a39a-4787-b069-51de6fc90985" containerID="a1bed21a3512dd3903c8c77945187a432cbef37041fd0f2b84e2b9f10c5cc400" exitCode=0
Dec 05 13:08:14 crc kubenswrapper[4763]: I1205 13:08:14.237833 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwf58" event={"ID":"1320eb18-a39a-4787-b069-51de6fc90985","Type":"ContainerDied","Data":"a1bed21a3512dd3903c8c77945187a432cbef37041fd0f2b84e2b9f10c5cc400"}
Dec 05 13:08:14 crc kubenswrapper[4763]: I1205 13:08:14.237868 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwf58" event={"ID":"1320eb18-a39a-4787-b069-51de6fc90985","Type":"ContainerDied","Data":"e0fe562b03a5542504c7de9c46c6209ad606addf1d226b74ad0c6d88b8875fc8"}
Dec 05 13:08:14 crc kubenswrapper[4763]: I1205 13:08:14.237888 4763 scope.go:117] "RemoveContainer" containerID="a1bed21a3512dd3903c8c77945187a432cbef37041fd0f2b84e2b9f10c5cc400"
Dec 05 13:08:14 crc kubenswrapper[4763]: I1205 13:08:14.238527 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zwf58"
Dec 05 13:08:14 crc kubenswrapper[4763]: I1205 13:08:14.260160 4763 scope.go:117] "RemoveContainer" containerID="1ef7d316086c9933a07dffe05d294d345ddf4ed75c7fe227a656a8f63355cad2"
Dec 05 13:08:14 crc kubenswrapper[4763]: I1205 13:08:14.277131 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zwf58"]
Dec 05 13:08:14 crc kubenswrapper[4763]: I1205 13:08:14.285630 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zwf58"]
Dec 05 13:08:14 crc kubenswrapper[4763]: I1205 13:08:14.299223 4763 scope.go:117] "RemoveContainer" containerID="6a912e09b4c3a47c5209fe09e514d25259aba1bd2ac476aa2fb49b87ef15eb26"
Dec 05 13:08:14 crc kubenswrapper[4763]: I1205 13:08:14.335054 4763 scope.go:117] "RemoveContainer" containerID="a1bed21a3512dd3903c8c77945187a432cbef37041fd0f2b84e2b9f10c5cc400"
Dec 05 13:08:14 crc kubenswrapper[4763]: E1205 13:08:14.335435 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1bed21a3512dd3903c8c77945187a432cbef37041fd0f2b84e2b9f10c5cc400\": container with ID starting with a1bed21a3512dd3903c8c77945187a432cbef37041fd0f2b84e2b9f10c5cc400 not found: ID does not exist" containerID="a1bed21a3512dd3903c8c77945187a432cbef37041fd0f2b84e2b9f10c5cc400"
Dec 05 13:08:14 crc kubenswrapper[4763]: I1205 13:08:14.335473 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1bed21a3512dd3903c8c77945187a432cbef37041fd0f2b84e2b9f10c5cc400"} err="failed to get container status \"a1bed21a3512dd3903c8c77945187a432cbef37041fd0f2b84e2b9f10c5cc400\": rpc error: code = NotFound desc = could not find container \"a1bed21a3512dd3903c8c77945187a432cbef37041fd0f2b84e2b9f10c5cc400\": container with ID starting with a1bed21a3512dd3903c8c77945187a432cbef37041fd0f2b84e2b9f10c5cc400 not found: ID does not exist"
Dec 05 13:08:14 crc kubenswrapper[4763]: I1205 13:08:14.335498 4763 scope.go:117] "RemoveContainer" containerID="1ef7d316086c9933a07dffe05d294d345ddf4ed75c7fe227a656a8f63355cad2"
Dec 05 13:08:14 crc kubenswrapper[4763]: E1205 13:08:14.335925 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ef7d316086c9933a07dffe05d294d345ddf4ed75c7fe227a656a8f63355cad2\": container with ID starting with 1ef7d316086c9933a07dffe05d294d345ddf4ed75c7fe227a656a8f63355cad2 not found: ID does not exist" containerID="1ef7d316086c9933a07dffe05d294d345ddf4ed75c7fe227a656a8f63355cad2"
Dec 05 13:08:14 crc kubenswrapper[4763]: I1205 13:08:14.336026 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ef7d316086c9933a07dffe05d294d345ddf4ed75c7fe227a656a8f63355cad2"} err="failed to get container status \"1ef7d316086c9933a07dffe05d294d345ddf4ed75c7fe227a656a8f63355cad2\": rpc error: code = NotFound desc = could not find container \"1ef7d316086c9933a07dffe05d294d345ddf4ed75c7fe227a656a8f63355cad2\": container with ID starting with 1ef7d316086c9933a07dffe05d294d345ddf4ed75c7fe227a656a8f63355cad2 not found: ID does not exist"
Dec 05 13:08:14 crc kubenswrapper[4763]: I1205 13:08:14.336080 4763 scope.go:117] "RemoveContainer" containerID="6a912e09b4c3a47c5209fe09e514d25259aba1bd2ac476aa2fb49b87ef15eb26"
Dec 05 13:08:14 crc kubenswrapper[4763]: E1205 13:08:14.336423 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a912e09b4c3a47c5209fe09e514d25259aba1bd2ac476aa2fb49b87ef15eb26\": container with ID starting with 6a912e09b4c3a47c5209fe09e514d25259aba1bd2ac476aa2fb49b87ef15eb26 not found: ID does not exist" containerID="6a912e09b4c3a47c5209fe09e514d25259aba1bd2ac476aa2fb49b87ef15eb26"
Dec 05 13:08:14 crc kubenswrapper[4763]: I1205 13:08:14.336452 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a912e09b4c3a47c5209fe09e514d25259aba1bd2ac476aa2fb49b87ef15eb26"} err="failed to get container status \"6a912e09b4c3a47c5209fe09e514d25259aba1bd2ac476aa2fb49b87ef15eb26\": rpc error: code = NotFound desc = could not find container \"6a912e09b4c3a47c5209fe09e514d25259aba1bd2ac476aa2fb49b87ef15eb26\": container with ID starting with 6a912e09b4c3a47c5209fe09e514d25259aba1bd2ac476aa2fb49b87ef15eb26 not found: ID does not exist"
Dec 05 13:08:15 crc kubenswrapper[4763]: I1205 13:08:15.802559 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1320eb18-a39a-4787-b069-51de6fc90985" path="/var/lib/kubelet/pods/1320eb18-a39a-4787-b069-51de6fc90985/volumes"
Dec 05 13:08:16 crc kubenswrapper[4763]: I1205 13:08:16.850148 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9b72d"
Dec 05 13:08:16 crc kubenswrapper[4763]: I1205 13:08:16.851196 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9b72d"
Dec 05 13:08:16 crc kubenswrapper[4763]: I1205 13:08:16.901941 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9b72d"
Dec 05 13:08:17 crc kubenswrapper[4763]: I1205 13:08:17.311575 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9b72d"
Dec 05 13:08:17 crc kubenswrapper[4763]: I1205 13:08:17.912617 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9b72d"]
Dec 05 13:08:19 crc kubenswrapper[4763]: I1205 13:08:19.283384 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9b72d" podUID="1e362e2b-ed3a-4ca6-a986-28fdc927c226" containerName="registry-server" containerID="cri-o://6d0f11c91c55f347fe7bfee60a26b63b91f48c5a05df5b0cdf95af85ce84fbcf" gracePeriod=2
Dec 05 13:08:19 crc kubenswrapper[4763]: I1205 13:08:19.863266 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9b72d"
Dec 05 13:08:19 crc kubenswrapper[4763]: I1205 13:08:19.999634 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e362e2b-ed3a-4ca6-a986-28fdc927c226-catalog-content\") pod \"1e362e2b-ed3a-4ca6-a986-28fdc927c226\" (UID: \"1e362e2b-ed3a-4ca6-a986-28fdc927c226\") "
Dec 05 13:08:19 crc kubenswrapper[4763]: I1205 13:08:19.999928 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gfn4\" (UniqueName: \"kubernetes.io/projected/1e362e2b-ed3a-4ca6-a986-28fdc927c226-kube-api-access-7gfn4\") pod \"1e362e2b-ed3a-4ca6-a986-28fdc927c226\" (UID: \"1e362e2b-ed3a-4ca6-a986-28fdc927c226\") "
Dec 05 13:08:20 crc kubenswrapper[4763]: I1205 13:08:20.000038 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e362e2b-ed3a-4ca6-a986-28fdc927c226-utilities\") pod \"1e362e2b-ed3a-4ca6-a986-28fdc927c226\" (UID: \"1e362e2b-ed3a-4ca6-a986-28fdc927c226\") "
Dec 05 13:08:20 crc kubenswrapper[4763]: I1205 13:08:20.001493 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e362e2b-ed3a-4ca6-a986-28fdc927c226-utilities" (OuterVolumeSpecName: "utilities") pod "1e362e2b-ed3a-4ca6-a986-28fdc927c226" (UID: "1e362e2b-ed3a-4ca6-a986-28fdc927c226"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 13:08:20 crc kubenswrapper[4763]: I1205 13:08:20.008606 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e362e2b-ed3a-4ca6-a986-28fdc927c226-kube-api-access-7gfn4" (OuterVolumeSpecName: "kube-api-access-7gfn4") pod "1e362e2b-ed3a-4ca6-a986-28fdc927c226" (UID: "1e362e2b-ed3a-4ca6-a986-28fdc927c226"). InnerVolumeSpecName "kube-api-access-7gfn4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 13:08:20 crc kubenswrapper[4763]: I1205 13:08:20.020531 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e362e2b-ed3a-4ca6-a986-28fdc927c226-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1e362e2b-ed3a-4ca6-a986-28fdc927c226" (UID: "1e362e2b-ed3a-4ca6-a986-28fdc927c226"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 13:08:20 crc kubenswrapper[4763]: I1205 13:08:20.102793 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e362e2b-ed3a-4ca6-a986-28fdc927c226-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 13:08:20 crc kubenswrapper[4763]: I1205 13:08:20.102827 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e362e2b-ed3a-4ca6-a986-28fdc927c226-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 13:08:20 crc kubenswrapper[4763]: I1205 13:08:20.102837 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gfn4\" (UniqueName: \"kubernetes.io/projected/1e362e2b-ed3a-4ca6-a986-28fdc927c226-kube-api-access-7gfn4\") on node \"crc\" DevicePath \"\""
Dec 05 13:08:20 crc kubenswrapper[4763]: I1205 13:08:20.294352 4763 generic.go:334] "Generic (PLEG): container finished" podID="1e362e2b-ed3a-4ca6-a986-28fdc927c226" containerID="6d0f11c91c55f347fe7bfee60a26b63b91f48c5a05df5b0cdf95af85ce84fbcf" exitCode=0
Dec 05 13:08:20 crc kubenswrapper[4763]: I1205 13:08:20.294396 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9b72d"
Dec 05 13:08:20 crc kubenswrapper[4763]: I1205 13:08:20.294394 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9b72d" event={"ID":"1e362e2b-ed3a-4ca6-a986-28fdc927c226","Type":"ContainerDied","Data":"6d0f11c91c55f347fe7bfee60a26b63b91f48c5a05df5b0cdf95af85ce84fbcf"}
Dec 05 13:08:20 crc kubenswrapper[4763]: I1205 13:08:20.294538 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9b72d" event={"ID":"1e362e2b-ed3a-4ca6-a986-28fdc927c226","Type":"ContainerDied","Data":"1b2849c7e159402f6e0080753734d48abbf85038f509493898cceb0b2d18a3d8"}
Dec 05 13:08:20 crc kubenswrapper[4763]: I1205 13:08:20.294575 4763 scope.go:117] "RemoveContainer" containerID="6d0f11c91c55f347fe7bfee60a26b63b91f48c5a05df5b0cdf95af85ce84fbcf"
Dec 05 13:08:20 crc kubenswrapper[4763]: I1205 13:08:20.315711 4763 scope.go:117] "RemoveContainer" containerID="eff69f09c35a9f8797e47035a038a450df343d813925a3861fb6285a44e21a1e"
Dec 05 13:08:20 crc kubenswrapper[4763]: I1205 13:08:20.328571 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9b72d"]
Dec 05 13:08:20 crc kubenswrapper[4763]: I1205 13:08:20.340180 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9b72d"]
Dec 05 13:08:20 crc kubenswrapper[4763]: I1205 13:08:20.346526 4763 scope.go:117] "RemoveContainer" containerID="1ecf1e1f005ae513ac44d45f05fb3e5d895a997694aa9926ad3a6b18dfe18bb8"
Dec 05 13:08:20 crc kubenswrapper[4763]: I1205 13:08:20.396182 4763 scope.go:117] "RemoveContainer" containerID="6d0f11c91c55f347fe7bfee60a26b63b91f48c5a05df5b0cdf95af85ce84fbcf"
Dec 05 13:08:20 crc kubenswrapper[4763]: E1205 13:08:20.396864 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d0f11c91c55f347fe7bfee60a26b63b91f48c5a05df5b0cdf95af85ce84fbcf\": container with ID starting with 6d0f11c91c55f347fe7bfee60a26b63b91f48c5a05df5b0cdf95af85ce84fbcf not found: ID does not exist" containerID="6d0f11c91c55f347fe7bfee60a26b63b91f48c5a05df5b0cdf95af85ce84fbcf"
Dec 05 13:08:20 crc kubenswrapper[4763]: I1205 13:08:20.396912 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d0f11c91c55f347fe7bfee60a26b63b91f48c5a05df5b0cdf95af85ce84fbcf"} err="failed to get container status \"6d0f11c91c55f347fe7bfee60a26b63b91f48c5a05df5b0cdf95af85ce84fbcf\": rpc error: code = NotFound desc = could not find container \"6d0f11c91c55f347fe7bfee60a26b63b91f48c5a05df5b0cdf95af85ce84fbcf\": container with ID starting with 6d0f11c91c55f347fe7bfee60a26b63b91f48c5a05df5b0cdf95af85ce84fbcf not found: ID does not exist"
Dec 05 13:08:20 crc kubenswrapper[4763]: I1205 13:08:20.396944 4763 scope.go:117] "RemoveContainer" containerID="eff69f09c35a9f8797e47035a038a450df343d813925a3861fb6285a44e21a1e"
Dec 05 13:08:20 crc kubenswrapper[4763]: E1205 13:08:20.397645 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eff69f09c35a9f8797e47035a038a450df343d813925a3861fb6285a44e21a1e\": container with ID starting with eff69f09c35a9f8797e47035a038a450df343d813925a3861fb6285a44e21a1e not found: ID does not exist" containerID="eff69f09c35a9f8797e47035a038a450df343d813925a3861fb6285a44e21a1e"
Dec 05 13:08:20 crc kubenswrapper[4763]: I1205 13:08:20.397683 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eff69f09c35a9f8797e47035a038a450df343d813925a3861fb6285a44e21a1e"} err="failed to get container status \"eff69f09c35a9f8797e47035a038a450df343d813925a3861fb6285a44e21a1e\": rpc error: code = NotFound desc = could not find container \"eff69f09c35a9f8797e47035a038a450df343d813925a3861fb6285a44e21a1e\": container with ID starting with eff69f09c35a9f8797e47035a038a450df343d813925a3861fb6285a44e21a1e not found: ID does not exist"
Dec 05 13:08:20 crc kubenswrapper[4763]: I1205 13:08:20.397712 4763 scope.go:117] "RemoveContainer" containerID="1ecf1e1f005ae513ac44d45f05fb3e5d895a997694aa9926ad3a6b18dfe18bb8"
Dec 05 13:08:20 crc kubenswrapper[4763]: E1205 13:08:20.402115 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ecf1e1f005ae513ac44d45f05fb3e5d895a997694aa9926ad3a6b18dfe18bb8\": container with ID starting with 1ecf1e1f005ae513ac44d45f05fb3e5d895a997694aa9926ad3a6b18dfe18bb8 not found: ID does not exist" containerID="1ecf1e1f005ae513ac44d45f05fb3e5d895a997694aa9926ad3a6b18dfe18bb8"
Dec 05 13:08:20 crc kubenswrapper[4763]: I1205 13:08:20.402157 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ecf1e1f005ae513ac44d45f05fb3e5d895a997694aa9926ad3a6b18dfe18bb8"} err="failed to get container status \"1ecf1e1f005ae513ac44d45f05fb3e5d895a997694aa9926ad3a6b18dfe18bb8\": rpc error: code = NotFound desc = could not find container \"1ecf1e1f005ae513ac44d45f05fb3e5d895a997694aa9926ad3a6b18dfe18bb8\": container with ID starting with 1ecf1e1f005ae513ac44d45f05fb3e5d895a997694aa9926ad3a6b18dfe18bb8 not found: ID does not exist"
Dec 05 13:08:21 crc kubenswrapper[4763]: I1205 13:08:21.804095 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e362e2b-ed3a-4ca6-a986-28fdc927c226" path="/var/lib/kubelet/pods/1e362e2b-ed3a-4ca6-a986-28fdc927c226/volumes"
Dec 05 13:08:32 crc kubenswrapper[4763]: I1205 13:08:32.797565 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fnw2l"]
Dec 05 13:08:32 crc kubenswrapper[4763]: E1205 13:08:32.799144 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e362e2b-ed3a-4ca6-a986-28fdc927c226" containerName="extract-utilities"
Dec 05 13:08:32 crc kubenswrapper[4763]: I1205 13:08:32.799162 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e362e2b-ed3a-4ca6-a986-28fdc927c226" containerName="extract-utilities"
Dec 05 13:08:32 crc kubenswrapper[4763]: E1205 13:08:32.799181 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e362e2b-ed3a-4ca6-a986-28fdc927c226" containerName="registry-server"
Dec 05 13:08:32 crc kubenswrapper[4763]: I1205 13:08:32.799187 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e362e2b-ed3a-4ca6-a986-28fdc927c226" containerName="registry-server"
Dec 05 13:08:32 crc kubenswrapper[4763]: E1205 13:08:32.799210 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e362e2b-ed3a-4ca6-a986-28fdc927c226" containerName="extract-content"
Dec 05 13:08:32 crc kubenswrapper[4763]: I1205 13:08:32.799216 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e362e2b-ed3a-4ca6-a986-28fdc927c226" containerName="extract-content"
Dec 05 13:08:32 crc kubenswrapper[4763]: E1205 13:08:32.799242 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1320eb18-a39a-4787-b069-51de6fc90985" containerName="extract-content"
Dec 05 13:08:32 crc kubenswrapper[4763]: I1205 13:08:32.799248 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1320eb18-a39a-4787-b069-51de6fc90985" containerName="extract-content"
Dec 05 13:08:32 crc kubenswrapper[4763]: E1205 13:08:32.799266 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1320eb18-a39a-4787-b069-51de6fc90985" containerName="extract-utilities"
Dec 05 13:08:32 crc kubenswrapper[4763]: I1205 13:08:32.799272 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1320eb18-a39a-4787-b069-51de6fc90985" containerName="extract-utilities"
Dec 05 13:08:32 crc kubenswrapper[4763]: E1205 13:08:32.799296 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1320eb18-a39a-4787-b069-51de6fc90985" containerName="registry-server"
Dec 05 13:08:32 crc kubenswrapper[4763]: I1205 13:08:32.799302 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1320eb18-a39a-4787-b069-51de6fc90985" containerName="registry-server"
Dec 05 13:08:32 crc kubenswrapper[4763]: I1205 13:08:32.799635 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e362e2b-ed3a-4ca6-a986-28fdc927c226" containerName="registry-server"
Dec 05 13:08:32 crc kubenswrapper[4763]: I1205 13:08:32.799653 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="1320eb18-a39a-4787-b069-51de6fc90985" containerName="registry-server"
Dec 05 13:08:32 crc kubenswrapper[4763]: I1205 13:08:32.805268 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fnw2l"
Dec 05 13:08:32 crc kubenswrapper[4763]: I1205 13:08:32.814213 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fnw2l"]
Dec 05 13:08:32 crc kubenswrapper[4763]: I1205 13:08:32.854218 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqqqp\" (UniqueName: \"kubernetes.io/projected/32cfacb9-bb1d-461a-8b07-69effe15268e-kube-api-access-sqqqp\") pod \"certified-operators-fnw2l\" (UID: \"32cfacb9-bb1d-461a-8b07-69effe15268e\") " pod="openshift-marketplace/certified-operators-fnw2l"
Dec 05 13:08:32 crc kubenswrapper[4763]: I1205 13:08:32.854350 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32cfacb9-bb1d-461a-8b07-69effe15268e-utilities\") pod \"certified-operators-fnw2l\" (UID: \"32cfacb9-bb1d-461a-8b07-69effe15268e\") " pod="openshift-marketplace/certified-operators-fnw2l"
Dec 05 13:08:32 crc kubenswrapper[4763]: I1205 13:08:32.854508 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32cfacb9-bb1d-461a-8b07-69effe15268e-catalog-content\") pod \"certified-operators-fnw2l\" (UID: \"32cfacb9-bb1d-461a-8b07-69effe15268e\") " pod="openshift-marketplace/certified-operators-fnw2l"
Dec 05 13:08:32 crc kubenswrapper[4763]: I1205 13:08:32.956571 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32cfacb9-bb1d-461a-8b07-69effe15268e-catalog-content\") pod \"certified-operators-fnw2l\" (UID: \"32cfacb9-bb1d-461a-8b07-69effe15268e\") " pod="openshift-marketplace/certified-operators-fnw2l"
Dec 05 13:08:32 crc kubenswrapper[4763]: I1205 13:08:32.957084 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqqqp\" (UniqueName: \"kubernetes.io/projected/32cfacb9-bb1d-461a-8b07-69effe15268e-kube-api-access-sqqqp\") pod \"certified-operators-fnw2l\" (UID: \"32cfacb9-bb1d-461a-8b07-69effe15268e\") " pod="openshift-marketplace/certified-operators-fnw2l"
Dec 05 13:08:32 crc kubenswrapper[4763]: I1205 13:08:32.957150 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32cfacb9-bb1d-461a-8b07-69effe15268e-utilities\") pod \"certified-operators-fnw2l\" (UID: \"32cfacb9-bb1d-461a-8b07-69effe15268e\") " pod="openshift-marketplace/certified-operators-fnw2l"
Dec 05 13:08:32 crc kubenswrapper[4763]: I1205 13:08:32.957227 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32cfacb9-bb1d-461a-8b07-69effe15268e-catalog-content\") pod \"certified-operators-fnw2l\" (UID: \"32cfacb9-bb1d-461a-8b07-69effe15268e\") " pod="openshift-marketplace/certified-operators-fnw2l"
Dec 05 13:08:32 crc kubenswrapper[4763]: I1205 13:08:32.957620 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32cfacb9-bb1d-461a-8b07-69effe15268e-utilities\") pod \"certified-operators-fnw2l\" (UID: \"32cfacb9-bb1d-461a-8b07-69effe15268e\") " pod="openshift-marketplace/certified-operators-fnw2l"
Dec 05 13:08:32 crc kubenswrapper[4763]: I1205 13:08:32.983611 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqqqp\" (UniqueName: \"kubernetes.io/projected/32cfacb9-bb1d-461a-8b07-69effe15268e-kube-api-access-sqqqp\") pod \"certified-operators-fnw2l\" (UID: \"32cfacb9-bb1d-461a-8b07-69effe15268e\") " pod="openshift-marketplace/certified-operators-fnw2l"
Dec 05 13:08:33 crc kubenswrapper[4763]: I1205 13:08:33.146599 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fnw2l"
Dec 05 13:08:33 crc kubenswrapper[4763]: I1205 13:08:33.695586 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fnw2l"]
Dec 05 13:08:34 crc kubenswrapper[4763]: I1205 13:08:34.445241 4763 generic.go:334] "Generic (PLEG): container finished" podID="32cfacb9-bb1d-461a-8b07-69effe15268e" containerID="1a073000b033c1f74ad39f914a8193e77ec4a4887ae6eba6289b40eae9f7f8c5" exitCode=0
Dec 05 13:08:34 crc kubenswrapper[4763]: I1205 13:08:34.445310 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fnw2l" event={"ID":"32cfacb9-bb1d-461a-8b07-69effe15268e","Type":"ContainerDied","Data":"1a073000b033c1f74ad39f914a8193e77ec4a4887ae6eba6289b40eae9f7f8c5"}
Dec 05 13:08:34 crc kubenswrapper[4763]: I1205 13:08:34.445610 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fnw2l" event={"ID":"32cfacb9-bb1d-461a-8b07-69effe15268e","Type":"ContainerStarted","Data":"8e73f82cd232dc8358368242d4518493949384a18e6df91499278fd3cb7ffd00"}
Dec 05 13:08:38 crc kubenswrapper[4763]: I1205 13:08:38.483376 4763 generic.go:334] "Generic (PLEG): container finished" podID="32cfacb9-bb1d-461a-8b07-69effe15268e" containerID="429cf3bae2098daf9dc061ce9e8d17c0b63ece48c9a82e189c7e00bf0525b89c" exitCode=0
Dec 05 13:08:38 crc kubenswrapper[4763]: I1205 13:08:38.483470 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fnw2l" event={"ID":"32cfacb9-bb1d-461a-8b07-69effe15268e","Type":"ContainerDied","Data":"429cf3bae2098daf9dc061ce9e8d17c0b63ece48c9a82e189c7e00bf0525b89c"}
Dec 05 13:08:39 crc kubenswrapper[4763]: I1205 13:08:39.503677 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fnw2l" event={"ID":"32cfacb9-bb1d-461a-8b07-69effe15268e","Type":"ContainerStarted","Data":"53dbbd017546034489f44f5776f77c264124464a4f242abc01f4a95300e84217"}
Dec 05 13:08:39 crc kubenswrapper[4763]: I1205 13:08:39.528837 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fnw2l" podStartSLOduration=2.9638018649999998 podStartE2EDuration="7.52881587s" podCreationTimestamp="2025-12-05 13:08:32 +0000 UTC" firstStartedPulling="2025-12-05 13:08:34.447227096 +0000 UTC m=+4798.939941819" lastFinishedPulling="2025-12-05 13:08:39.012241091 +0000 UTC m=+4803.504955824" observedRunningTime="2025-12-05 13:08:39.522895311 +0000 UTC m=+4804.015610034" watchObservedRunningTime="2025-12-05 13:08:39.52881587 +0000 UTC m=+4804.021530603"
Dec 05 13:08:43 crc kubenswrapper[4763]: I1205 13:08:43.147949 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fnw2l"
Dec 05 13:08:43 crc kubenswrapper[4763]: I1205 13:08:43.148672 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fnw2l"
Dec 05 13:08:43 crc kubenswrapper[4763]: I1205 13:08:43.217942 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fnw2l"
Dec 05 13:08:53 crc kubenswrapper[4763]: I1205 13:08:53.195896 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fnw2l"
Dec 05 13:08:53 crc kubenswrapper[4763]: I1205 13:08:53.246762 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fnw2l"]
Dec 05 13:08:53 crc kubenswrapper[4763]: I1205 13:08:53.631905 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fnw2l" podUID="32cfacb9-bb1d-461a-8b07-69effe15268e" containerName="registry-server" containerID="cri-o://53dbbd017546034489f44f5776f77c264124464a4f242abc01f4a95300e84217" gracePeriod=2
Dec 05 13:08:54 crc kubenswrapper[4763]: I1205 13:08:54.096186 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fnw2l"
Dec 05 13:08:54 crc kubenswrapper[4763]: I1205 13:08:54.231509 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32cfacb9-bb1d-461a-8b07-69effe15268e-utilities\") pod \"32cfacb9-bb1d-461a-8b07-69effe15268e\" (UID: \"32cfacb9-bb1d-461a-8b07-69effe15268e\") "
Dec 05 13:08:54 crc kubenswrapper[4763]: I1205 13:08:54.231653 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32cfacb9-bb1d-461a-8b07-69effe15268e-catalog-content\") pod \"32cfacb9-bb1d-461a-8b07-69effe15268e\" (UID: \"32cfacb9-bb1d-461a-8b07-69effe15268e\") "
Dec 05 13:08:54 crc kubenswrapper[4763]: I1205 13:08:54.231674 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqqqp\" (UniqueName: \"kubernetes.io/projected/32cfacb9-bb1d-461a-8b07-69effe15268e-kube-api-access-sqqqp\") pod \"32cfacb9-bb1d-461a-8b07-69effe15268e\" (UID: \"32cfacb9-bb1d-461a-8b07-69effe15268e\") "
Dec 05 13:08:54 crc kubenswrapper[4763]: I1205 13:08:54.232860 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32cfacb9-bb1d-461a-8b07-69effe15268e-utilities" (OuterVolumeSpecName: "utilities") pod "32cfacb9-bb1d-461a-8b07-69effe15268e" (UID: "32cfacb9-bb1d-461a-8b07-69effe15268e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 13:08:54 crc kubenswrapper[4763]: I1205 13:08:54.239807 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32cfacb9-bb1d-461a-8b07-69effe15268e-kube-api-access-sqqqp" (OuterVolumeSpecName: "kube-api-access-sqqqp") pod "32cfacb9-bb1d-461a-8b07-69effe15268e" (UID: "32cfacb9-bb1d-461a-8b07-69effe15268e"). InnerVolumeSpecName "kube-api-access-sqqqp". PluginName "kubernetes.io/projected", VolumeGidValue ""
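The "Generic (PLEG)" entries above come from a periodic relist: the kubelet snapshots container states from the runtime and turns state transitions into ContainerStarted/ContainerDied pod lifecycle events. A minimal diff of two such snapshots, with IDs shortened and states simplified for illustration:

package main

import "fmt"

func main() {
	// Previous and current relist snapshots: containerID -> state.
	old := map[string]string{"429cf3ba": "running"}
	cur := map[string]string{"429cf3ba": "exited", "53dbbd01": "running"}

	for id, state := range cur {
		if old[id] == state {
			continue // no transition, no event
		}
		switch state {
		case "running":
			fmt.Println("ContainerStarted", id)
		case "exited":
			fmt.Println("ContainerDied", id)
		}
	}
}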
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:08:54 crc kubenswrapper[4763]: I1205 13:08:54.334314 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32cfacb9-bb1d-461a-8b07-69effe15268e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 13:08:54 crc kubenswrapper[4763]: I1205 13:08:54.334370 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqqqp\" (UniqueName: \"kubernetes.io/projected/32cfacb9-bb1d-461a-8b07-69effe15268e-kube-api-access-sqqqp\") on node \"crc\" DevicePath \"\"" Dec 05 13:08:54 crc kubenswrapper[4763]: I1205 13:08:54.334384 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32cfacb9-bb1d-461a-8b07-69effe15268e-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 13:08:54 crc kubenswrapper[4763]: I1205 13:08:54.657771 4763 generic.go:334] "Generic (PLEG): container finished" podID="32cfacb9-bb1d-461a-8b07-69effe15268e" containerID="53dbbd017546034489f44f5776f77c264124464a4f242abc01f4a95300e84217" exitCode=0 Dec 05 13:08:54 crc kubenswrapper[4763]: I1205 13:08:54.657942 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fnw2l" event={"ID":"32cfacb9-bb1d-461a-8b07-69effe15268e","Type":"ContainerDied","Data":"53dbbd017546034489f44f5776f77c264124464a4f242abc01f4a95300e84217"} Dec 05 13:08:54 crc kubenswrapper[4763]: I1205 13:08:54.657947 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fnw2l" Dec 05 13:08:54 crc kubenswrapper[4763]: I1205 13:08:54.657976 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fnw2l" event={"ID":"32cfacb9-bb1d-461a-8b07-69effe15268e","Type":"ContainerDied","Data":"8e73f82cd232dc8358368242d4518493949384a18e6df91499278fd3cb7ffd00"} Dec 05 13:08:54 crc kubenswrapper[4763]: I1205 13:08:54.658012 4763 scope.go:117] "RemoveContainer" containerID="53dbbd017546034489f44f5776f77c264124464a4f242abc01f4a95300e84217" Dec 05 13:08:54 crc kubenswrapper[4763]: I1205 13:08:54.679549 4763 scope.go:117] "RemoveContainer" containerID="429cf3bae2098daf9dc061ce9e8d17c0b63ece48c9a82e189c7e00bf0525b89c" Dec 05 13:08:54 crc kubenswrapper[4763]: I1205 13:08:54.698732 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fnw2l"] Dec 05 13:08:54 crc kubenswrapper[4763]: I1205 13:08:54.709140 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fnw2l"] Dec 05 13:08:54 crc kubenswrapper[4763]: I1205 13:08:54.722463 4763 scope.go:117] "RemoveContainer" containerID="1a073000b033c1f74ad39f914a8193e77ec4a4887ae6eba6289b40eae9f7f8c5" Dec 05 13:08:54 crc kubenswrapper[4763]: I1205 13:08:54.756444 4763 scope.go:117] "RemoveContainer" containerID="53dbbd017546034489f44f5776f77c264124464a4f242abc01f4a95300e84217" Dec 05 13:08:54 crc kubenswrapper[4763]: E1205 13:08:54.757194 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53dbbd017546034489f44f5776f77c264124464a4f242abc01f4a95300e84217\": container with ID starting with 53dbbd017546034489f44f5776f77c264124464a4f242abc01f4a95300e84217 not found: ID does not exist" containerID="53dbbd017546034489f44f5776f77c264124464a4f242abc01f4a95300e84217" Dec 05 13:08:54 crc kubenswrapper[4763]: I1205 13:08:54.757244 
4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53dbbd017546034489f44f5776f77c264124464a4f242abc01f4a95300e84217"} err="failed to get container status \"53dbbd017546034489f44f5776f77c264124464a4f242abc01f4a95300e84217\": rpc error: code = NotFound desc = could not find container \"53dbbd017546034489f44f5776f77c264124464a4f242abc01f4a95300e84217\": container with ID starting with 53dbbd017546034489f44f5776f77c264124464a4f242abc01f4a95300e84217 not found: ID does not exist" Dec 05 13:08:54 crc kubenswrapper[4763]: I1205 13:08:54.757276 4763 scope.go:117] "RemoveContainer" containerID="429cf3bae2098daf9dc061ce9e8d17c0b63ece48c9a82e189c7e00bf0525b89c" Dec 05 13:08:54 crc kubenswrapper[4763]: E1205 13:08:54.758028 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"429cf3bae2098daf9dc061ce9e8d17c0b63ece48c9a82e189c7e00bf0525b89c\": container with ID starting with 429cf3bae2098daf9dc061ce9e8d17c0b63ece48c9a82e189c7e00bf0525b89c not found: ID does not exist" containerID="429cf3bae2098daf9dc061ce9e8d17c0b63ece48c9a82e189c7e00bf0525b89c" Dec 05 13:08:54 crc kubenswrapper[4763]: I1205 13:08:54.758068 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"429cf3bae2098daf9dc061ce9e8d17c0b63ece48c9a82e189c7e00bf0525b89c"} err="failed to get container status \"429cf3bae2098daf9dc061ce9e8d17c0b63ece48c9a82e189c7e00bf0525b89c\": rpc error: code = NotFound desc = could not find container \"429cf3bae2098daf9dc061ce9e8d17c0b63ece48c9a82e189c7e00bf0525b89c\": container with ID starting with 429cf3bae2098daf9dc061ce9e8d17c0b63ece48c9a82e189c7e00bf0525b89c not found: ID does not exist" Dec 05 13:08:54 crc kubenswrapper[4763]: I1205 13:08:54.758091 4763 scope.go:117] "RemoveContainer" containerID="1a073000b033c1f74ad39f914a8193e77ec4a4887ae6eba6289b40eae9f7f8c5" Dec 05 13:08:54 crc kubenswrapper[4763]: E1205 13:08:54.759430 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a073000b033c1f74ad39f914a8193e77ec4a4887ae6eba6289b40eae9f7f8c5\": container with ID starting with 1a073000b033c1f74ad39f914a8193e77ec4a4887ae6eba6289b40eae9f7f8c5 not found: ID does not exist" containerID="1a073000b033c1f74ad39f914a8193e77ec4a4887ae6eba6289b40eae9f7f8c5" Dec 05 13:08:54 crc kubenswrapper[4763]: I1205 13:08:54.759459 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a073000b033c1f74ad39f914a8193e77ec4a4887ae6eba6289b40eae9f7f8c5"} err="failed to get container status \"1a073000b033c1f74ad39f914a8193e77ec4a4887ae6eba6289b40eae9f7f8c5\": rpc error: code = NotFound desc = could not find container \"1a073000b033c1f74ad39f914a8193e77ec4a4887ae6eba6289b40eae9f7f8c5\": container with ID starting with 1a073000b033c1f74ad39f914a8193e77ec4a4887ae6eba6289b40eae9f7f8c5 not found: ID does not exist" Dec 05 13:08:55 crc kubenswrapper[4763]: I1205 13:08:55.796454 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32cfacb9-bb1d-461a-8b07-69effe15268e" path="/var/lib/kubelet/pods/32cfacb9-bb1d-461a-8b07-69effe15268e/volumes" Dec 05 13:09:07 crc kubenswrapper[4763]: I1205 13:09:07.544382 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 13:09:07 crc kubenswrapper[4763]: I1205 13:09:07.545272 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 13:09:37 crc kubenswrapper[4763]: I1205 13:09:37.546376 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 13:09:37 crc kubenswrapper[4763]: I1205 13:09:37.548211 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 13:10:07 crc kubenswrapper[4763]: I1205 13:10:07.544313 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 13:10:07 crc kubenswrapper[4763]: I1205 13:10:07.545042 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 13:10:07 crc kubenswrapper[4763]: I1205 13:10:07.545111 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" Dec 05 13:10:07 crc kubenswrapper[4763]: I1205 13:10:07.546094 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6cb111387aaca0b2a39b308e8c28b211d65ff042187e0815f4be96631e9ef1e9"} pod="openshift-machine-config-operator/machine-config-daemon-xpgln" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 13:10:07 crc kubenswrapper[4763]: I1205 13:10:07.546157 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" containerID="cri-o://6cb111387aaca0b2a39b308e8c28b211d65ff042187e0815f4be96631e9ef1e9" gracePeriod=600 Dec 05 13:10:08 crc kubenswrapper[4763]: I1205 13:10:08.385800 4763 generic.go:334] "Generic (PLEG): container finished" podID="96338136-6831-49d0-9eb9-77d1205c6afb" containerID="6cb111387aaca0b2a39b308e8c28b211d65ff042187e0815f4be96631e9ef1e9" exitCode=0 Dec 05 13:10:08 crc kubenswrapper[4763]: I1205 13:10:08.385887 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" 
event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerDied","Data":"6cb111387aaca0b2a39b308e8c28b211d65ff042187e0815f4be96631e9ef1e9"} Dec 05 13:10:08 crc kubenswrapper[4763]: I1205 13:10:08.386336 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerStarted","Data":"667fb7d9fd133388c08f9883162f086ec3353ccdc0cda16fb2999b809ed9835f"} Dec 05 13:10:08 crc kubenswrapper[4763]: I1205 13:10:08.386357 4763 scope.go:117] "RemoveContainer" containerID="62e35b6bccff0ea60092714c9fbbca340c8f353206db0f49a2b69c7f3a1925d4" Dec 05 13:12:07 crc kubenswrapper[4763]: I1205 13:12:07.543791 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 13:12:07 crc kubenswrapper[4763]: I1205 13:12:07.544367 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 13:12:37 crc kubenswrapper[4763]: I1205 13:12:37.543852 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 13:12:37 crc kubenswrapper[4763]: I1205 13:12:37.544705 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 13:13:07 crc kubenswrapper[4763]: I1205 13:13:07.544491 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 13:13:07 crc kubenswrapper[4763]: I1205 13:13:07.545137 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 13:13:07 crc kubenswrapper[4763]: I1205 13:13:07.545190 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" Dec 05 13:13:07 crc kubenswrapper[4763]: I1205 13:13:07.546051 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"667fb7d9fd133388c08f9883162f086ec3353ccdc0cda16fb2999b809ed9835f"} pod="openshift-machine-config-operator/machine-config-daemon-xpgln" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 
Dec 05 13:13:07 crc kubenswrapper[4763]: I1205 13:13:07.546119 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" containerID="cri-o://667fb7d9fd133388c08f9883162f086ec3353ccdc0cda16fb2999b809ed9835f" gracePeriod=600
Dec 05 13:13:07 crc kubenswrapper[4763]: E1205 13:13:07.674008 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb"
Dec 05 13:13:08 crc kubenswrapper[4763]: I1205 13:13:08.102840 4763 generic.go:334] "Generic (PLEG): container finished" podID="96338136-6831-49d0-9eb9-77d1205c6afb" containerID="667fb7d9fd133388c08f9883162f086ec3353ccdc0cda16fb2999b809ed9835f" exitCode=0
Dec 05 13:13:08 crc kubenswrapper[4763]: I1205 13:13:08.102924 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerDied","Data":"667fb7d9fd133388c08f9883162f086ec3353ccdc0cda16fb2999b809ed9835f"}
Dec 05 13:13:08 crc kubenswrapper[4763]: I1205 13:13:08.102984 4763 scope.go:117] "RemoveContainer" containerID="6cb111387aaca0b2a39b308e8c28b211d65ff042187e0815f4be96631e9ef1e9"
Dec 05 13:13:08 crc kubenswrapper[4763]: I1205 13:13:08.103799 4763 scope.go:117] "RemoveContainer" containerID="667fb7d9fd133388c08f9883162f086ec3353ccdc0cda16fb2999b809ed9835f"
Dec 05 13:13:08 crc kubenswrapper[4763]: E1205 13:13:08.104213 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb"
Dec 05 13:13:22 crc kubenswrapper[4763]: I1205 13:13:19.784417 4763 scope.go:117] "RemoveContainer" containerID="667fb7d9fd133388c08f9883162f086ec3353ccdc0cda16fb2999b809ed9835f"
Dec 05 13:13:22 crc kubenswrapper[4763]: E1205 13:13:19.785150 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb"
Dec 05 13:13:34 crc kubenswrapper[4763]: I1205 13:13:34.784362 4763 scope.go:117] "RemoveContainer" containerID="667fb7d9fd133388c08f9883162f086ec3353ccdc0cda16fb2999b809ed9835f"
Dec 05 13:13:34 crc kubenswrapper[4763]: E1205 13:13:34.785216 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb"
Dec 05 13:13:46 crc kubenswrapper[4763]: I1205 13:13:46.783653 4763 scope.go:117] "RemoveContainer" containerID="667fb7d9fd133388c08f9883162f086ec3353ccdc0cda16fb2999b809ed9835f"
Dec 05 13:13:46 crc kubenswrapper[4763]: E1205 13:13:46.784600 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb"
Dec 05 13:13:57 crc kubenswrapper[4763]: I1205 13:13:57.785020 4763 scope.go:117] "RemoveContainer" containerID="667fb7d9fd133388c08f9883162f086ec3353ccdc0cda16fb2999b809ed9835f"
Dec 05 13:13:57 crc kubenswrapper[4763]: E1205 13:13:57.786040 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb"
Dec 05 13:14:10 crc kubenswrapper[4763]: I1205 13:14:10.784057 4763 scope.go:117] "RemoveContainer" containerID="667fb7d9fd133388c08f9883162f086ec3353ccdc0cda16fb2999b809ed9835f"
Dec 05 13:14:10 crc kubenswrapper[4763]: E1205 13:14:10.784917 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb"
Dec 05 13:14:21 crc kubenswrapper[4763]: I1205 13:14:21.785050 4763 scope.go:117] "RemoveContainer" containerID="667fb7d9fd133388c08f9883162f086ec3353ccdc0cda16fb2999b809ed9835f"
Dec 05 13:14:21 crc kubenswrapper[4763]: E1205 13:14:21.786041 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb"
Dec 05 13:14:36 crc kubenswrapper[4763]: I1205 13:14:36.784582 4763 scope.go:117] "RemoveContainer" containerID="667fb7d9fd133388c08f9883162f086ec3353ccdc0cda16fb2999b809ed9835f"
Dec 05 13:14:36 crc kubenswrapper[4763]: E1205 13:14:36.785988 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb"
scope.go:117] "RemoveContainer" containerID="667fb7d9fd133388c08f9883162f086ec3353ccdc0cda16fb2999b809ed9835f" Dec 05 13:14:47 crc kubenswrapper[4763]: E1205 13:14:47.784662 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:14:59 crc kubenswrapper[4763]: I1205 13:14:59.784609 4763 scope.go:117] "RemoveContainer" containerID="667fb7d9fd133388c08f9883162f086ec3353ccdc0cda16fb2999b809ed9835f" Dec 05 13:14:59 crc kubenswrapper[4763]: E1205 13:14:59.785690 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:15:00 crc kubenswrapper[4763]: I1205 13:15:00.214108 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415675-slc72"] Dec 05 13:15:00 crc kubenswrapper[4763]: E1205 13:15:00.214597 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32cfacb9-bb1d-461a-8b07-69effe15268e" containerName="extract-content" Dec 05 13:15:00 crc kubenswrapper[4763]: I1205 13:15:00.214616 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="32cfacb9-bb1d-461a-8b07-69effe15268e" containerName="extract-content" Dec 05 13:15:00 crc kubenswrapper[4763]: E1205 13:15:00.214628 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32cfacb9-bb1d-461a-8b07-69effe15268e" containerName="extract-utilities" Dec 05 13:15:00 crc kubenswrapper[4763]: I1205 13:15:00.214635 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="32cfacb9-bb1d-461a-8b07-69effe15268e" containerName="extract-utilities" Dec 05 13:15:00 crc kubenswrapper[4763]: E1205 13:15:00.214645 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32cfacb9-bb1d-461a-8b07-69effe15268e" containerName="registry-server" Dec 05 13:15:00 crc kubenswrapper[4763]: I1205 13:15:00.214651 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="32cfacb9-bb1d-461a-8b07-69effe15268e" containerName="registry-server" Dec 05 13:15:00 crc kubenswrapper[4763]: I1205 13:15:00.214920 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="32cfacb9-bb1d-461a-8b07-69effe15268e" containerName="registry-server" Dec 05 13:15:00 crc kubenswrapper[4763]: I1205 13:15:00.215607 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415675-slc72" Dec 05 13:15:00 crc kubenswrapper[4763]: I1205 13:15:00.218304 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 13:15:00 crc kubenswrapper[4763]: I1205 13:15:00.219177 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 13:15:00 crc kubenswrapper[4763]: I1205 13:15:00.226194 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415675-slc72"] Dec 05 13:15:00 crc kubenswrapper[4763]: I1205 13:15:00.393050 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzc9z\" (UniqueName: \"kubernetes.io/projected/11294be2-8577-4016-8754-baac9ce43eac-kube-api-access-pzc9z\") pod \"collect-profiles-29415675-slc72\" (UID: \"11294be2-8577-4016-8754-baac9ce43eac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415675-slc72" Dec 05 13:15:00 crc kubenswrapper[4763]: I1205 13:15:00.393122 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11294be2-8577-4016-8754-baac9ce43eac-config-volume\") pod \"collect-profiles-29415675-slc72\" (UID: \"11294be2-8577-4016-8754-baac9ce43eac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415675-slc72" Dec 05 13:15:00 crc kubenswrapper[4763]: I1205 13:15:00.393207 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11294be2-8577-4016-8754-baac9ce43eac-secret-volume\") pod \"collect-profiles-29415675-slc72\" (UID: \"11294be2-8577-4016-8754-baac9ce43eac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415675-slc72" Dec 05 13:15:00 crc kubenswrapper[4763]: I1205 13:15:00.494639 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzc9z\" (UniqueName: \"kubernetes.io/projected/11294be2-8577-4016-8754-baac9ce43eac-kube-api-access-pzc9z\") pod \"collect-profiles-29415675-slc72\" (UID: \"11294be2-8577-4016-8754-baac9ce43eac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415675-slc72" Dec 05 13:15:00 crc kubenswrapper[4763]: I1205 13:15:00.494718 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11294be2-8577-4016-8754-baac9ce43eac-config-volume\") pod \"collect-profiles-29415675-slc72\" (UID: \"11294be2-8577-4016-8754-baac9ce43eac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415675-slc72" Dec 05 13:15:00 crc kubenswrapper[4763]: I1205 13:15:00.494826 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11294be2-8577-4016-8754-baac9ce43eac-secret-volume\") pod \"collect-profiles-29415675-slc72\" (UID: \"11294be2-8577-4016-8754-baac9ce43eac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415675-slc72" Dec 05 13:15:00 crc kubenswrapper[4763]: I1205 13:15:00.495944 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11294be2-8577-4016-8754-baac9ce43eac-config-volume\") pod 
\"collect-profiles-29415675-slc72\" (UID: \"11294be2-8577-4016-8754-baac9ce43eac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415675-slc72" Dec 05 13:15:00 crc kubenswrapper[4763]: I1205 13:15:00.507399 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11294be2-8577-4016-8754-baac9ce43eac-secret-volume\") pod \"collect-profiles-29415675-slc72\" (UID: \"11294be2-8577-4016-8754-baac9ce43eac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415675-slc72" Dec 05 13:15:00 crc kubenswrapper[4763]: I1205 13:15:00.513768 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzc9z\" (UniqueName: \"kubernetes.io/projected/11294be2-8577-4016-8754-baac9ce43eac-kube-api-access-pzc9z\") pod \"collect-profiles-29415675-slc72\" (UID: \"11294be2-8577-4016-8754-baac9ce43eac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415675-slc72" Dec 05 13:15:00 crc kubenswrapper[4763]: I1205 13:15:00.549609 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415675-slc72" Dec 05 13:15:01 crc kubenswrapper[4763]: I1205 13:15:01.001588 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415675-slc72"] Dec 05 13:15:01 crc kubenswrapper[4763]: I1205 13:15:01.262084 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415675-slc72" event={"ID":"11294be2-8577-4016-8754-baac9ce43eac","Type":"ContainerStarted","Data":"9cf8e28d33f1051b5f40a80f2065359e24ae63d36c48e8dcd0671482cac2f667"} Dec 05 13:15:01 crc kubenswrapper[4763]: I1205 13:15:01.262125 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415675-slc72" event={"ID":"11294be2-8577-4016-8754-baac9ce43eac","Type":"ContainerStarted","Data":"5c3f6f465c7deaa2000f1f6f6b44bf4f078d78c7bdc1b20a6336a2663eb91d56"} Dec 05 13:15:02 crc kubenswrapper[4763]: I1205 13:15:02.283836 4763 generic.go:334] "Generic (PLEG): container finished" podID="11294be2-8577-4016-8754-baac9ce43eac" containerID="9cf8e28d33f1051b5f40a80f2065359e24ae63d36c48e8dcd0671482cac2f667" exitCode=0 Dec 05 13:15:02 crc kubenswrapper[4763]: I1205 13:15:02.283966 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415675-slc72" event={"ID":"11294be2-8577-4016-8754-baac9ce43eac","Type":"ContainerDied","Data":"9cf8e28d33f1051b5f40a80f2065359e24ae63d36c48e8dcd0671482cac2f667"} Dec 05 13:15:02 crc kubenswrapper[4763]: I1205 13:15:02.493967 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-56mb7"] Dec 05 13:15:02 crc kubenswrapper[4763]: I1205 13:15:02.495863 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-56mb7" Dec 05 13:15:02 crc kubenswrapper[4763]: I1205 13:15:02.510114 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-56mb7"] Dec 05 13:15:02 crc kubenswrapper[4763]: I1205 13:15:02.638432 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqg27\" (UniqueName: \"kubernetes.io/projected/39153564-9da5-4b52-881c-db524c3d3634-kube-api-access-hqg27\") pod \"redhat-operators-56mb7\" (UID: \"39153564-9da5-4b52-881c-db524c3d3634\") " pod="openshift-marketplace/redhat-operators-56mb7" Dec 05 13:15:02 crc kubenswrapper[4763]: I1205 13:15:02.638534 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39153564-9da5-4b52-881c-db524c3d3634-utilities\") pod \"redhat-operators-56mb7\" (UID: \"39153564-9da5-4b52-881c-db524c3d3634\") " pod="openshift-marketplace/redhat-operators-56mb7" Dec 05 13:15:02 crc kubenswrapper[4763]: I1205 13:15:02.638570 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39153564-9da5-4b52-881c-db524c3d3634-catalog-content\") pod \"redhat-operators-56mb7\" (UID: \"39153564-9da5-4b52-881c-db524c3d3634\") " pod="openshift-marketplace/redhat-operators-56mb7" Dec 05 13:15:02 crc kubenswrapper[4763]: I1205 13:15:02.740187 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39153564-9da5-4b52-881c-db524c3d3634-utilities\") pod \"redhat-operators-56mb7\" (UID: \"39153564-9da5-4b52-881c-db524c3d3634\") " pod="openshift-marketplace/redhat-operators-56mb7" Dec 05 13:15:02 crc kubenswrapper[4763]: I1205 13:15:02.740252 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39153564-9da5-4b52-881c-db524c3d3634-catalog-content\") pod \"redhat-operators-56mb7\" (UID: \"39153564-9da5-4b52-881c-db524c3d3634\") " pod="openshift-marketplace/redhat-operators-56mb7" Dec 05 13:15:02 crc kubenswrapper[4763]: I1205 13:15:02.740428 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqg27\" (UniqueName: \"kubernetes.io/projected/39153564-9da5-4b52-881c-db524c3d3634-kube-api-access-hqg27\") pod \"redhat-operators-56mb7\" (UID: \"39153564-9da5-4b52-881c-db524c3d3634\") " pod="openshift-marketplace/redhat-operators-56mb7" Dec 05 13:15:02 crc kubenswrapper[4763]: I1205 13:15:02.740697 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39153564-9da5-4b52-881c-db524c3d3634-utilities\") pod \"redhat-operators-56mb7\" (UID: \"39153564-9da5-4b52-881c-db524c3d3634\") " pod="openshift-marketplace/redhat-operators-56mb7" Dec 05 13:15:02 crc kubenswrapper[4763]: I1205 13:15:02.740997 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39153564-9da5-4b52-881c-db524c3d3634-catalog-content\") pod \"redhat-operators-56mb7\" (UID: \"39153564-9da5-4b52-881c-db524c3d3634\") " pod="openshift-marketplace/redhat-operators-56mb7" Dec 05 13:15:02 crc kubenswrapper[4763]: I1205 13:15:02.761751 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hqg27\" (UniqueName: \"kubernetes.io/projected/39153564-9da5-4b52-881c-db524c3d3634-kube-api-access-hqg27\") pod \"redhat-operators-56mb7\" (UID: \"39153564-9da5-4b52-881c-db524c3d3634\") " pod="openshift-marketplace/redhat-operators-56mb7" Dec 05 13:15:02 crc kubenswrapper[4763]: I1205 13:15:02.867250 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-56mb7" Dec 05 13:15:03 crc kubenswrapper[4763]: I1205 13:15:03.325053 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-56mb7"] Dec 05 13:15:03 crc kubenswrapper[4763]: W1205 13:15:03.332290 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39153564_9da5_4b52_881c_db524c3d3634.slice/crio-691d446f6b498005f52eebe08d6ee5eadf397b8326444b6213f6a69239c01a5d WatchSource:0}: Error finding container 691d446f6b498005f52eebe08d6ee5eadf397b8326444b6213f6a69239c01a5d: Status 404 returned error can't find the container with id 691d446f6b498005f52eebe08d6ee5eadf397b8326444b6213f6a69239c01a5d Dec 05 13:15:03 crc kubenswrapper[4763]: I1205 13:15:03.629241 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415675-slc72" Dec 05 13:15:03 crc kubenswrapper[4763]: I1205 13:15:03.656615 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11294be2-8577-4016-8754-baac9ce43eac-config-volume\") pod \"11294be2-8577-4016-8754-baac9ce43eac\" (UID: \"11294be2-8577-4016-8754-baac9ce43eac\") " Dec 05 13:15:03 crc kubenswrapper[4763]: I1205 13:15:03.656695 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11294be2-8577-4016-8754-baac9ce43eac-secret-volume\") pod \"11294be2-8577-4016-8754-baac9ce43eac\" (UID: \"11294be2-8577-4016-8754-baac9ce43eac\") " Dec 05 13:15:03 crc kubenswrapper[4763]: I1205 13:15:03.656853 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzc9z\" (UniqueName: \"kubernetes.io/projected/11294be2-8577-4016-8754-baac9ce43eac-kube-api-access-pzc9z\") pod \"11294be2-8577-4016-8754-baac9ce43eac\" (UID: \"11294be2-8577-4016-8754-baac9ce43eac\") " Dec 05 13:15:03 crc kubenswrapper[4763]: I1205 13:15:03.657483 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11294be2-8577-4016-8754-baac9ce43eac-config-volume" (OuterVolumeSpecName: "config-volume") pod "11294be2-8577-4016-8754-baac9ce43eac" (UID: "11294be2-8577-4016-8754-baac9ce43eac"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:15:03 crc kubenswrapper[4763]: I1205 13:15:03.658147 4763 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11294be2-8577-4016-8754-baac9ce43eac-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 13:15:03 crc kubenswrapper[4763]: I1205 13:15:03.663517 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11294be2-8577-4016-8754-baac9ce43eac-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "11294be2-8577-4016-8754-baac9ce43eac" (UID: "11294be2-8577-4016-8754-baac9ce43eac"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:15:03 crc kubenswrapper[4763]: I1205 13:15:03.664025 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11294be2-8577-4016-8754-baac9ce43eac-kube-api-access-pzc9z" (OuterVolumeSpecName: "kube-api-access-pzc9z") pod "11294be2-8577-4016-8754-baac9ce43eac" (UID: "11294be2-8577-4016-8754-baac9ce43eac"). InnerVolumeSpecName "kube-api-access-pzc9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:15:03 crc kubenswrapper[4763]: I1205 13:15:03.759486 4763 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11294be2-8577-4016-8754-baac9ce43eac-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 13:15:03 crc kubenswrapper[4763]: I1205 13:15:03.759512 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzc9z\" (UniqueName: \"kubernetes.io/projected/11294be2-8577-4016-8754-baac9ce43eac-kube-api-access-pzc9z\") on node \"crc\" DevicePath \"\"" Dec 05 13:15:04 crc kubenswrapper[4763]: I1205 13:15:04.309480 4763 generic.go:334] "Generic (PLEG): container finished" podID="39153564-9da5-4b52-881c-db524c3d3634" containerID="b9588acffb5090cafd9b803cd94f76b6c9ebae5ca2c11afe9fbfb7da0271fdec" exitCode=0 Dec 05 13:15:04 crc kubenswrapper[4763]: I1205 13:15:04.309900 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56mb7" event={"ID":"39153564-9da5-4b52-881c-db524c3d3634","Type":"ContainerDied","Data":"b9588acffb5090cafd9b803cd94f76b6c9ebae5ca2c11afe9fbfb7da0271fdec"} Dec 05 13:15:04 crc kubenswrapper[4763]: I1205 13:15:04.309932 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56mb7" event={"ID":"39153564-9da5-4b52-881c-db524c3d3634","Type":"ContainerStarted","Data":"691d446f6b498005f52eebe08d6ee5eadf397b8326444b6213f6a69239c01a5d"} Dec 05 13:15:04 crc kubenswrapper[4763]: I1205 13:15:04.313258 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 13:15:04 crc kubenswrapper[4763]: I1205 13:15:04.317320 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415675-slc72" event={"ID":"11294be2-8577-4016-8754-baac9ce43eac","Type":"ContainerDied","Data":"5c3f6f465c7deaa2000f1f6f6b44bf4f078d78c7bdc1b20a6336a2663eb91d56"} Dec 05 13:15:04 crc kubenswrapper[4763]: I1205 13:15:04.317362 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c3f6f465c7deaa2000f1f6f6b44bf4f078d78c7bdc1b20a6336a2663eb91d56" Dec 05 13:15:04 crc kubenswrapper[4763]: I1205 13:15:04.317424 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415675-slc72" Dec 05 13:15:04 crc kubenswrapper[4763]: I1205 13:15:04.374560 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415630-6mnpc"] Dec 05 13:15:04 crc kubenswrapper[4763]: I1205 13:15:04.385460 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415630-6mnpc"] Dec 05 13:15:05 crc kubenswrapper[4763]: I1205 13:15:05.798362 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15cd560d-1934-43d9-b3dd-5a1f16b0d880" path="/var/lib/kubelet/pods/15cd560d-1934-43d9-b3dd-5a1f16b0d880/volumes" Dec 05 13:15:06 crc kubenswrapper[4763]: I1205 13:15:06.336130 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56mb7" event={"ID":"39153564-9da5-4b52-881c-db524c3d3634","Type":"ContainerStarted","Data":"ee1088e02d4a1d74c2dcc4931949c56f36ab43bf8fee7fca17cd5f297c2a1669"} Dec 05 13:15:09 crc kubenswrapper[4763]: I1205 13:15:09.364897 4763 generic.go:334] "Generic (PLEG): container finished" podID="39153564-9da5-4b52-881c-db524c3d3634" containerID="ee1088e02d4a1d74c2dcc4931949c56f36ab43bf8fee7fca17cd5f297c2a1669" exitCode=0 Dec 05 13:15:09 crc kubenswrapper[4763]: I1205 13:15:09.364989 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56mb7" event={"ID":"39153564-9da5-4b52-881c-db524c3d3634","Type":"ContainerDied","Data":"ee1088e02d4a1d74c2dcc4931949c56f36ab43bf8fee7fca17cd5f297c2a1669"} Dec 05 13:15:11 crc kubenswrapper[4763]: I1205 13:15:11.387989 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56mb7" event={"ID":"39153564-9da5-4b52-881c-db524c3d3634","Type":"ContainerStarted","Data":"74574d804066d1e27b2625ef885cce94b2afc2caa4b97986e7fac4138fbcf5ad"} Dec 05 13:15:11 crc kubenswrapper[4763]: I1205 13:15:11.409881 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-56mb7" podStartSLOduration=3.414610768 podStartE2EDuration="9.409862067s" podCreationTimestamp="2025-12-05 13:15:02 +0000 UTC" firstStartedPulling="2025-12-05 13:15:04.312905205 +0000 UTC m=+5188.805619928" lastFinishedPulling="2025-12-05 13:15:10.308156504 +0000 UTC m=+5194.800871227" observedRunningTime="2025-12-05 13:15:11.406515478 +0000 UTC m=+5195.899230201" watchObservedRunningTime="2025-12-05 13:15:11.409862067 +0000 UTC m=+5195.902576790" Dec 05 13:15:12 crc kubenswrapper[4763]: I1205 13:15:12.783964 4763 scope.go:117] "RemoveContainer" containerID="667fb7d9fd133388c08f9883162f086ec3353ccdc0cda16fb2999b809ed9835f" Dec 05 13:15:12 crc kubenswrapper[4763]: E1205 13:15:12.784734 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:15:12 crc kubenswrapper[4763]: I1205 13:15:12.867476 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-56mb7" Dec 05 13:15:12 crc kubenswrapper[4763]: I1205 13:15:12.867931 4763 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-56mb7" Dec 05 13:15:13 crc kubenswrapper[4763]: I1205 13:15:13.915195 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-56mb7" podUID="39153564-9da5-4b52-881c-db524c3d3634" containerName="registry-server" probeResult="failure" output=< Dec 05 13:15:13 crc kubenswrapper[4763]: timeout: failed to connect service ":50051" within 1s Dec 05 13:15:13 crc kubenswrapper[4763]: > Dec 05 13:15:22 crc kubenswrapper[4763]: I1205 13:15:22.924036 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-56mb7" Dec 05 13:15:22 crc kubenswrapper[4763]: I1205 13:15:22.981909 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-56mb7" Dec 05 13:15:23 crc kubenswrapper[4763]: I1205 13:15:23.170935 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-56mb7"] Dec 05 13:15:23 crc kubenswrapper[4763]: I1205 13:15:23.784514 4763 scope.go:117] "RemoveContainer" containerID="667fb7d9fd133388c08f9883162f086ec3353ccdc0cda16fb2999b809ed9835f" Dec 05 13:15:23 crc kubenswrapper[4763]: E1205 13:15:23.785511 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:15:24 crc kubenswrapper[4763]: I1205 13:15:24.508022 4763 generic.go:334] "Generic (PLEG): container finished" podID="295e994b-9be5-4486-beb7-6be00576c5c3" containerID="800f0f10e1abe21dc6010877c8872277d05b9280c81ed98f5009c55c4f922ab7" exitCode=0 Dec 05 13:15:24 crc kubenswrapper[4763]: I1205 13:15:24.508153 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"295e994b-9be5-4486-beb7-6be00576c5c3","Type":"ContainerDied","Data":"800f0f10e1abe21dc6010877c8872277d05b9280c81ed98f5009c55c4f922ab7"} Dec 05 13:15:24 crc kubenswrapper[4763]: I1205 13:15:24.508256 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-56mb7" podUID="39153564-9da5-4b52-881c-db524c3d3634" containerName="registry-server" containerID="cri-o://74574d804066d1e27b2625ef885cce94b2afc2caa4b97986e7fac4138fbcf5ad" gracePeriod=2 Dec 05 13:15:25 crc kubenswrapper[4763]: I1205 13:15:25.020061 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-56mb7" Dec 05 13:15:25 crc kubenswrapper[4763]: I1205 13:15:25.188235 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqg27\" (UniqueName: \"kubernetes.io/projected/39153564-9da5-4b52-881c-db524c3d3634-kube-api-access-hqg27\") pod \"39153564-9da5-4b52-881c-db524c3d3634\" (UID: \"39153564-9da5-4b52-881c-db524c3d3634\") " Dec 05 13:15:25 crc kubenswrapper[4763]: I1205 13:15:25.188308 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39153564-9da5-4b52-881c-db524c3d3634-catalog-content\") pod \"39153564-9da5-4b52-881c-db524c3d3634\" (UID: \"39153564-9da5-4b52-881c-db524c3d3634\") " Dec 05 13:15:25 crc kubenswrapper[4763]: I1205 13:15:25.188579 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39153564-9da5-4b52-881c-db524c3d3634-utilities\") pod \"39153564-9da5-4b52-881c-db524c3d3634\" (UID: \"39153564-9da5-4b52-881c-db524c3d3634\") " Dec 05 13:15:25 crc kubenswrapper[4763]: I1205 13:15:25.189510 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39153564-9da5-4b52-881c-db524c3d3634-utilities" (OuterVolumeSpecName: "utilities") pod "39153564-9da5-4b52-881c-db524c3d3634" (UID: "39153564-9da5-4b52-881c-db524c3d3634"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:15:25 crc kubenswrapper[4763]: I1205 13:15:25.195014 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39153564-9da5-4b52-881c-db524c3d3634-kube-api-access-hqg27" (OuterVolumeSpecName: "kube-api-access-hqg27") pod "39153564-9da5-4b52-881c-db524c3d3634" (UID: "39153564-9da5-4b52-881c-db524c3d3634"). InnerVolumeSpecName "kube-api-access-hqg27". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:15:25 crc kubenswrapper[4763]: I1205 13:15:25.291349 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39153564-9da5-4b52-881c-db524c3d3634-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 13:15:25 crc kubenswrapper[4763]: I1205 13:15:25.291382 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqg27\" (UniqueName: \"kubernetes.io/projected/39153564-9da5-4b52-881c-db524c3d3634-kube-api-access-hqg27\") on node \"crc\" DevicePath \"\"" Dec 05 13:15:25 crc kubenswrapper[4763]: I1205 13:15:25.304732 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39153564-9da5-4b52-881c-db524c3d3634-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "39153564-9da5-4b52-881c-db524c3d3634" (UID: "39153564-9da5-4b52-881c-db524c3d3634"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:15:25 crc kubenswrapper[4763]: I1205 13:15:25.392617 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39153564-9da5-4b52-881c-db524c3d3634-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 13:15:25 crc kubenswrapper[4763]: I1205 13:15:25.521815 4763 generic.go:334] "Generic (PLEG): container finished" podID="39153564-9da5-4b52-881c-db524c3d3634" containerID="74574d804066d1e27b2625ef885cce94b2afc2caa4b97986e7fac4138fbcf5ad" exitCode=0 Dec 05 13:15:25 crc kubenswrapper[4763]: I1205 13:15:25.522016 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56mb7" event={"ID":"39153564-9da5-4b52-881c-db524c3d3634","Type":"ContainerDied","Data":"74574d804066d1e27b2625ef885cce94b2afc2caa4b97986e7fac4138fbcf5ad"} Dec 05 13:15:25 crc kubenswrapper[4763]: I1205 13:15:25.522084 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56mb7" event={"ID":"39153564-9da5-4b52-881c-db524c3d3634","Type":"ContainerDied","Data":"691d446f6b498005f52eebe08d6ee5eadf397b8326444b6213f6a69239c01a5d"} Dec 05 13:15:25 crc kubenswrapper[4763]: I1205 13:15:25.522089 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-56mb7" Dec 05 13:15:25 crc kubenswrapper[4763]: I1205 13:15:25.522122 4763 scope.go:117] "RemoveContainer" containerID="74574d804066d1e27b2625ef885cce94b2afc2caa4b97986e7fac4138fbcf5ad" Dec 05 13:15:25 crc kubenswrapper[4763]: I1205 13:15:25.548534 4763 scope.go:117] "RemoveContainer" containerID="ee1088e02d4a1d74c2dcc4931949c56f36ab43bf8fee7fca17cd5f297c2a1669" Dec 05 13:15:25 crc kubenswrapper[4763]: I1205 13:15:25.583064 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-56mb7"] Dec 05 13:15:25 crc kubenswrapper[4763]: I1205 13:15:25.591363 4763 scope.go:117] "RemoveContainer" containerID="b9588acffb5090cafd9b803cd94f76b6c9ebae5ca2c11afe9fbfb7da0271fdec" Dec 05 13:15:25 crc kubenswrapper[4763]: I1205 13:15:25.593377 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-56mb7"] Dec 05 13:15:25 crc kubenswrapper[4763]: I1205 13:15:25.617876 4763 scope.go:117] "RemoveContainer" containerID="74574d804066d1e27b2625ef885cce94b2afc2caa4b97986e7fac4138fbcf5ad" Dec 05 13:15:25 crc kubenswrapper[4763]: E1205 13:15:25.618385 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74574d804066d1e27b2625ef885cce94b2afc2caa4b97986e7fac4138fbcf5ad\": container with ID starting with 74574d804066d1e27b2625ef885cce94b2afc2caa4b97986e7fac4138fbcf5ad not found: ID does not exist" containerID="74574d804066d1e27b2625ef885cce94b2afc2caa4b97986e7fac4138fbcf5ad" Dec 05 13:15:25 crc kubenswrapper[4763]: I1205 13:15:25.618449 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74574d804066d1e27b2625ef885cce94b2afc2caa4b97986e7fac4138fbcf5ad"} err="failed to get container status \"74574d804066d1e27b2625ef885cce94b2afc2caa4b97986e7fac4138fbcf5ad\": rpc error: code = NotFound desc = could not find container \"74574d804066d1e27b2625ef885cce94b2afc2caa4b97986e7fac4138fbcf5ad\": container with ID starting with 74574d804066d1e27b2625ef885cce94b2afc2caa4b97986e7fac4138fbcf5ad not found: ID does not exist" Dec 05 13:15:25 crc 
kubenswrapper[4763]: I1205 13:15:25.618497 4763 scope.go:117] "RemoveContainer" containerID="ee1088e02d4a1d74c2dcc4931949c56f36ab43bf8fee7fca17cd5f297c2a1669" Dec 05 13:15:25 crc kubenswrapper[4763]: E1205 13:15:25.618826 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee1088e02d4a1d74c2dcc4931949c56f36ab43bf8fee7fca17cd5f297c2a1669\": container with ID starting with ee1088e02d4a1d74c2dcc4931949c56f36ab43bf8fee7fca17cd5f297c2a1669 not found: ID does not exist" containerID="ee1088e02d4a1d74c2dcc4931949c56f36ab43bf8fee7fca17cd5f297c2a1669" Dec 05 13:15:25 crc kubenswrapper[4763]: I1205 13:15:25.618869 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee1088e02d4a1d74c2dcc4931949c56f36ab43bf8fee7fca17cd5f297c2a1669"} err="failed to get container status \"ee1088e02d4a1d74c2dcc4931949c56f36ab43bf8fee7fca17cd5f297c2a1669\": rpc error: code = NotFound desc = could not find container \"ee1088e02d4a1d74c2dcc4931949c56f36ab43bf8fee7fca17cd5f297c2a1669\": container with ID starting with ee1088e02d4a1d74c2dcc4931949c56f36ab43bf8fee7fca17cd5f297c2a1669 not found: ID does not exist" Dec 05 13:15:25 crc kubenswrapper[4763]: I1205 13:15:25.618894 4763 scope.go:117] "RemoveContainer" containerID="b9588acffb5090cafd9b803cd94f76b6c9ebae5ca2c11afe9fbfb7da0271fdec" Dec 05 13:15:25 crc kubenswrapper[4763]: E1205 13:15:25.619367 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9588acffb5090cafd9b803cd94f76b6c9ebae5ca2c11afe9fbfb7da0271fdec\": container with ID starting with b9588acffb5090cafd9b803cd94f76b6c9ebae5ca2c11afe9fbfb7da0271fdec not found: ID does not exist" containerID="b9588acffb5090cafd9b803cd94f76b6c9ebae5ca2c11afe9fbfb7da0271fdec" Dec 05 13:15:25 crc kubenswrapper[4763]: I1205 13:15:25.619398 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9588acffb5090cafd9b803cd94f76b6c9ebae5ca2c11afe9fbfb7da0271fdec"} err="failed to get container status \"b9588acffb5090cafd9b803cd94f76b6c9ebae5ca2c11afe9fbfb7da0271fdec\": rpc error: code = NotFound desc = could not find container \"b9588acffb5090cafd9b803cd94f76b6c9ebae5ca2c11afe9fbfb7da0271fdec\": container with ID starting with b9588acffb5090cafd9b803cd94f76b6c9ebae5ca2c11afe9fbfb7da0271fdec not found: ID does not exist" Dec 05 13:15:25 crc kubenswrapper[4763]: I1205 13:15:25.824651 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39153564-9da5-4b52-881c-db524c3d3634" path="/var/lib/kubelet/pods/39153564-9da5-4b52-881c-db524c3d3634/volumes" Dec 05 13:15:25 crc kubenswrapper[4763]: I1205 13:15:25.924584 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 05 13:15:26 crc kubenswrapper[4763]: I1205 13:15:26.024450 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jx49\" (UniqueName: \"kubernetes.io/projected/295e994b-9be5-4486-beb7-6be00576c5c3-kube-api-access-8jx49\") pod \"295e994b-9be5-4486-beb7-6be00576c5c3\" (UID: \"295e994b-9be5-4486-beb7-6be00576c5c3\") " Dec 05 13:15:26 crc kubenswrapper[4763]: I1205 13:15:26.024511 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/295e994b-9be5-4486-beb7-6be00576c5c3-config-data\") pod \"295e994b-9be5-4486-beb7-6be00576c5c3\" (UID: \"295e994b-9be5-4486-beb7-6be00576c5c3\") " Dec 05 13:15:26 crc kubenswrapper[4763]: I1205 13:15:26.024569 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/295e994b-9be5-4486-beb7-6be00576c5c3-openstack-config-secret\") pod \"295e994b-9be5-4486-beb7-6be00576c5c3\" (UID: \"295e994b-9be5-4486-beb7-6be00576c5c3\") " Dec 05 13:15:26 crc kubenswrapper[4763]: I1205 13:15:26.024773 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/295e994b-9be5-4486-beb7-6be00576c5c3-test-operator-ephemeral-temporary\") pod \"295e994b-9be5-4486-beb7-6be00576c5c3\" (UID: \"295e994b-9be5-4486-beb7-6be00576c5c3\") " Dec 05 13:15:26 crc kubenswrapper[4763]: I1205 13:15:26.024839 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/295e994b-9be5-4486-beb7-6be00576c5c3-ssh-key\") pod \"295e994b-9be5-4486-beb7-6be00576c5c3\" (UID: \"295e994b-9be5-4486-beb7-6be00576c5c3\") " Dec 05 13:15:26 crc kubenswrapper[4763]: I1205 13:15:26.024887 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/295e994b-9be5-4486-beb7-6be00576c5c3-openstack-config\") pod \"295e994b-9be5-4486-beb7-6be00576c5c3\" (UID: \"295e994b-9be5-4486-beb7-6be00576c5c3\") " Dec 05 13:15:26 crc kubenswrapper[4763]: I1205 13:15:26.024925 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/295e994b-9be5-4486-beb7-6be00576c5c3-ca-certs\") pod \"295e994b-9be5-4486-beb7-6be00576c5c3\" (UID: \"295e994b-9be5-4486-beb7-6be00576c5c3\") " Dec 05 13:15:26 crc kubenswrapper[4763]: I1205 13:15:26.024965 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"295e994b-9be5-4486-beb7-6be00576c5c3\" (UID: \"295e994b-9be5-4486-beb7-6be00576c5c3\") " Dec 05 13:15:26 crc kubenswrapper[4763]: I1205 13:15:26.025009 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/295e994b-9be5-4486-beb7-6be00576c5c3-test-operator-ephemeral-workdir\") pod \"295e994b-9be5-4486-beb7-6be00576c5c3\" (UID: \"295e994b-9be5-4486-beb7-6be00576c5c3\") " Dec 05 13:15:26 crc kubenswrapper[4763]: I1205 13:15:26.026151 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/295e994b-9be5-4486-beb7-6be00576c5c3-config-data" (OuterVolumeSpecName: "config-data") pod 
"295e994b-9be5-4486-beb7-6be00576c5c3" (UID: "295e994b-9be5-4486-beb7-6be00576c5c3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:15:26 crc kubenswrapper[4763]: I1205 13:15:26.026277 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/295e994b-9be5-4486-beb7-6be00576c5c3-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "295e994b-9be5-4486-beb7-6be00576c5c3" (UID: "295e994b-9be5-4486-beb7-6be00576c5c3"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:15:26 crc kubenswrapper[4763]: I1205 13:15:26.030276 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "test-operator-logs") pod "295e994b-9be5-4486-beb7-6be00576c5c3" (UID: "295e994b-9be5-4486-beb7-6be00576c5c3"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 13:15:26 crc kubenswrapper[4763]: I1205 13:15:26.030425 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/295e994b-9be5-4486-beb7-6be00576c5c3-kube-api-access-8jx49" (OuterVolumeSpecName: "kube-api-access-8jx49") pod "295e994b-9be5-4486-beb7-6be00576c5c3" (UID: "295e994b-9be5-4486-beb7-6be00576c5c3"). InnerVolumeSpecName "kube-api-access-8jx49". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:15:26 crc kubenswrapper[4763]: I1205 13:15:26.051869 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/295e994b-9be5-4486-beb7-6be00576c5c3-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "295e994b-9be5-4486-beb7-6be00576c5c3" (UID: "295e994b-9be5-4486-beb7-6be00576c5c3"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:15:26 crc kubenswrapper[4763]: I1205 13:15:26.057042 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/295e994b-9be5-4486-beb7-6be00576c5c3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "295e994b-9be5-4486-beb7-6be00576c5c3" (UID: "295e994b-9be5-4486-beb7-6be00576c5c3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:15:26 crc kubenswrapper[4763]: I1205 13:15:26.060544 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/295e994b-9be5-4486-beb7-6be00576c5c3-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "295e994b-9be5-4486-beb7-6be00576c5c3" (UID: "295e994b-9be5-4486-beb7-6be00576c5c3"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:15:26 crc kubenswrapper[4763]: I1205 13:15:26.097676 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/295e994b-9be5-4486-beb7-6be00576c5c3-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "295e994b-9be5-4486-beb7-6be00576c5c3" (UID: "295e994b-9be5-4486-beb7-6be00576c5c3"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:15:26 crc kubenswrapper[4763]: I1205 13:15:26.100055 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/295e994b-9be5-4486-beb7-6be00576c5c3-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "295e994b-9be5-4486-beb7-6be00576c5c3" (UID: "295e994b-9be5-4486-beb7-6be00576c5c3"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:15:26 crc kubenswrapper[4763]: I1205 13:15:26.127400 4763 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/295e994b-9be5-4486-beb7-6be00576c5c3-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 05 13:15:26 crc kubenswrapper[4763]: I1205 13:15:26.127438 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jx49\" (UniqueName: \"kubernetes.io/projected/295e994b-9be5-4486-beb7-6be00576c5c3-kube-api-access-8jx49\") on node \"crc\" DevicePath \"\"" Dec 05 13:15:26 crc kubenswrapper[4763]: I1205 13:15:26.127449 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/295e994b-9be5-4486-beb7-6be00576c5c3-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 13:15:26 crc kubenswrapper[4763]: I1205 13:15:26.127461 4763 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/295e994b-9be5-4486-beb7-6be00576c5c3-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 05 13:15:26 crc kubenswrapper[4763]: I1205 13:15:26.127470 4763 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/295e994b-9be5-4486-beb7-6be00576c5c3-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 05 13:15:26 crc kubenswrapper[4763]: I1205 13:15:26.127480 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/295e994b-9be5-4486-beb7-6be00576c5c3-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 13:15:26 crc kubenswrapper[4763]: I1205 13:15:26.127489 4763 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/295e994b-9be5-4486-beb7-6be00576c5c3-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 05 13:15:26 crc kubenswrapper[4763]: I1205 13:15:26.127498 4763 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/295e994b-9be5-4486-beb7-6be00576c5c3-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 05 13:15:26 crc kubenswrapper[4763]: I1205 13:15:26.127530 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 05 13:15:26 crc kubenswrapper[4763]: I1205 13:15:26.148243 4763 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 05 13:15:26 crc kubenswrapper[4763]: I1205 13:15:26.229735 4763 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 05 13:15:26 crc kubenswrapper[4763]: I1205 13:15:26.539462 4763 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 05 13:15:26 crc kubenswrapper[4763]: I1205 13:15:26.539465 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"295e994b-9be5-4486-beb7-6be00576c5c3","Type":"ContainerDied","Data":"44c4d50882b8243ab7ab046f510d3f526012c7d23e65a4410d8f0c523d8c7a0b"} Dec 05 13:15:26 crc kubenswrapper[4763]: I1205 13:15:26.540105 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44c4d50882b8243ab7ab046f510d3f526012c7d23e65a4410d8f0c523d8c7a0b" Dec 05 13:15:33 crc kubenswrapper[4763]: I1205 13:15:33.443589 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 05 13:15:33 crc kubenswrapper[4763]: E1205 13:15:33.445934 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="295e994b-9be5-4486-beb7-6be00576c5c3" containerName="tempest-tests-tempest-tests-runner" Dec 05 13:15:33 crc kubenswrapper[4763]: I1205 13:15:33.445973 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="295e994b-9be5-4486-beb7-6be00576c5c3" containerName="tempest-tests-tempest-tests-runner" Dec 05 13:15:33 crc kubenswrapper[4763]: E1205 13:15:33.446003 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39153564-9da5-4b52-881c-db524c3d3634" containerName="extract-content" Dec 05 13:15:33 crc kubenswrapper[4763]: I1205 13:15:33.446015 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="39153564-9da5-4b52-881c-db524c3d3634" containerName="extract-content" Dec 05 13:15:33 crc kubenswrapper[4763]: E1205 13:15:33.446045 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39153564-9da5-4b52-881c-db524c3d3634" containerName="registry-server" Dec 05 13:15:33 crc kubenswrapper[4763]: I1205 13:15:33.446055 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="39153564-9da5-4b52-881c-db524c3d3634" containerName="registry-server" Dec 05 13:15:33 crc kubenswrapper[4763]: E1205 13:15:33.446072 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11294be2-8577-4016-8754-baac9ce43eac" containerName="collect-profiles" Dec 05 13:15:33 crc kubenswrapper[4763]: I1205 13:15:33.446081 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="11294be2-8577-4016-8754-baac9ce43eac" containerName="collect-profiles" Dec 05 13:15:33 crc kubenswrapper[4763]: E1205 13:15:33.446102 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39153564-9da5-4b52-881c-db524c3d3634" containerName="extract-utilities" Dec 05 13:15:33 crc kubenswrapper[4763]: I1205 13:15:33.446112 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="39153564-9da5-4b52-881c-db524c3d3634" containerName="extract-utilities" Dec 05 13:15:33 crc kubenswrapper[4763]: I1205 13:15:33.446468 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="11294be2-8577-4016-8754-baac9ce43eac" containerName="collect-profiles" Dec 05 13:15:33 crc kubenswrapper[4763]: I1205 13:15:33.446488 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="39153564-9da5-4b52-881c-db524c3d3634" containerName="registry-server" Dec 05 13:15:33 crc kubenswrapper[4763]: I1205 13:15:33.446525 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="295e994b-9be5-4486-beb7-6be00576c5c3" containerName="tempest-tests-tempest-tests-runner" Dec 05 13:15:33 crc kubenswrapper[4763]: I1205 13:15:33.447981 4763 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 13:15:33 crc kubenswrapper[4763]: I1205 13:15:33.452028 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-m2cn4" Dec 05 13:15:33 crc kubenswrapper[4763]: I1205 13:15:33.456548 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 05 13:15:33 crc kubenswrapper[4763]: I1205 13:15:33.590957 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zldj\" (UniqueName: \"kubernetes.io/projected/9a56e3c4-fdad-4c05-b4f4-9a155afc3239-kube-api-access-9zldj\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9a56e3c4-fdad-4c05-b4f4-9a155afc3239\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 13:15:33 crc kubenswrapper[4763]: I1205 13:15:33.591366 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9a56e3c4-fdad-4c05-b4f4-9a155afc3239\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 13:15:33 crc kubenswrapper[4763]: I1205 13:15:33.692668 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zldj\" (UniqueName: \"kubernetes.io/projected/9a56e3c4-fdad-4c05-b4f4-9a155afc3239-kube-api-access-9zldj\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9a56e3c4-fdad-4c05-b4f4-9a155afc3239\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 13:15:33 crc kubenswrapper[4763]: I1205 13:15:33.692831 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9a56e3c4-fdad-4c05-b4f4-9a155afc3239\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 13:15:33 crc kubenswrapper[4763]: I1205 13:15:33.693383 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9a56e3c4-fdad-4c05-b4f4-9a155afc3239\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 13:15:33 crc kubenswrapper[4763]: I1205 13:15:33.718951 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zldj\" (UniqueName: \"kubernetes.io/projected/9a56e3c4-fdad-4c05-b4f4-9a155afc3239-kube-api-access-9zldj\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9a56e3c4-fdad-4c05-b4f4-9a155afc3239\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 13:15:33 crc kubenswrapper[4763]: I1205 13:15:33.733078 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9a56e3c4-fdad-4c05-b4f4-9a155afc3239\") " 
pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 13:15:33 crc kubenswrapper[4763]: I1205 13:15:33.774385 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 13:15:34 crc kubenswrapper[4763]: I1205 13:15:34.298215 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 05 13:15:34 crc kubenswrapper[4763]: I1205 13:15:34.614522 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"9a56e3c4-fdad-4c05-b4f4-9a155afc3239","Type":"ContainerStarted","Data":"35e33421d8468910922ca4691944474ef1f7d00507da8eb2fbd9bf4a95aa2845"} Dec 05 13:15:34 crc kubenswrapper[4763]: I1205 13:15:34.784681 4763 scope.go:117] "RemoveContainer" containerID="667fb7d9fd133388c08f9883162f086ec3353ccdc0cda16fb2999b809ed9835f" Dec 05 13:15:34 crc kubenswrapper[4763]: E1205 13:15:34.784958 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:15:36 crc kubenswrapper[4763]: I1205 13:15:36.634706 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"9a56e3c4-fdad-4c05-b4f4-9a155afc3239","Type":"ContainerStarted","Data":"a80a88bf13120f9742ae2e5b110bb7b13f06ee4d04f4380794079535e4108d9b"} Dec 05 13:15:36 crc kubenswrapper[4763]: I1205 13:15:36.657970 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.50908467 podStartE2EDuration="3.657949186s" podCreationTimestamp="2025-12-05 13:15:33 +0000 UTC" firstStartedPulling="2025-12-05 13:15:34.308303177 +0000 UTC m=+5218.801017920" lastFinishedPulling="2025-12-05 13:15:35.457167693 +0000 UTC m=+5219.949882436" observedRunningTime="2025-12-05 13:15:36.649455001 +0000 UTC m=+5221.142169734" watchObservedRunningTime="2025-12-05 13:15:36.657949186 +0000 UTC m=+5221.150663919" Dec 05 13:15:49 crc kubenswrapper[4763]: I1205 13:15:49.783995 4763 scope.go:117] "RemoveContainer" containerID="667fb7d9fd133388c08f9883162f086ec3353ccdc0cda16fb2999b809ed9835f" Dec 05 13:15:49 crc kubenswrapper[4763]: E1205 13:15:49.785125 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:15:53 crc kubenswrapper[4763]: I1205 13:15:53.724086 4763 scope.go:117] "RemoveContainer" containerID="a1b3aa28bba6b3c5a8a9ad9e11811b7855d8414725dd2a2653fd824cc534e2d4" Dec 05 13:15:58 crc kubenswrapper[4763]: I1205 13:15:58.928386 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qnz8t/must-gather-xstqw"] Dec 05 13:15:58 crc kubenswrapper[4763]: I1205 13:15:58.931726 4763 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qnz8t/must-gather-xstqw" Dec 05 13:15:58 crc kubenswrapper[4763]: I1205 13:15:58.936326 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-qnz8t"/"kube-root-ca.crt" Dec 05 13:15:58 crc kubenswrapper[4763]: I1205 13:15:58.937320 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-qnz8t"/"default-dockercfg-5rhrs" Dec 05 13:15:58 crc kubenswrapper[4763]: I1205 13:15:58.940560 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qnz8t/must-gather-xstqw"] Dec 05 13:15:58 crc kubenswrapper[4763]: I1205 13:15:58.941147 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-qnz8t"/"openshift-service-ca.crt" Dec 05 13:15:59 crc kubenswrapper[4763]: I1205 13:15:59.061739 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljpm6\" (UniqueName: \"kubernetes.io/projected/95c5d8b4-59ac-42ec-971d-efef222bf2ae-kube-api-access-ljpm6\") pod \"must-gather-xstqw\" (UID: \"95c5d8b4-59ac-42ec-971d-efef222bf2ae\") " pod="openshift-must-gather-qnz8t/must-gather-xstqw" Dec 05 13:15:59 crc kubenswrapper[4763]: I1205 13:15:59.062153 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/95c5d8b4-59ac-42ec-971d-efef222bf2ae-must-gather-output\") pod \"must-gather-xstqw\" (UID: \"95c5d8b4-59ac-42ec-971d-efef222bf2ae\") " pod="openshift-must-gather-qnz8t/must-gather-xstqw" Dec 05 13:15:59 crc kubenswrapper[4763]: I1205 13:15:59.163853 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/95c5d8b4-59ac-42ec-971d-efef222bf2ae-must-gather-output\") pod \"must-gather-xstqw\" (UID: \"95c5d8b4-59ac-42ec-971d-efef222bf2ae\") " pod="openshift-must-gather-qnz8t/must-gather-xstqw" Dec 05 13:15:59 crc kubenswrapper[4763]: I1205 13:15:59.164176 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljpm6\" (UniqueName: \"kubernetes.io/projected/95c5d8b4-59ac-42ec-971d-efef222bf2ae-kube-api-access-ljpm6\") pod \"must-gather-xstqw\" (UID: \"95c5d8b4-59ac-42ec-971d-efef222bf2ae\") " pod="openshift-must-gather-qnz8t/must-gather-xstqw" Dec 05 13:15:59 crc kubenswrapper[4763]: I1205 13:15:59.164673 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/95c5d8b4-59ac-42ec-971d-efef222bf2ae-must-gather-output\") pod \"must-gather-xstqw\" (UID: \"95c5d8b4-59ac-42ec-971d-efef222bf2ae\") " pod="openshift-must-gather-qnz8t/must-gather-xstqw" Dec 05 13:15:59 crc kubenswrapper[4763]: I1205 13:15:59.191595 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljpm6\" (UniqueName: \"kubernetes.io/projected/95c5d8b4-59ac-42ec-971d-efef222bf2ae-kube-api-access-ljpm6\") pod \"must-gather-xstqw\" (UID: \"95c5d8b4-59ac-42ec-971d-efef222bf2ae\") " pod="openshift-must-gather-qnz8t/must-gather-xstqw" Dec 05 13:15:59 crc kubenswrapper[4763]: I1205 13:15:59.249122 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qnz8t/must-gather-xstqw" Dec 05 13:15:59 crc kubenswrapper[4763]: I1205 13:15:59.707456 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qnz8t/must-gather-xstqw"] Dec 05 13:15:59 crc kubenswrapper[4763]: I1205 13:15:59.856848 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qnz8t/must-gather-xstqw" event={"ID":"95c5d8b4-59ac-42ec-971d-efef222bf2ae","Type":"ContainerStarted","Data":"dd9686d02ab552e7e7b65ecf25428542dbdf30e48212b69b3cdc09b33e5c6472"} Dec 05 13:16:03 crc kubenswrapper[4763]: I1205 13:16:03.784933 4763 scope.go:117] "RemoveContainer" containerID="667fb7d9fd133388c08f9883162f086ec3353ccdc0cda16fb2999b809ed9835f" Dec 05 13:16:03 crc kubenswrapper[4763]: E1205 13:16:03.785908 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:16:05 crc kubenswrapper[4763]: I1205 13:16:05.915895 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qnz8t/must-gather-xstqw" event={"ID":"95c5d8b4-59ac-42ec-971d-efef222bf2ae","Type":"ContainerStarted","Data":"de5e413d6883db2536b370fba6ec2c59716eff2f6647c449caec572e652433b2"} Dec 05 13:16:06 crc kubenswrapper[4763]: I1205 13:16:06.931646 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qnz8t/must-gather-xstqw" event={"ID":"95c5d8b4-59ac-42ec-971d-efef222bf2ae","Type":"ContainerStarted","Data":"7a47c3b5a864b1ac257c0c38413aae921ce5acd6f3b20631e032e5ef753b493f"} Dec 05 13:16:06 crc kubenswrapper[4763]: I1205 13:16:06.959606 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qnz8t/must-gather-xstqw" podStartSLOduration=3.339696215 podStartE2EDuration="8.959584087s" podCreationTimestamp="2025-12-05 13:15:58 +0000 UTC" firstStartedPulling="2025-12-05 13:15:59.706776918 +0000 UTC m=+5244.199491641" lastFinishedPulling="2025-12-05 13:16:05.32666479 +0000 UTC m=+5249.819379513" observedRunningTime="2025-12-05 13:16:06.946532677 +0000 UTC m=+5251.439247400" watchObservedRunningTime="2025-12-05 13:16:06.959584087 +0000 UTC m=+5251.452298820" Dec 05 13:16:10 crc kubenswrapper[4763]: I1205 13:16:10.164655 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qnz8t/crc-debug-wj6wd"] Dec 05 13:16:10 crc kubenswrapper[4763]: I1205 13:16:10.166609 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qnz8t/crc-debug-wj6wd" Dec 05 13:16:10 crc kubenswrapper[4763]: I1205 13:16:10.307823 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4e4faec2-6c91-4a1c-87c7-623b631dc09b-host\") pod \"crc-debug-wj6wd\" (UID: \"4e4faec2-6c91-4a1c-87c7-623b631dc09b\") " pod="openshift-must-gather-qnz8t/crc-debug-wj6wd" Dec 05 13:16:10 crc kubenswrapper[4763]: I1205 13:16:10.308129 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss69c\" (UniqueName: \"kubernetes.io/projected/4e4faec2-6c91-4a1c-87c7-623b631dc09b-kube-api-access-ss69c\") pod \"crc-debug-wj6wd\" (UID: \"4e4faec2-6c91-4a1c-87c7-623b631dc09b\") " pod="openshift-must-gather-qnz8t/crc-debug-wj6wd" Dec 05 13:16:10 crc kubenswrapper[4763]: I1205 13:16:10.414818 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4e4faec2-6c91-4a1c-87c7-623b631dc09b-host\") pod \"crc-debug-wj6wd\" (UID: \"4e4faec2-6c91-4a1c-87c7-623b631dc09b\") " pod="openshift-must-gather-qnz8t/crc-debug-wj6wd" Dec 05 13:16:10 crc kubenswrapper[4763]: I1205 13:16:10.414888 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss69c\" (UniqueName: \"kubernetes.io/projected/4e4faec2-6c91-4a1c-87c7-623b631dc09b-kube-api-access-ss69c\") pod \"crc-debug-wj6wd\" (UID: \"4e4faec2-6c91-4a1c-87c7-623b631dc09b\") " pod="openshift-must-gather-qnz8t/crc-debug-wj6wd" Dec 05 13:16:10 crc kubenswrapper[4763]: I1205 13:16:10.415565 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4e4faec2-6c91-4a1c-87c7-623b631dc09b-host\") pod \"crc-debug-wj6wd\" (UID: \"4e4faec2-6c91-4a1c-87c7-623b631dc09b\") " pod="openshift-must-gather-qnz8t/crc-debug-wj6wd" Dec 05 13:16:10 crc kubenswrapper[4763]: I1205 13:16:10.459886 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss69c\" (UniqueName: \"kubernetes.io/projected/4e4faec2-6c91-4a1c-87c7-623b631dc09b-kube-api-access-ss69c\") pod \"crc-debug-wj6wd\" (UID: \"4e4faec2-6c91-4a1c-87c7-623b631dc09b\") " pod="openshift-must-gather-qnz8t/crc-debug-wj6wd" Dec 05 13:16:10 crc kubenswrapper[4763]: I1205 13:16:10.487815 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qnz8t/crc-debug-wj6wd" Dec 05 13:16:10 crc kubenswrapper[4763]: W1205 13:16:10.534031 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e4faec2_6c91_4a1c_87c7_623b631dc09b.slice/crio-057f5f2d3544a92fa700c1cd4c917c40d6c55767c4a89eadce1b8066b44866a7 WatchSource:0}: Error finding container 057f5f2d3544a92fa700c1cd4c917c40d6c55767c4a89eadce1b8066b44866a7: Status 404 returned error can't find the container with id 057f5f2d3544a92fa700c1cd4c917c40d6c55767c4a89eadce1b8066b44866a7 Dec 05 13:16:10 crc kubenswrapper[4763]: I1205 13:16:10.968402 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qnz8t/crc-debug-wj6wd" event={"ID":"4e4faec2-6c91-4a1c-87c7-623b631dc09b","Type":"ContainerStarted","Data":"057f5f2d3544a92fa700c1cd4c917c40d6c55767c4a89eadce1b8066b44866a7"} Dec 05 13:16:16 crc kubenswrapper[4763]: I1205 13:16:16.784578 4763 scope.go:117] "RemoveContainer" containerID="667fb7d9fd133388c08f9883162f086ec3353ccdc0cda16fb2999b809ed9835f" Dec 05 13:16:16 crc kubenswrapper[4763]: E1205 13:16:16.785492 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:16:24 crc kubenswrapper[4763]: I1205 13:16:24.110261 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qnz8t/crc-debug-wj6wd" event={"ID":"4e4faec2-6c91-4a1c-87c7-623b631dc09b","Type":"ContainerStarted","Data":"be10ad6cab6cc2485f7b4081d6d936b05aa7f5db3bf50dad616375ddfbf05ae5"} Dec 05 13:16:24 crc kubenswrapper[4763]: I1205 13:16:24.135012 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qnz8t/crc-debug-wj6wd" podStartSLOduration=1.510881804 podStartE2EDuration="14.134990583s" podCreationTimestamp="2025-12-05 13:16:10 +0000 UTC" firstStartedPulling="2025-12-05 13:16:10.536318213 +0000 UTC m=+5255.029032946" lastFinishedPulling="2025-12-05 13:16:23.160427002 +0000 UTC m=+5267.653141725" observedRunningTime="2025-12-05 13:16:24.128596071 +0000 UTC m=+5268.621310804" watchObservedRunningTime="2025-12-05 13:16:24.134990583 +0000 UTC m=+5268.627705306" Dec 05 13:16:31 crc kubenswrapper[4763]: I1205 13:16:31.785471 4763 scope.go:117] "RemoveContainer" containerID="667fb7d9fd133388c08f9883162f086ec3353ccdc0cda16fb2999b809ed9835f" Dec 05 13:16:31 crc kubenswrapper[4763]: E1205 13:16:31.786346 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:16:42 crc kubenswrapper[4763]: I1205 13:16:42.784193 4763 scope.go:117] "RemoveContainer" containerID="667fb7d9fd133388c08f9883162f086ec3353ccdc0cda16fb2999b809ed9835f" Dec 05 13:16:42 crc kubenswrapper[4763]: E1205 13:16:42.785077 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:16:56 crc kubenswrapper[4763]: I1205 13:16:56.784892 4763 scope.go:117] "RemoveContainer" containerID="667fb7d9fd133388c08f9883162f086ec3353ccdc0cda16fb2999b809ed9835f" Dec 05 13:16:56 crc kubenswrapper[4763]: E1205 13:16:56.785510 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:17:07 crc kubenswrapper[4763]: I1205 13:17:07.784606 4763 scope.go:117] "RemoveContainer" containerID="667fb7d9fd133388c08f9883162f086ec3353ccdc0cda16fb2999b809ed9835f" Dec 05 13:17:07 crc kubenswrapper[4763]: E1205 13:17:07.786566 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:17:09 crc kubenswrapper[4763]: I1205 13:17:09.720996 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qv5hl" podUID="cfa0736f-2856-4cfd-810f-d8fcd2bea7f6" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.58:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 13:17:11 crc kubenswrapper[4763]: I1205 13:17:11.998989 4763 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 6.386705698s: [/var/lib/containers/storage/overlay/70e42d9d4c375a7e7012856fbf8b41e4710b8ad0be4ded64bd7b84273f4f13a4/diff /var/log/pods/openshift-must-gather-qnz8t_must-gather-xstqw_95c5d8b4-59ac-42ec-971d-efef222bf2ae/gather/0.log]; will not log again for this container unless duration exceeds 2s Dec 05 13:17:18 crc kubenswrapper[4763]: I1205 13:17:18.783987 4763 scope.go:117] "RemoveContainer" containerID="667fb7d9fd133388c08f9883162f086ec3353ccdc0cda16fb2999b809ed9835f" Dec 05 13:17:18 crc kubenswrapper[4763]: E1205 13:17:18.784873 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:17:33 crc kubenswrapper[4763]: I1205 13:17:33.789002 4763 scope.go:117] "RemoveContainer" containerID="667fb7d9fd133388c08f9883162f086ec3353ccdc0cda16fb2999b809ed9835f" Dec 05 13:17:33 crc kubenswrapper[4763]: E1205 13:17:33.789788 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:17:45 crc kubenswrapper[4763]: I1205 13:17:45.977811 4763 generic.go:334] "Generic (PLEG): container finished" podID="4e4faec2-6c91-4a1c-87c7-623b631dc09b" containerID="be10ad6cab6cc2485f7b4081d6d936b05aa7f5db3bf50dad616375ddfbf05ae5" exitCode=0 Dec 05 13:17:45 crc kubenswrapper[4763]: I1205 13:17:45.977926 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qnz8t/crc-debug-wj6wd" event={"ID":"4e4faec2-6c91-4a1c-87c7-623b631dc09b","Type":"ContainerDied","Data":"be10ad6cab6cc2485f7b4081d6d936b05aa7f5db3bf50dad616375ddfbf05ae5"} Dec 05 13:17:46 crc kubenswrapper[4763]: I1205 13:17:46.784825 4763 scope.go:117] "RemoveContainer" containerID="667fb7d9fd133388c08f9883162f086ec3353ccdc0cda16fb2999b809ed9835f" Dec 05 13:17:46 crc kubenswrapper[4763]: E1205 13:17:46.785096 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:17:47 crc kubenswrapper[4763]: I1205 13:17:47.086982 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qnz8t/crc-debug-wj6wd" Dec 05 13:17:47 crc kubenswrapper[4763]: I1205 13:17:47.123896 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qnz8t/crc-debug-wj6wd"] Dec 05 13:17:47 crc kubenswrapper[4763]: I1205 13:17:47.133698 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qnz8t/crc-debug-wj6wd"] Dec 05 13:17:47 crc kubenswrapper[4763]: I1205 13:17:47.225908 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss69c\" (UniqueName: \"kubernetes.io/projected/4e4faec2-6c91-4a1c-87c7-623b631dc09b-kube-api-access-ss69c\") pod \"4e4faec2-6c91-4a1c-87c7-623b631dc09b\" (UID: \"4e4faec2-6c91-4a1c-87c7-623b631dc09b\") " Dec 05 13:17:47 crc kubenswrapper[4763]: I1205 13:17:47.226233 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4e4faec2-6c91-4a1c-87c7-623b631dc09b-host\") pod \"4e4faec2-6c91-4a1c-87c7-623b631dc09b\" (UID: \"4e4faec2-6c91-4a1c-87c7-623b631dc09b\") " Dec 05 13:17:47 crc kubenswrapper[4763]: I1205 13:17:47.226342 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e4faec2-6c91-4a1c-87c7-623b631dc09b-host" (OuterVolumeSpecName: "host") pod "4e4faec2-6c91-4a1c-87c7-623b631dc09b" (UID: "4e4faec2-6c91-4a1c-87c7-623b631dc09b"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 13:17:47 crc kubenswrapper[4763]: I1205 13:17:47.227161 4763 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4e4faec2-6c91-4a1c-87c7-623b631dc09b-host\") on node \"crc\" DevicePath \"\"" Dec 05 13:17:47 crc kubenswrapper[4763]: I1205 13:17:47.237626 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e4faec2-6c91-4a1c-87c7-623b631dc09b-kube-api-access-ss69c" (OuterVolumeSpecName: "kube-api-access-ss69c") pod "4e4faec2-6c91-4a1c-87c7-623b631dc09b" (UID: "4e4faec2-6c91-4a1c-87c7-623b631dc09b"). InnerVolumeSpecName "kube-api-access-ss69c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:17:47 crc kubenswrapper[4763]: I1205 13:17:47.329687 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss69c\" (UniqueName: \"kubernetes.io/projected/4e4faec2-6c91-4a1c-87c7-623b631dc09b-kube-api-access-ss69c\") on node \"crc\" DevicePath \"\"" Dec 05 13:17:47 crc kubenswrapper[4763]: I1205 13:17:47.797173 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e4faec2-6c91-4a1c-87c7-623b631dc09b" path="/var/lib/kubelet/pods/4e4faec2-6c91-4a1c-87c7-623b631dc09b/volumes" Dec 05 13:17:48 crc kubenswrapper[4763]: I1205 13:17:48.002446 4763 scope.go:117] "RemoveContainer" containerID="be10ad6cab6cc2485f7b4081d6d936b05aa7f5db3bf50dad616375ddfbf05ae5" Dec 05 13:17:48 crc kubenswrapper[4763]: I1205 13:17:48.002477 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qnz8t/crc-debug-wj6wd" Dec 05 13:17:48 crc kubenswrapper[4763]: I1205 13:17:48.329430 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qnz8t/crc-debug-8xfdw"] Dec 05 13:17:48 crc kubenswrapper[4763]: E1205 13:17:48.330495 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e4faec2-6c91-4a1c-87c7-623b631dc09b" containerName="container-00" Dec 05 13:17:48 crc kubenswrapper[4763]: I1205 13:17:48.330520 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e4faec2-6c91-4a1c-87c7-623b631dc09b" containerName="container-00" Dec 05 13:17:48 crc kubenswrapper[4763]: I1205 13:17:48.330785 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e4faec2-6c91-4a1c-87c7-623b631dc09b" containerName="container-00" Dec 05 13:17:48 crc kubenswrapper[4763]: I1205 13:17:48.331476 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qnz8t/crc-debug-8xfdw" Dec 05 13:17:48 crc kubenswrapper[4763]: I1205 13:17:48.451455 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5510f98e-7d37-4465-8058-d32a22eabf05-host\") pod \"crc-debug-8xfdw\" (UID: \"5510f98e-7d37-4465-8058-d32a22eabf05\") " pod="openshift-must-gather-qnz8t/crc-debug-8xfdw" Dec 05 13:17:48 crc kubenswrapper[4763]: I1205 13:17:48.451500 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk4h5\" (UniqueName: \"kubernetes.io/projected/5510f98e-7d37-4465-8058-d32a22eabf05-kube-api-access-mk4h5\") pod \"crc-debug-8xfdw\" (UID: \"5510f98e-7d37-4465-8058-d32a22eabf05\") " pod="openshift-must-gather-qnz8t/crc-debug-8xfdw" Dec 05 13:17:48 crc kubenswrapper[4763]: I1205 13:17:48.553684 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5510f98e-7d37-4465-8058-d32a22eabf05-host\") pod \"crc-debug-8xfdw\" (UID: \"5510f98e-7d37-4465-8058-d32a22eabf05\") " pod="openshift-must-gather-qnz8t/crc-debug-8xfdw" Dec 05 13:17:48 crc kubenswrapper[4763]: I1205 13:17:48.553743 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk4h5\" (UniqueName: \"kubernetes.io/projected/5510f98e-7d37-4465-8058-d32a22eabf05-kube-api-access-mk4h5\") pod \"crc-debug-8xfdw\" (UID: \"5510f98e-7d37-4465-8058-d32a22eabf05\") " pod="openshift-must-gather-qnz8t/crc-debug-8xfdw" Dec 05 13:17:48 crc kubenswrapper[4763]: I1205 13:17:48.553834 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5510f98e-7d37-4465-8058-d32a22eabf05-host\") pod \"crc-debug-8xfdw\" (UID: \"5510f98e-7d37-4465-8058-d32a22eabf05\") " pod="openshift-must-gather-qnz8t/crc-debug-8xfdw" Dec 05 13:17:48 crc kubenswrapper[4763]: I1205 13:17:48.571566 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk4h5\" (UniqueName: \"kubernetes.io/projected/5510f98e-7d37-4465-8058-d32a22eabf05-kube-api-access-mk4h5\") pod \"crc-debug-8xfdw\" (UID: \"5510f98e-7d37-4465-8058-d32a22eabf05\") " pod="openshift-must-gather-qnz8t/crc-debug-8xfdw" Dec 05 13:17:48 crc kubenswrapper[4763]: I1205 13:17:48.648257 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qnz8t/crc-debug-8xfdw" Dec 05 13:17:49 crc kubenswrapper[4763]: I1205 13:17:49.012187 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qnz8t/crc-debug-8xfdw" event={"ID":"5510f98e-7d37-4465-8058-d32a22eabf05","Type":"ContainerStarted","Data":"ef98ef5fb67ceb9be04a56359ed8bbaa95f4ab2047b2a7ff5ed0253bcbc7f426"} Dec 05 13:17:50 crc kubenswrapper[4763]: I1205 13:17:50.024338 4763 generic.go:334] "Generic (PLEG): container finished" podID="5510f98e-7d37-4465-8058-d32a22eabf05" containerID="9970d0d66cd57dbcb398268322374a193ed6615a7278dfc54831cf1ea5f9aa77" exitCode=0 Dec 05 13:17:50 crc kubenswrapper[4763]: I1205 13:17:50.024400 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qnz8t/crc-debug-8xfdw" event={"ID":"5510f98e-7d37-4465-8058-d32a22eabf05","Type":"ContainerDied","Data":"9970d0d66cd57dbcb398268322374a193ed6615a7278dfc54831cf1ea5f9aa77"} Dec 05 13:17:51 crc kubenswrapper[4763]: I1205 13:17:51.151996 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qnz8t/crc-debug-8xfdw" Dec 05 13:17:51 crc kubenswrapper[4763]: I1205 13:17:51.306584 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mk4h5\" (UniqueName: \"kubernetes.io/projected/5510f98e-7d37-4465-8058-d32a22eabf05-kube-api-access-mk4h5\") pod \"5510f98e-7d37-4465-8058-d32a22eabf05\" (UID: \"5510f98e-7d37-4465-8058-d32a22eabf05\") " Dec 05 13:17:51 crc kubenswrapper[4763]: I1205 13:17:51.306773 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5510f98e-7d37-4465-8058-d32a22eabf05-host\") pod \"5510f98e-7d37-4465-8058-d32a22eabf05\" (UID: \"5510f98e-7d37-4465-8058-d32a22eabf05\") " Dec 05 13:17:51 crc kubenswrapper[4763]: I1205 13:17:51.307264 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5510f98e-7d37-4465-8058-d32a22eabf05-host" (OuterVolumeSpecName: "host") pod "5510f98e-7d37-4465-8058-d32a22eabf05" (UID: "5510f98e-7d37-4465-8058-d32a22eabf05"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 13:17:51 crc kubenswrapper[4763]: I1205 13:17:51.314581 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5510f98e-7d37-4465-8058-d32a22eabf05-kube-api-access-mk4h5" (OuterVolumeSpecName: "kube-api-access-mk4h5") pod "5510f98e-7d37-4465-8058-d32a22eabf05" (UID: "5510f98e-7d37-4465-8058-d32a22eabf05"). InnerVolumeSpecName "kube-api-access-mk4h5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:17:51 crc kubenswrapper[4763]: I1205 13:17:51.408636 4763 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5510f98e-7d37-4465-8058-d32a22eabf05-host\") on node \"crc\" DevicePath \"\"" Dec 05 13:17:51 crc kubenswrapper[4763]: I1205 13:17:51.408675 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mk4h5\" (UniqueName: \"kubernetes.io/projected/5510f98e-7d37-4465-8058-d32a22eabf05-kube-api-access-mk4h5\") on node \"crc\" DevicePath \"\"" Dec 05 13:17:52 crc kubenswrapper[4763]: I1205 13:17:52.051538 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qnz8t/crc-debug-8xfdw" Dec 05 13:17:52 crc kubenswrapper[4763]: I1205 13:17:52.224464 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qnz8t/crc-debug-8xfdw" event={"ID":"5510f98e-7d37-4465-8058-d32a22eabf05","Type":"ContainerDied","Data":"ef98ef5fb67ceb9be04a56359ed8bbaa95f4ab2047b2a7ff5ed0253bcbc7f426"} Dec 05 13:17:52 crc kubenswrapper[4763]: I1205 13:17:52.224983 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef98ef5fb67ceb9be04a56359ed8bbaa95f4ab2047b2a7ff5ed0253bcbc7f426" Dec 05 13:17:52 crc kubenswrapper[4763]: I1205 13:17:52.727726 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qnz8t/crc-debug-8xfdw"] Dec 05 13:17:52 crc kubenswrapper[4763]: I1205 13:17:52.740140 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qnz8t/crc-debug-8xfdw"] Dec 05 13:17:53 crc kubenswrapper[4763]: I1205 13:17:53.794911 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5510f98e-7d37-4465-8058-d32a22eabf05" path="/var/lib/kubelet/pods/5510f98e-7d37-4465-8058-d32a22eabf05/volumes" Dec 05 13:17:53 crc kubenswrapper[4763]: I1205 13:17:53.978740 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qnz8t/crc-debug-76qsd"] Dec 05 13:17:53 crc kubenswrapper[4763]: E1205 13:17:53.979164 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5510f98e-7d37-4465-8058-d32a22eabf05" containerName="container-00" Dec 05 13:17:53 crc kubenswrapper[4763]: I1205 13:17:53.979178 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5510f98e-7d37-4465-8058-d32a22eabf05" containerName="container-00" Dec 05 13:17:53 crc kubenswrapper[4763]: I1205 13:17:53.979343 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="5510f98e-7d37-4465-8058-d32a22eabf05" containerName="container-00" Dec 05 13:17:53 crc kubenswrapper[4763]: I1205 13:17:53.980022 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qnz8t/crc-debug-76qsd" Dec 05 13:17:54 crc kubenswrapper[4763]: I1205 13:17:54.060786 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqqj5\" (UniqueName: \"kubernetes.io/projected/1a1ef54e-2852-4229-9ffb-0cc470a48891-kube-api-access-nqqj5\") pod \"crc-debug-76qsd\" (UID: \"1a1ef54e-2852-4229-9ffb-0cc470a48891\") " pod="openshift-must-gather-qnz8t/crc-debug-76qsd" Dec 05 13:17:54 crc kubenswrapper[4763]: I1205 13:17:54.060839 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1a1ef54e-2852-4229-9ffb-0cc470a48891-host\") pod \"crc-debug-76qsd\" (UID: \"1a1ef54e-2852-4229-9ffb-0cc470a48891\") " pod="openshift-must-gather-qnz8t/crc-debug-76qsd" Dec 05 13:17:54 crc kubenswrapper[4763]: I1205 13:17:54.165025 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqqj5\" (UniqueName: \"kubernetes.io/projected/1a1ef54e-2852-4229-9ffb-0cc470a48891-kube-api-access-nqqj5\") pod \"crc-debug-76qsd\" (UID: \"1a1ef54e-2852-4229-9ffb-0cc470a48891\") " pod="openshift-must-gather-qnz8t/crc-debug-76qsd" Dec 05 13:17:54 crc kubenswrapper[4763]: I1205 13:17:54.165078 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1a1ef54e-2852-4229-9ffb-0cc470a48891-host\") pod \"crc-debug-76qsd\" (UID: \"1a1ef54e-2852-4229-9ffb-0cc470a48891\") " pod="openshift-must-gather-qnz8t/crc-debug-76qsd" Dec 05 13:17:54 crc kubenswrapper[4763]: I1205 13:17:54.165201 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1a1ef54e-2852-4229-9ffb-0cc470a48891-host\") pod \"crc-debug-76qsd\" (UID: \"1a1ef54e-2852-4229-9ffb-0cc470a48891\") " pod="openshift-must-gather-qnz8t/crc-debug-76qsd" Dec 05 13:17:54 crc kubenswrapper[4763]: I1205 13:17:54.199514 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqqj5\" (UniqueName: \"kubernetes.io/projected/1a1ef54e-2852-4229-9ffb-0cc470a48891-kube-api-access-nqqj5\") pod \"crc-debug-76qsd\" (UID: \"1a1ef54e-2852-4229-9ffb-0cc470a48891\") " pod="openshift-must-gather-qnz8t/crc-debug-76qsd" Dec 05 13:17:54 crc kubenswrapper[4763]: I1205 13:17:54.308190 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qnz8t/crc-debug-76qsd" Dec 05 13:17:55 crc kubenswrapper[4763]: I1205 13:17:55.084136 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qnz8t/crc-debug-76qsd" event={"ID":"1a1ef54e-2852-4229-9ffb-0cc470a48891","Type":"ContainerStarted","Data":"9fc65e753fbda2a0fb2fcc65b69cba508c5b391e8349c9f1620eccaf449b65be"} Dec 05 13:17:58 crc kubenswrapper[4763]: I1205 13:17:58.125634 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qnz8t/crc-debug-76qsd" event={"ID":"1a1ef54e-2852-4229-9ffb-0cc470a48891","Type":"ContainerStarted","Data":"31fc803381a100a5405be50f24d4503b08d851ab00b6c66cf5d7e917c2011bcd"} Dec 05 13:17:58 crc kubenswrapper[4763]: I1205 13:17:58.170595 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qnz8t/crc-debug-76qsd"] Dec 05 13:17:58 crc kubenswrapper[4763]: I1205 13:17:58.180856 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qnz8t/crc-debug-76qsd"] Dec 05 13:17:59 crc kubenswrapper[4763]: I1205 13:17:59.135596 4763 generic.go:334] "Generic (PLEG): container finished" podID="1a1ef54e-2852-4229-9ffb-0cc470a48891" containerID="31fc803381a100a5405be50f24d4503b08d851ab00b6c66cf5d7e917c2011bcd" exitCode=0 Dec 05 13:17:59 crc kubenswrapper[4763]: I1205 13:17:59.259583 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qnz8t/crc-debug-76qsd" Dec 05 13:17:59 crc kubenswrapper[4763]: I1205 13:17:59.383127 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1a1ef54e-2852-4229-9ffb-0cc470a48891-host\") pod \"1a1ef54e-2852-4229-9ffb-0cc470a48891\" (UID: \"1a1ef54e-2852-4229-9ffb-0cc470a48891\") " Dec 05 13:17:59 crc kubenswrapper[4763]: I1205 13:17:59.383256 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqqj5\" (UniqueName: \"kubernetes.io/projected/1a1ef54e-2852-4229-9ffb-0cc470a48891-kube-api-access-nqqj5\") pod \"1a1ef54e-2852-4229-9ffb-0cc470a48891\" (UID: \"1a1ef54e-2852-4229-9ffb-0cc470a48891\") " Dec 05 13:17:59 crc kubenswrapper[4763]: I1205 13:17:59.383289 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a1ef54e-2852-4229-9ffb-0cc470a48891-host" (OuterVolumeSpecName: "host") pod "1a1ef54e-2852-4229-9ffb-0cc470a48891" (UID: "1a1ef54e-2852-4229-9ffb-0cc470a48891"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 13:17:59 crc kubenswrapper[4763]: I1205 13:17:59.383954 4763 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1a1ef54e-2852-4229-9ffb-0cc470a48891-host\") on node \"crc\" DevicePath \"\"" Dec 05 13:17:59 crc kubenswrapper[4763]: I1205 13:17:59.389837 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a1ef54e-2852-4229-9ffb-0cc470a48891-kube-api-access-nqqj5" (OuterVolumeSpecName: "kube-api-access-nqqj5") pod "1a1ef54e-2852-4229-9ffb-0cc470a48891" (UID: "1a1ef54e-2852-4229-9ffb-0cc470a48891"). InnerVolumeSpecName "kube-api-access-nqqj5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:17:59 crc kubenswrapper[4763]: I1205 13:17:59.486247 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqqj5\" (UniqueName: \"kubernetes.io/projected/1a1ef54e-2852-4229-9ffb-0cc470a48891-kube-api-access-nqqj5\") on node \"crc\" DevicePath \"\"" Dec 05 13:17:59 crc kubenswrapper[4763]: I1205 13:17:59.793617 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a1ef54e-2852-4229-9ffb-0cc470a48891" path="/var/lib/kubelet/pods/1a1ef54e-2852-4229-9ffb-0cc470a48891/volumes" Dec 05 13:18:00 crc kubenswrapper[4763]: I1205 13:18:00.170378 4763 scope.go:117] "RemoveContainer" containerID="31fc803381a100a5405be50f24d4503b08d851ab00b6c66cf5d7e917c2011bcd" Dec 05 13:18:00 crc kubenswrapper[4763]: I1205 13:18:00.170401 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qnz8t/crc-debug-76qsd" Dec 05 13:18:00 crc kubenswrapper[4763]: I1205 13:18:00.784049 4763 scope.go:117] "RemoveContainer" containerID="667fb7d9fd133388c08f9883162f086ec3353ccdc0cda16fb2999b809ed9835f" Dec 05 13:18:00 crc kubenswrapper[4763]: E1205 13:18:00.784627 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:18:12 crc kubenswrapper[4763]: I1205 13:18:12.783923 4763 scope.go:117] "RemoveContainer" containerID="667fb7d9fd133388c08f9883162f086ec3353ccdc0cda16fb2999b809ed9835f" Dec 05 13:18:14 crc kubenswrapper[4763]: I1205 13:18:14.368091 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerStarted","Data":"d5a207a9aaaada6da735fff3ca1047890e1ffa0b0b0038e57f43b8011721d068"} Dec 05 13:18:27 crc kubenswrapper[4763]: I1205 13:18:27.699133 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7f846788f8-4gznp_80c384c8-1d13-47af-b978-f724e40e99af/barbican-api/0.log" Dec 05 13:18:27 crc kubenswrapper[4763]: I1205 13:18:27.752176 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7f846788f8-4gznp_80c384c8-1d13-47af-b978-f724e40e99af/barbican-api-log/0.log" Dec 05 13:18:27 crc kubenswrapper[4763]: I1205 13:18:27.904854 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-94d865894-tqt5m_76cf4acb-9763-4dac-9a2f-eba4a98314f0/barbican-keystone-listener/0.log" Dec 05 13:18:27 crc kubenswrapper[4763]: I1205 13:18:27.961299 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5446b6d8dc-p784q_d23af5c6-295f-4c65-90a1-02e66a41f325/barbican-worker/0.log" Dec 05 13:18:27 crc kubenswrapper[4763]: I1205 13:18:27.990057 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-94d865894-tqt5m_76cf4acb-9763-4dac-9a2f-eba4a98314f0/barbican-keystone-listener-log/0.log" Dec 05 13:18:28 crc kubenswrapper[4763]: I1205 13:18:28.090045 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-5446b6d8dc-p784q_d23af5c6-295f-4c65-90a1-02e66a41f325/barbican-worker-log/0.log" Dec 05 13:18:28 crc kubenswrapper[4763]: I1205 13:18:28.243953 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-f5cp2_a28052ee-43d2-4618-a981-ef115a2c3a00/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 13:18:28 crc kubenswrapper[4763]: I1205 13:18:28.353406 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_948c2855-16a9-47e2-96a4-70fe90181d9e/ceilometer-central-agent/0.log" Dec 05 13:18:28 crc kubenswrapper[4763]: I1205 13:18:28.469047 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_948c2855-16a9-47e2-96a4-70fe90181d9e/proxy-httpd/0.log" Dec 05 13:18:28 crc kubenswrapper[4763]: I1205 13:18:28.471693 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_948c2855-16a9-47e2-96a4-70fe90181d9e/ceilometer-notification-agent/0.log" Dec 05 13:18:28 crc kubenswrapper[4763]: I1205 13:18:28.511855 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_948c2855-16a9-47e2-96a4-70fe90181d9e/sg-core/0.log" Dec 05 13:18:28 crc kubenswrapper[4763]: I1205 13:18:28.733443 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_4027bf13-4c83-4281-8a0d-d18c6032e0af/cinder-api-log/0.log" Dec 05 13:18:28 crc kubenswrapper[4763]: I1205 13:18:28.735224 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_4027bf13-4c83-4281-8a0d-d18c6032e0af/cinder-api/0.log" Dec 05 13:18:28 crc kubenswrapper[4763]: I1205 13:18:28.883689 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_fd14a5e3-8def-4bc7-b375-8ae87dd75838/cinder-scheduler/0.log" Dec 05 13:18:28 crc kubenswrapper[4763]: I1205 13:18:28.972468 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_fd14a5e3-8def-4bc7-b375-8ae87dd75838/probe/0.log" Dec 05 13:18:29 crc kubenswrapper[4763]: I1205 13:18:29.071062 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-rl4hr_e3bafcb4-8ef9-4670-8202-f5c61d6d4c33/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 13:18:29 crc kubenswrapper[4763]: I1205 13:18:29.207514 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-gzt69_9169edb0-a8a3-4953-8472-6e496fced2e6/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 13:18:29 crc kubenswrapper[4763]: I1205 13:18:29.296684 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b6dc74c5-qtlzv_dd604323-58e1-439a-b0a4-66ad626de5a6/init/0.log" Dec 05 13:18:29 crc kubenswrapper[4763]: I1205 13:18:29.637218 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b6dc74c5-qtlzv_dd604323-58e1-439a-b0a4-66ad626de5a6/dnsmasq-dns/0.log" Dec 05 13:18:29 crc kubenswrapper[4763]: I1205 13:18:29.676247 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b6dc74c5-qtlzv_dd604323-58e1-439a-b0a4-66ad626de5a6/init/0.log" Dec 05 13:18:29 crc kubenswrapper[4763]: I1205 13:18:29.768728 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-qlwfh_8c7e581d-5684-4557-96f8-5502a00e1da1/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 13:18:29 crc kubenswrapper[4763]: I1205 13:18:29.882348 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_29573883-0e6d-40b3-9a6f-39308d6db246/glance-httpd/0.log" Dec 05 13:18:29 crc kubenswrapper[4763]: I1205 13:18:29.937945 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_29573883-0e6d-40b3-9a6f-39308d6db246/glance-log/0.log" Dec 05 13:18:30 crc kubenswrapper[4763]: I1205 13:18:30.119154 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_cdedbb4d-c325-420f-946f-942359580cfe/glance-httpd/0.log" Dec 05 13:18:30 crc kubenswrapper[4763]: I1205 13:18:30.154149 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_cdedbb4d-c325-420f-946f-942359580cfe/glance-log/0.log" Dec 05 13:18:30 crc kubenswrapper[4763]: I1205 13:18:30.364358 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-77dcd5c496-hs7bj_b34428a2-5423-401a-b7d3-aebd1d070945/horizon/0.log" Dec 05 13:18:30 crc kubenswrapper[4763]: I1205 13:18:30.466927 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9_5d234676-63b0-4c1c-804f-93d938e0ed84/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 13:18:30 crc kubenswrapper[4763]: I1205 13:18:30.713555 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-k85hd_4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 13:18:30 crc kubenswrapper[4763]: I1205 13:18:30.915714 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-77dcd5c496-hs7bj_b34428a2-5423-401a-b7d3-aebd1d070945/horizon-log/0.log" Dec 05 13:18:30 crc kubenswrapper[4763]: I1205 13:18:30.968048 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29415661-qdvsb_9c1231bc-00fe-4fb3-9fb7-7121743e17c9/keystone-cron/0.log" Dec 05 13:18:31 crc kubenswrapper[4763]: I1205 13:18:31.139548 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7fc67b9475-mqldq_2d728472-3cda-480b-b5dc-065969434f7d/keystone-api/0.log" Dec 05 13:18:31 crc kubenswrapper[4763]: I1205 13:18:31.198530 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_4521fb51-39ad-4717-8239-8d2a759d4a30/kube-state-metrics/0.log" Dec 05 13:18:31 crc kubenswrapper[4763]: I1205 13:18:31.262958 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4_4355ed47-63c1-47e1-81e6-33d33f89b5a7/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 13:18:31 crc kubenswrapper[4763]: I1205 13:18:31.330824 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-s6pzv"] Dec 05 13:18:31 crc kubenswrapper[4763]: E1205 13:18:31.331283 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a1ef54e-2852-4229-9ffb-0cc470a48891" containerName="container-00" Dec 05 13:18:31 crc kubenswrapper[4763]: I1205 13:18:31.331304 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a1ef54e-2852-4229-9ffb-0cc470a48891" 
containerName="container-00" Dec 05 13:18:31 crc kubenswrapper[4763]: I1205 13:18:31.331514 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a1ef54e-2852-4229-9ffb-0cc470a48891" containerName="container-00" Dec 05 13:18:31 crc kubenswrapper[4763]: I1205 13:18:31.333152 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s6pzv" Dec 05 13:18:31 crc kubenswrapper[4763]: I1205 13:18:31.352324 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s6pzv"] Dec 05 13:18:31 crc kubenswrapper[4763]: I1205 13:18:31.449181 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/795850cc-187b-4d22-b8bc-e9a88d2c5807-utilities\") pod \"community-operators-s6pzv\" (UID: \"795850cc-187b-4d22-b8bc-e9a88d2c5807\") " pod="openshift-marketplace/community-operators-s6pzv" Dec 05 13:18:31 crc kubenswrapper[4763]: I1205 13:18:31.449462 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/795850cc-187b-4d22-b8bc-e9a88d2c5807-catalog-content\") pod \"community-operators-s6pzv\" (UID: \"795850cc-187b-4d22-b8bc-e9a88d2c5807\") " pod="openshift-marketplace/community-operators-s6pzv" Dec 05 13:18:31 crc kubenswrapper[4763]: I1205 13:18:31.449573 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgp4k\" (UniqueName: \"kubernetes.io/projected/795850cc-187b-4d22-b8bc-e9a88d2c5807-kube-api-access-zgp4k\") pod \"community-operators-s6pzv\" (UID: \"795850cc-187b-4d22-b8bc-e9a88d2c5807\") " pod="openshift-marketplace/community-operators-s6pzv" Dec 05 13:18:31 crc kubenswrapper[4763]: I1205 13:18:31.550906 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgp4k\" (UniqueName: \"kubernetes.io/projected/795850cc-187b-4d22-b8bc-e9a88d2c5807-kube-api-access-zgp4k\") pod \"community-operators-s6pzv\" (UID: \"795850cc-187b-4d22-b8bc-e9a88d2c5807\") " pod="openshift-marketplace/community-operators-s6pzv" Dec 05 13:18:31 crc kubenswrapper[4763]: I1205 13:18:31.550996 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/795850cc-187b-4d22-b8bc-e9a88d2c5807-utilities\") pod \"community-operators-s6pzv\" (UID: \"795850cc-187b-4d22-b8bc-e9a88d2c5807\") " pod="openshift-marketplace/community-operators-s6pzv" Dec 05 13:18:31 crc kubenswrapper[4763]: I1205 13:18:31.551037 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/795850cc-187b-4d22-b8bc-e9a88d2c5807-catalog-content\") pod \"community-operators-s6pzv\" (UID: \"795850cc-187b-4d22-b8bc-e9a88d2c5807\") " pod="openshift-marketplace/community-operators-s6pzv" Dec 05 13:18:31 crc kubenswrapper[4763]: I1205 13:18:31.551738 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/795850cc-187b-4d22-b8bc-e9a88d2c5807-catalog-content\") pod \"community-operators-s6pzv\" (UID: \"795850cc-187b-4d22-b8bc-e9a88d2c5807\") " pod="openshift-marketplace/community-operators-s6pzv" Dec 05 13:18:31 crc kubenswrapper[4763]: I1205 13:18:31.551922 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/795850cc-187b-4d22-b8bc-e9a88d2c5807-utilities\") pod \"community-operators-s6pzv\" (UID: \"795850cc-187b-4d22-b8bc-e9a88d2c5807\") " pod="openshift-marketplace/community-operators-s6pzv" Dec 05 13:18:31 crc kubenswrapper[4763]: I1205 13:18:31.579866 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgp4k\" (UniqueName: \"kubernetes.io/projected/795850cc-187b-4d22-b8bc-e9a88d2c5807-kube-api-access-zgp4k\") pod \"community-operators-s6pzv\" (UID: \"795850cc-187b-4d22-b8bc-e9a88d2c5807\") " pod="openshift-marketplace/community-operators-s6pzv" Dec 05 13:18:31 crc kubenswrapper[4763]: I1205 13:18:31.651008 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s6pzv" Dec 05 13:18:31 crc kubenswrapper[4763]: I1205 13:18:31.795957 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-b55c974d9-brgnw_6d6c980e-688d-41b3-a7ad-0061b07b9494/neutron-httpd/0.log" Dec 05 13:18:31 crc kubenswrapper[4763]: I1205 13:18:31.930120 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-b55c974d9-brgnw_6d6c980e-688d-41b3-a7ad-0061b07b9494/neutron-api/0.log" Dec 05 13:18:32 crc kubenswrapper[4763]: I1205 13:18:32.153827 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5_d790dbae-6bb4-4b37-b9bd-0ba454c8fa83/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 13:18:32 crc kubenswrapper[4763]: I1205 13:18:32.258636 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s6pzv"] Dec 05 13:18:33 crc kubenswrapper[4763]: I1205 13:18:33.025078 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_4319bf0f-65c6-401b-96dd-53e10a73c011/nova-cell0-conductor-conductor/0.log" Dec 05 13:18:33 crc kubenswrapper[4763]: I1205 13:18:33.374112 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_0f31e12e-2f94-40a3-a522-1aa44cb1cdbf/nova-cell1-conductor-conductor/0.log" Dec 05 13:18:33 crc kubenswrapper[4763]: I1205 13:18:33.404428 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2ad0e748-bb1a-4b4f-bc70-f059e4fc3614/nova-api-api/0.log" Dec 05 13:18:33 crc kubenswrapper[4763]: I1205 13:18:33.554810 4763 generic.go:334] "Generic (PLEG): container finished" podID="795850cc-187b-4d22-b8bc-e9a88d2c5807" containerID="b76813082bcc669be34718bbd8d5f7e27c1e7a69d7cb17baf9cf5b63d6001c45" exitCode=0 Dec 05 13:18:33 crc kubenswrapper[4763]: I1205 13:18:33.554882 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6pzv" event={"ID":"795850cc-187b-4d22-b8bc-e9a88d2c5807","Type":"ContainerDied","Data":"b76813082bcc669be34718bbd8d5f7e27c1e7a69d7cb17baf9cf5b63d6001c45"} Dec 05 13:18:33 crc kubenswrapper[4763]: I1205 13:18:33.554911 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6pzv" event={"ID":"795850cc-187b-4d22-b8bc-e9a88d2c5807","Type":"ContainerStarted","Data":"a458f6eb23850259c91e6f572742cbe777a993d999854c1080776ef2cb30b307"} Dec 05 13:18:33 crc kubenswrapper[4763]: I1205 13:18:33.693149 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-novncproxy-0_c3577198-f0dd-4145-a9c5-d29a0d18d212/nova-cell1-novncproxy-novncproxy/0.log" Dec 05 13:18:33 crc kubenswrapper[4763]: I1205 13:18:33.743272 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-mrrcd_e3335529-4636-46d2-b949-1d02a4c43ee0/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 13:18:33 crc kubenswrapper[4763]: I1205 13:18:33.776974 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2ad0e748-bb1a-4b4f-bc70-f059e4fc3614/nova-api-log/0.log" Dec 05 13:18:34 crc kubenswrapper[4763]: I1205 13:18:34.063573 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_cb8fec18-c1b6-47de-91cc-7ef68caceb0e/nova-metadata-log/0.log" Dec 05 13:18:34 crc kubenswrapper[4763]: I1205 13:18:34.417594 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_e011a0f4-fec9-4c12-a229-2e63ef03037d/nova-scheduler-scheduler/0.log" Dec 05 13:18:34 crc kubenswrapper[4763]: I1205 13:18:34.988509 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a066fad3-20a3-41d6-852d-7196f8445e2a/mysql-bootstrap/0.log" Dec 05 13:18:35 crc kubenswrapper[4763]: I1205 13:18:35.220217 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a066fad3-20a3-41d6-852d-7196f8445e2a/mysql-bootstrap/0.log" Dec 05 13:18:35 crc kubenswrapper[4763]: I1205 13:18:35.230742 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a066fad3-20a3-41d6-852d-7196f8445e2a/galera/0.log" Dec 05 13:18:35 crc kubenswrapper[4763]: I1205 13:18:35.432068 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d5f22311-2f88-40cb-a35d-e0609433db1a/mysql-bootstrap/0.log" Dec 05 13:18:35 crc kubenswrapper[4763]: I1205 13:18:35.574219 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6pzv" event={"ID":"795850cc-187b-4d22-b8bc-e9a88d2c5807","Type":"ContainerStarted","Data":"786db9cbd2337232319436f5f8637e6902a0bdbc488d9e2a02c26b04b42a2029"} Dec 05 13:18:35 crc kubenswrapper[4763]: I1205 13:18:35.654659 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d5f22311-2f88-40cb-a35d-e0609433db1a/mysql-bootstrap/0.log" Dec 05 13:18:35 crc kubenswrapper[4763]: I1205 13:18:35.716911 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d5f22311-2f88-40cb-a35d-e0609433db1a/galera/0.log" Dec 05 13:18:35 crc kubenswrapper[4763]: I1205 13:18:35.892851 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_9e894c53-51db-4ede-9730-b8c68ad6fc15/openstackclient/0.log" Dec 05 13:18:35 crc kubenswrapper[4763]: I1205 13:18:35.913848 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-n46f2"] Dec 05 13:18:35 crc kubenswrapper[4763]: I1205 13:18:35.916250 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n46f2" Dec 05 13:18:35 crc kubenswrapper[4763]: I1205 13:18:35.922291 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n46f2"] Dec 05 13:18:35 crc kubenswrapper[4763]: I1205 13:18:35.969725 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-6gw4w_c9acbf99-ec01-4de6-9d45-418664511586/ovn-controller/0.log" Dec 05 13:18:36 crc kubenswrapper[4763]: I1205 13:18:36.007722 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_cb8fec18-c1b6-47de-91cc-7ef68caceb0e/nova-metadata-metadata/0.log" Dec 05 13:18:36 crc kubenswrapper[4763]: I1205 13:18:36.052553 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nj6s\" (UniqueName: \"kubernetes.io/projected/51dd044c-e4e0-4273-9af1-f30eb7139f03-kube-api-access-9nj6s\") pod \"redhat-marketplace-n46f2\" (UID: \"51dd044c-e4e0-4273-9af1-f30eb7139f03\") " pod="openshift-marketplace/redhat-marketplace-n46f2" Dec 05 13:18:36 crc kubenswrapper[4763]: I1205 13:18:36.052896 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51dd044c-e4e0-4273-9af1-f30eb7139f03-utilities\") pod \"redhat-marketplace-n46f2\" (UID: \"51dd044c-e4e0-4273-9af1-f30eb7139f03\") " pod="openshift-marketplace/redhat-marketplace-n46f2" Dec 05 13:18:36 crc kubenswrapper[4763]: I1205 13:18:36.053045 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51dd044c-e4e0-4273-9af1-f30eb7139f03-catalog-content\") pod \"redhat-marketplace-n46f2\" (UID: \"51dd044c-e4e0-4273-9af1-f30eb7139f03\") " pod="openshift-marketplace/redhat-marketplace-n46f2" Dec 05 13:18:36 crc kubenswrapper[4763]: I1205 13:18:36.154873 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nj6s\" (UniqueName: \"kubernetes.io/projected/51dd044c-e4e0-4273-9af1-f30eb7139f03-kube-api-access-9nj6s\") pod \"redhat-marketplace-n46f2\" (UID: \"51dd044c-e4e0-4273-9af1-f30eb7139f03\") " pod="openshift-marketplace/redhat-marketplace-n46f2" Dec 05 13:18:36 crc kubenswrapper[4763]: I1205 13:18:36.154961 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51dd044c-e4e0-4273-9af1-f30eb7139f03-utilities\") pod \"redhat-marketplace-n46f2\" (UID: \"51dd044c-e4e0-4273-9af1-f30eb7139f03\") " pod="openshift-marketplace/redhat-marketplace-n46f2" Dec 05 13:18:36 crc kubenswrapper[4763]: I1205 13:18:36.154996 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51dd044c-e4e0-4273-9af1-f30eb7139f03-catalog-content\") pod \"redhat-marketplace-n46f2\" (UID: \"51dd044c-e4e0-4273-9af1-f30eb7139f03\") " pod="openshift-marketplace/redhat-marketplace-n46f2" Dec 05 13:18:36 crc kubenswrapper[4763]: I1205 13:18:36.155555 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51dd044c-e4e0-4273-9af1-f30eb7139f03-catalog-content\") pod \"redhat-marketplace-n46f2\" (UID: \"51dd044c-e4e0-4273-9af1-f30eb7139f03\") " pod="openshift-marketplace/redhat-marketplace-n46f2" Dec 05 13:18:36 crc kubenswrapper[4763]: I1205 
13:18:36.156162 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51dd044c-e4e0-4273-9af1-f30eb7139f03-utilities\") pod \"redhat-marketplace-n46f2\" (UID: \"51dd044c-e4e0-4273-9af1-f30eb7139f03\") " pod="openshift-marketplace/redhat-marketplace-n46f2" Dec 05 13:18:36 crc kubenswrapper[4763]: I1205 13:18:36.175639 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nj6s\" (UniqueName: \"kubernetes.io/projected/51dd044c-e4e0-4273-9af1-f30eb7139f03-kube-api-access-9nj6s\") pod \"redhat-marketplace-n46f2\" (UID: \"51dd044c-e4e0-4273-9af1-f30eb7139f03\") " pod="openshift-marketplace/redhat-marketplace-n46f2" Dec 05 13:18:36 crc kubenswrapper[4763]: I1205 13:18:36.248381 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n46f2" Dec 05 13:18:36 crc kubenswrapper[4763]: I1205 13:18:36.397547 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-87zsg_08357e6b-d21e-4b50-8af2-22ddc7398fbc/openstack-network-exporter/0.log" Dec 05 13:18:36 crc kubenswrapper[4763]: I1205 13:18:36.525985 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dzkm7_80ec3b73-a380-499b-b4d1-054a6b2ab4a6/ovsdb-server-init/0.log" Dec 05 13:18:36 crc kubenswrapper[4763]: I1205 13:18:36.526165 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dzkm7_80ec3b73-a380-499b-b4d1-054a6b2ab4a6/ovsdb-server-init/0.log" Dec 05 13:18:36 crc kubenswrapper[4763]: I1205 13:18:36.580639 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dzkm7_80ec3b73-a380-499b-b4d1-054a6b2ab4a6/ovs-vswitchd/0.log" Dec 05 13:18:36 crc kubenswrapper[4763]: I1205 13:18:36.640623 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dzkm7_80ec3b73-a380-499b-b4d1-054a6b2ab4a6/ovsdb-server/0.log" Dec 05 13:18:36 crc kubenswrapper[4763]: I1205 13:18:36.754927 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n46f2"] Dec 05 13:18:36 crc kubenswrapper[4763]: I1205 13:18:36.836428 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-sl4g2_688c0399-83be-44e3-adc0-4288525a9f4b/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 13:18:36 crc kubenswrapper[4763]: I1205 13:18:36.904728 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c/openstack-network-exporter/0.log" Dec 05 13:18:37 crc kubenswrapper[4763]: I1205 13:18:37.066656 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c/ovn-northd/0.log" Dec 05 13:18:37 crc kubenswrapper[4763]: I1205 13:18:37.085471 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9680c542-fe6f-42cb-b48d-e17b80916e50/openstack-network-exporter/0.log" Dec 05 13:18:37 crc kubenswrapper[4763]: I1205 13:18:37.169597 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9680c542-fe6f-42cb-b48d-e17b80916e50/ovsdbserver-nb/0.log" Dec 05 13:18:37 crc kubenswrapper[4763]: I1205 13:18:37.312324 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-0_a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1/openstack-network-exporter/0.log" Dec 05 13:18:37 crc kubenswrapper[4763]: I1205 13:18:37.405697 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1/ovsdbserver-sb/0.log" Dec 05 13:18:37 crc kubenswrapper[4763]: I1205 13:18:37.618863 4763 generic.go:334] "Generic (PLEG): container finished" podID="795850cc-187b-4d22-b8bc-e9a88d2c5807" containerID="786db9cbd2337232319436f5f8637e6902a0bdbc488d9e2a02c26b04b42a2029" exitCode=0 Dec 05 13:18:37 crc kubenswrapper[4763]: I1205 13:18:37.618924 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6pzv" event={"ID":"795850cc-187b-4d22-b8bc-e9a88d2c5807","Type":"ContainerDied","Data":"786db9cbd2337232319436f5f8637e6902a0bdbc488d9e2a02c26b04b42a2029"} Dec 05 13:18:37 crc kubenswrapper[4763]: I1205 13:18:37.621531 4763 generic.go:334] "Generic (PLEG): container finished" podID="51dd044c-e4e0-4273-9af1-f30eb7139f03" containerID="57f806d8ab44d52f85dd1ce73ccea26e5f3c1c31407dfa3640bb9ae519905541" exitCode=0 Dec 05 13:18:37 crc kubenswrapper[4763]: I1205 13:18:37.621594 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n46f2" event={"ID":"51dd044c-e4e0-4273-9af1-f30eb7139f03","Type":"ContainerDied","Data":"57f806d8ab44d52f85dd1ce73ccea26e5f3c1c31407dfa3640bb9ae519905541"} Dec 05 13:18:37 crc kubenswrapper[4763]: I1205 13:18:37.621624 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n46f2" event={"ID":"51dd044c-e4e0-4273-9af1-f30eb7139f03","Type":"ContainerStarted","Data":"d3a281e0090bb17de51f18ea3a7cfbe9225a267518cd89521b25d7e1588c8631"} Dec 05 13:18:37 crc kubenswrapper[4763]: I1205 13:18:37.740411 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_aac1b695-9685-4f3f-bc5d-d1262bb44992/init-config-reloader/0.log" Dec 05 13:18:37 crc kubenswrapper[4763]: I1205 13:18:37.833425 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-76d66464d-r24j6_7e5e41f7-ee3c-4587-bae0-5716c12c84b6/placement-api/0.log" Dec 05 13:18:37 crc kubenswrapper[4763]: I1205 13:18:37.878688 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-76d66464d-r24j6_7e5e41f7-ee3c-4587-bae0-5716c12c84b6/placement-log/0.log" Dec 05 13:18:37 crc kubenswrapper[4763]: I1205 13:18:37.897473 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_aac1b695-9685-4f3f-bc5d-d1262bb44992/init-config-reloader/0.log" Dec 05 13:18:37 crc kubenswrapper[4763]: I1205 13:18:37.943683 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_aac1b695-9685-4f3f-bc5d-d1262bb44992/config-reloader/0.log" Dec 05 13:18:38 crc kubenswrapper[4763]: I1205 13:18:38.087098 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_aac1b695-9685-4f3f-bc5d-d1262bb44992/prometheus/0.log" Dec 05 13:18:38 crc kubenswrapper[4763]: I1205 13:18:38.134090 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_aac1b695-9685-4f3f-bc5d-d1262bb44992/thanos-sidecar/0.log" Dec 05 13:18:38 crc kubenswrapper[4763]: I1205 13:18:38.178325 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-cell1-server-0_0c59087a-448f-41c2-a85b-6ccd0ddbecc1/setup-container/0.log" Dec 05 13:18:38 crc kubenswrapper[4763]: I1205 13:18:38.328723 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b72g6"] Dec 05 13:18:38 crc kubenswrapper[4763]: I1205 13:18:38.330864 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b72g6" Dec 05 13:18:38 crc kubenswrapper[4763]: I1205 13:18:38.371575 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b72g6"] Dec 05 13:18:38 crc kubenswrapper[4763]: I1205 13:18:38.410867 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d61dd45-0ea0-4581-9325-b42539c14249-utilities\") pod \"certified-operators-b72g6\" (UID: \"6d61dd45-0ea0-4581-9325-b42539c14249\") " pod="openshift-marketplace/certified-operators-b72g6" Dec 05 13:18:38 crc kubenswrapper[4763]: I1205 13:18:38.410963 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvlgv\" (UniqueName: \"kubernetes.io/projected/6d61dd45-0ea0-4581-9325-b42539c14249-kube-api-access-gvlgv\") pod \"certified-operators-b72g6\" (UID: \"6d61dd45-0ea0-4581-9325-b42539c14249\") " pod="openshift-marketplace/certified-operators-b72g6" Dec 05 13:18:38 crc kubenswrapper[4763]: I1205 13:18:38.411078 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d61dd45-0ea0-4581-9325-b42539c14249-catalog-content\") pod \"certified-operators-b72g6\" (UID: \"6d61dd45-0ea0-4581-9325-b42539c14249\") " pod="openshift-marketplace/certified-operators-b72g6" Dec 05 13:18:38 crc kubenswrapper[4763]: I1205 13:18:38.432717 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_0c59087a-448f-41c2-a85b-6ccd0ddbecc1/setup-container/0.log" Dec 05 13:18:38 crc kubenswrapper[4763]: I1205 13:18:38.513580 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d61dd45-0ea0-4581-9325-b42539c14249-utilities\") pod \"certified-operators-b72g6\" (UID: \"6d61dd45-0ea0-4581-9325-b42539c14249\") " pod="openshift-marketplace/certified-operators-b72g6" Dec 05 13:18:38 crc kubenswrapper[4763]: I1205 13:18:38.513885 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvlgv\" (UniqueName: \"kubernetes.io/projected/6d61dd45-0ea0-4581-9325-b42539c14249-kube-api-access-gvlgv\") pod \"certified-operators-b72g6\" (UID: \"6d61dd45-0ea0-4581-9325-b42539c14249\") " pod="openshift-marketplace/certified-operators-b72g6" Dec 05 13:18:38 crc kubenswrapper[4763]: I1205 13:18:38.513979 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d61dd45-0ea0-4581-9325-b42539c14249-catalog-content\") pod \"certified-operators-b72g6\" (UID: \"6d61dd45-0ea0-4581-9325-b42539c14249\") " pod="openshift-marketplace/certified-operators-b72g6" Dec 05 13:18:38 crc kubenswrapper[4763]: I1205 13:18:38.514104 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_0c59087a-448f-41c2-a85b-6ccd0ddbecc1/rabbitmq/0.log" Dec 05 13:18:38 crc 
kubenswrapper[4763]: I1205 13:18:38.514302 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d61dd45-0ea0-4581-9325-b42539c14249-utilities\") pod \"certified-operators-b72g6\" (UID: \"6d61dd45-0ea0-4581-9325-b42539c14249\") " pod="openshift-marketplace/certified-operators-b72g6" Dec 05 13:18:38 crc kubenswrapper[4763]: I1205 13:18:38.514388 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d61dd45-0ea0-4581-9325-b42539c14249-catalog-content\") pod \"certified-operators-b72g6\" (UID: \"6d61dd45-0ea0-4581-9325-b42539c14249\") " pod="openshift-marketplace/certified-operators-b72g6" Dec 05 13:18:38 crc kubenswrapper[4763]: I1205 13:18:38.537843 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvlgv\" (UniqueName: \"kubernetes.io/projected/6d61dd45-0ea0-4581-9325-b42539c14249-kube-api-access-gvlgv\") pod \"certified-operators-b72g6\" (UID: \"6d61dd45-0ea0-4581-9325-b42539c14249\") " pod="openshift-marketplace/certified-operators-b72g6" Dec 05 13:18:38 crc kubenswrapper[4763]: I1205 13:18:38.610960 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c2041e23-d29c-4a1a-9787-aa0e19c9f764/setup-container/0.log" Dec 05 13:18:38 crc kubenswrapper[4763]: I1205 13:18:38.680242 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b72g6" Dec 05 13:18:39 crc kubenswrapper[4763]: I1205 13:18:39.012457 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c2041e23-d29c-4a1a-9787-aa0e19c9f764/rabbitmq/0.log" Dec 05 13:18:39 crc kubenswrapper[4763]: I1205 13:18:39.046086 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c2041e23-d29c-4a1a-9787-aa0e19c9f764/setup-container/0.log" Dec 05 13:18:39 crc kubenswrapper[4763]: I1205 13:18:39.210324 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-mfww4_576fa469-1138-4580-b637-66ec5a5e101e/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 13:18:39 crc kubenswrapper[4763]: I1205 13:18:39.272446 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b72g6"] Dec 05 13:18:39 crc kubenswrapper[4763]: I1205 13:18:39.455682 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-56jk4_783891d5-537e-4f2f-b3ee-326588a913f6/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 13:18:39 crc kubenswrapper[4763]: I1205 13:18:39.497440 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-4ghbz_fd34c478-732e-49a0-ab4a-c35fdf054b3c/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 13:18:39 crc kubenswrapper[4763]: I1205 13:18:39.643182 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b72g6" event={"ID":"6d61dd45-0ea0-4581-9325-b42539c14249","Type":"ContainerStarted","Data":"f4c26e861df67186bc59ebfbd94630a44ba8bc4d9a496327cd37d879f1f55330"} Dec 05 13:18:39 crc kubenswrapper[4763]: I1205 13:18:39.643453 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b72g6" 
event={"ID":"6d61dd45-0ea0-4581-9325-b42539c14249","Type":"ContainerStarted","Data":"93690f7e97a0fa9dcf5ed249c434e1b73802f4d5842cb88adedf498f5798b54f"} Dec 05 13:18:39 crc kubenswrapper[4763]: I1205 13:18:39.645986 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6pzv" event={"ID":"795850cc-187b-4d22-b8bc-e9a88d2c5807","Type":"ContainerStarted","Data":"0eae43a0cc6d0e26b459eb41e74d2cb8b1bdd8dcae10416db7b0405a499a7626"} Dec 05 13:18:39 crc kubenswrapper[4763]: I1205 13:18:39.668480 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-s6pzv" podStartSLOduration=3.945056831 podStartE2EDuration="8.668458936s" podCreationTimestamp="2025-12-05 13:18:31 +0000 UTC" firstStartedPulling="2025-12-05 13:18:33.556967784 +0000 UTC m=+5398.049682507" lastFinishedPulling="2025-12-05 13:18:38.280369889 +0000 UTC m=+5402.773084612" observedRunningTime="2025-12-05 13:18:39.662904387 +0000 UTC m=+5404.155619110" watchObservedRunningTime="2025-12-05 13:18:39.668458936 +0000 UTC m=+5404.161173659" Dec 05 13:18:39 crc kubenswrapper[4763]: I1205 13:18:39.751565 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-7r8pv_b73d98c2-daca-4632-9c2e-1ab408ec4ac5/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 13:18:39 crc kubenswrapper[4763]: I1205 13:18:39.795219 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-gtkd8_23d776eb-9b6f-439e-8938-2aea4708e154/ssh-known-hosts-edpm-deployment/0.log" Dec 05 13:18:40 crc kubenswrapper[4763]: I1205 13:18:40.025663 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5c4499b47f-s4mh4_fe2a82f8-601f-42ea-a495-4d1a03084267/proxy-server/0.log" Dec 05 13:18:40 crc kubenswrapper[4763]: I1205 13:18:40.261633 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-pzxb5_a38e41f6-6247-4c91-abba-0bc65d1c2127/swift-ring-rebalance/0.log" Dec 05 13:18:40 crc kubenswrapper[4763]: I1205 13:18:40.288540 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5c4499b47f-s4mh4_fe2a82f8-601f-42ea-a495-4d1a03084267/proxy-httpd/0.log" Dec 05 13:18:40 crc kubenswrapper[4763]: I1205 13:18:40.385991 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1851124e-2722-4628-8e5b-63edb828d64a/account-auditor/0.log" Dec 05 13:18:40 crc kubenswrapper[4763]: I1205 13:18:40.483208 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1851124e-2722-4628-8e5b-63edb828d64a/account-reaper/0.log" Dec 05 13:18:40 crc kubenswrapper[4763]: I1205 13:18:40.649596 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1851124e-2722-4628-8e5b-63edb828d64a/container-auditor/0.log" Dec 05 13:18:40 crc kubenswrapper[4763]: I1205 13:18:40.661444 4763 generic.go:334] "Generic (PLEG): container finished" podID="51dd044c-e4e0-4273-9af1-f30eb7139f03" containerID="2ea4b6741f8dd49deb5cc1dd1da1ccfa6e81f6e406902cb1aacbcd78decb74e5" exitCode=0 Dec 05 13:18:40 crc kubenswrapper[4763]: I1205 13:18:40.661812 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n46f2" event={"ID":"51dd044c-e4e0-4273-9af1-f30eb7139f03","Type":"ContainerDied","Data":"2ea4b6741f8dd49deb5cc1dd1da1ccfa6e81f6e406902cb1aacbcd78decb74e5"} Dec 05 13:18:40 crc 
kubenswrapper[4763]: I1205 13:18:40.666433 4763 generic.go:334] "Generic (PLEG): container finished" podID="6d61dd45-0ea0-4581-9325-b42539c14249" containerID="f4c26e861df67186bc59ebfbd94630a44ba8bc4d9a496327cd37d879f1f55330" exitCode=0 Dec 05 13:18:40 crc kubenswrapper[4763]: I1205 13:18:40.666478 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b72g6" event={"ID":"6d61dd45-0ea0-4581-9325-b42539c14249","Type":"ContainerDied","Data":"f4c26e861df67186bc59ebfbd94630a44ba8bc4d9a496327cd37d879f1f55330"} Dec 05 13:18:40 crc kubenswrapper[4763]: I1205 13:18:40.683311 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1851124e-2722-4628-8e5b-63edb828d64a/account-server/0.log" Dec 05 13:18:40 crc kubenswrapper[4763]: I1205 13:18:40.707093 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1851124e-2722-4628-8e5b-63edb828d64a/account-replicator/0.log" Dec 05 13:18:40 crc kubenswrapper[4763]: I1205 13:18:40.792778 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1851124e-2722-4628-8e5b-63edb828d64a/container-replicator/0.log" Dec 05 13:18:40 crc kubenswrapper[4763]: I1205 13:18:40.930398 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1851124e-2722-4628-8e5b-63edb828d64a/container-server/0.log" Dec 05 13:18:40 crc kubenswrapper[4763]: I1205 13:18:40.940950 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1851124e-2722-4628-8e5b-63edb828d64a/container-updater/0.log" Dec 05 13:18:40 crc kubenswrapper[4763]: I1205 13:18:40.992141 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1851124e-2722-4628-8e5b-63edb828d64a/object-auditor/0.log" Dec 05 13:18:41 crc kubenswrapper[4763]: I1205 13:18:41.054537 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1851124e-2722-4628-8e5b-63edb828d64a/object-expirer/0.log" Dec 05 13:18:41 crc kubenswrapper[4763]: I1205 13:18:41.174718 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1851124e-2722-4628-8e5b-63edb828d64a/object-server/0.log" Dec 05 13:18:41 crc kubenswrapper[4763]: I1205 13:18:41.227085 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1851124e-2722-4628-8e5b-63edb828d64a/object-replicator/0.log" Dec 05 13:18:41 crc kubenswrapper[4763]: I1205 13:18:41.232792 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1851124e-2722-4628-8e5b-63edb828d64a/object-updater/0.log" Dec 05 13:18:41 crc kubenswrapper[4763]: I1205 13:18:41.235231 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1851124e-2722-4628-8e5b-63edb828d64a/rsync/0.log" Dec 05 13:18:41 crc kubenswrapper[4763]: I1205 13:18:41.443313 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1851124e-2722-4628-8e5b-63edb828d64a/swift-recon-cron/0.log" Dec 05 13:18:41 crc kubenswrapper[4763]: I1205 13:18:41.655168 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-s6pzv" Dec 05 13:18:41 crc kubenswrapper[4763]: I1205 13:18:41.655231 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-s6pzv" Dec 05 13:18:41 crc kubenswrapper[4763]: I1205 13:18:41.716449 
4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-s6pzv" Dec 05 13:18:41 crc kubenswrapper[4763]: I1205 13:18:41.868896 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_9a56e3c4-fdad-4c05-b4f4-9a155afc3239/test-operator-logs-container/0.log" Dec 05 13:18:42 crc kubenswrapper[4763]: I1205 13:18:42.029826 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_295e994b-9be5-4486-beb7-6be00576c5c3/tempest-tests-tempest-tests-runner/0.log" Dec 05 13:18:42 crc kubenswrapper[4763]: I1205 13:18:42.043426 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm_f5d27328-7e5a-4664-9c0a-ae5c063ec8b9/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 13:18:42 crc kubenswrapper[4763]: I1205 13:18:42.380753 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_cc4cab9d-2172-424c-88ca-962ec052d0c3/memcached/0.log" Dec 05 13:18:42 crc kubenswrapper[4763]: I1205 13:18:42.582168 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-6d8v9_1ed4f328-73dd-4e34-91c4-b68898c59d74/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 13:18:42 crc kubenswrapper[4763]: I1205 13:18:42.700780 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b72g6" event={"ID":"6d61dd45-0ea0-4581-9325-b42539c14249","Type":"ContainerStarted","Data":"4ac4c8d59fe1ba3752eb306c70b5420ecfbd33fe29e695f69e85f2321702cef6"} Dec 05 13:18:42 crc kubenswrapper[4763]: I1205 13:18:42.703491 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n46f2" event={"ID":"51dd044c-e4e0-4273-9af1-f30eb7139f03","Type":"ContainerStarted","Data":"d46cc0809d66b25f0c9c4cf146924487d35c157c1d09139ff7a14c80fec2bb88"} Dec 05 13:18:42 crc kubenswrapper[4763]: I1205 13:18:42.751799 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-n46f2" podStartSLOduration=3.192686897 podStartE2EDuration="7.751777392s" podCreationTimestamp="2025-12-05 13:18:35 +0000 UTC" firstStartedPulling="2025-12-05 13:18:37.624088119 +0000 UTC m=+5402.116802842" lastFinishedPulling="2025-12-05 13:18:42.183178624 +0000 UTC m=+5406.675893337" observedRunningTime="2025-12-05 13:18:42.743375806 +0000 UTC m=+5407.236090529" watchObservedRunningTime="2025-12-05 13:18:42.751777392 +0000 UTC m=+5407.244492115" Dec 05 13:18:43 crc kubenswrapper[4763]: I1205 13:18:43.298088 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_8bdc880b-2a69-4a8c-920f-6b0b04c7ecfe/watcher-applier/0.log" Dec 05 13:18:44 crc kubenswrapper[4763]: I1205 13:18:44.157403 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_37a4b06b-53bd-4f53-89b7-4d5a53554510/watcher-api-log/0.log" Dec 05 13:18:44 crc kubenswrapper[4763]: I1205 13:18:44.261456 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_819c6a72-3e2a-4445-8abf-1a10f8eaab9b/watcher-decision-engine/0.log" Dec 05 13:18:45 crc kubenswrapper[4763]: I1205 13:18:45.750835 4763 generic.go:334] "Generic (PLEG): container finished" podID="6d61dd45-0ea0-4581-9325-b42539c14249" 
containerID="4ac4c8d59fe1ba3752eb306c70b5420ecfbd33fe29e695f69e85f2321702cef6" exitCode=0 Dec 05 13:18:45 crc kubenswrapper[4763]: I1205 13:18:45.751463 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b72g6" event={"ID":"6d61dd45-0ea0-4581-9325-b42539c14249","Type":"ContainerDied","Data":"4ac4c8d59fe1ba3752eb306c70b5420ecfbd33fe29e695f69e85f2321702cef6"} Dec 05 13:18:46 crc kubenswrapper[4763]: I1205 13:18:46.188823 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_37a4b06b-53bd-4f53-89b7-4d5a53554510/watcher-api/0.log" Dec 05 13:18:46 crc kubenswrapper[4763]: I1205 13:18:46.248592 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-n46f2" Dec 05 13:18:46 crc kubenswrapper[4763]: I1205 13:18:46.248663 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-n46f2" Dec 05 13:18:46 crc kubenswrapper[4763]: I1205 13:18:46.299957 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-n46f2" Dec 05 13:18:47 crc kubenswrapper[4763]: I1205 13:18:47.781678 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b72g6" event={"ID":"6d61dd45-0ea0-4581-9325-b42539c14249","Type":"ContainerStarted","Data":"10a873104d0e5c0653b5ad890c970f21c0124cca6f999df445a726c6b6851a04"} Dec 05 13:18:47 crc kubenswrapper[4763]: I1205 13:18:47.809443 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b72g6" podStartSLOduration=4.30040299 podStartE2EDuration="9.809425976s" podCreationTimestamp="2025-12-05 13:18:38 +0000 UTC" firstStartedPulling="2025-12-05 13:18:40.667864023 +0000 UTC m=+5405.160578746" lastFinishedPulling="2025-12-05 13:18:46.176887009 +0000 UTC m=+5410.669601732" observedRunningTime="2025-12-05 13:18:47.801648448 +0000 UTC m=+5412.294363171" watchObservedRunningTime="2025-12-05 13:18:47.809425976 +0000 UTC m=+5412.302140699" Dec 05 13:18:47 crc kubenswrapper[4763]: I1205 13:18:47.841403 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-n46f2" Dec 05 13:18:48 crc kubenswrapper[4763]: I1205 13:18:48.681064 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b72g6" Dec 05 13:18:48 crc kubenswrapper[4763]: I1205 13:18:48.681110 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b72g6" Dec 05 13:18:48 crc kubenswrapper[4763]: I1205 13:18:48.703172 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n46f2"] Dec 05 13:18:48 crc kubenswrapper[4763]: I1205 13:18:48.734028 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b72g6" Dec 05 13:18:49 crc kubenswrapper[4763]: I1205 13:18:49.797663 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-n46f2" podUID="51dd044c-e4e0-4273-9af1-f30eb7139f03" containerName="registry-server" containerID="cri-o://d46cc0809d66b25f0c9c4cf146924487d35c157c1d09139ff7a14c80fec2bb88" gracePeriod=2 Dec 05 13:18:50 crc kubenswrapper[4763]: I1205 13:18:50.382038 4763 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n46f2" Dec 05 13:18:50 crc kubenswrapper[4763]: I1205 13:18:50.579390 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nj6s\" (UniqueName: \"kubernetes.io/projected/51dd044c-e4e0-4273-9af1-f30eb7139f03-kube-api-access-9nj6s\") pod \"51dd044c-e4e0-4273-9af1-f30eb7139f03\" (UID: \"51dd044c-e4e0-4273-9af1-f30eb7139f03\") " Dec 05 13:18:50 crc kubenswrapper[4763]: I1205 13:18:50.579635 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51dd044c-e4e0-4273-9af1-f30eb7139f03-utilities\") pod \"51dd044c-e4e0-4273-9af1-f30eb7139f03\" (UID: \"51dd044c-e4e0-4273-9af1-f30eb7139f03\") " Dec 05 13:18:50 crc kubenswrapper[4763]: I1205 13:18:50.579671 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51dd044c-e4e0-4273-9af1-f30eb7139f03-catalog-content\") pod \"51dd044c-e4e0-4273-9af1-f30eb7139f03\" (UID: \"51dd044c-e4e0-4273-9af1-f30eb7139f03\") " Dec 05 13:18:50 crc kubenswrapper[4763]: I1205 13:18:50.580203 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51dd044c-e4e0-4273-9af1-f30eb7139f03-utilities" (OuterVolumeSpecName: "utilities") pod "51dd044c-e4e0-4273-9af1-f30eb7139f03" (UID: "51dd044c-e4e0-4273-9af1-f30eb7139f03"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:18:50 crc kubenswrapper[4763]: I1205 13:18:50.586521 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51dd044c-e4e0-4273-9af1-f30eb7139f03-kube-api-access-9nj6s" (OuterVolumeSpecName: "kube-api-access-9nj6s") pod "51dd044c-e4e0-4273-9af1-f30eb7139f03" (UID: "51dd044c-e4e0-4273-9af1-f30eb7139f03"). InnerVolumeSpecName "kube-api-access-9nj6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:18:50 crc kubenswrapper[4763]: I1205 13:18:50.595624 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51dd044c-e4e0-4273-9af1-f30eb7139f03-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "51dd044c-e4e0-4273-9af1-f30eb7139f03" (UID: "51dd044c-e4e0-4273-9af1-f30eb7139f03"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:18:50 crc kubenswrapper[4763]: I1205 13:18:50.681863 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51dd044c-e4e0-4273-9af1-f30eb7139f03-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 13:18:50 crc kubenswrapper[4763]: I1205 13:18:50.681907 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51dd044c-e4e0-4273-9af1-f30eb7139f03-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 13:18:50 crc kubenswrapper[4763]: I1205 13:18:50.681920 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nj6s\" (UniqueName: \"kubernetes.io/projected/51dd044c-e4e0-4273-9af1-f30eb7139f03-kube-api-access-9nj6s\") on node \"crc\" DevicePath \"\"" Dec 05 13:18:50 crc kubenswrapper[4763]: I1205 13:18:50.809507 4763 generic.go:334] "Generic (PLEG): container finished" podID="51dd044c-e4e0-4273-9af1-f30eb7139f03" containerID="d46cc0809d66b25f0c9c4cf146924487d35c157c1d09139ff7a14c80fec2bb88" exitCode=0 Dec 05 13:18:50 crc kubenswrapper[4763]: I1205 13:18:50.809873 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n46f2" event={"ID":"51dd044c-e4e0-4273-9af1-f30eb7139f03","Type":"ContainerDied","Data":"d46cc0809d66b25f0c9c4cf146924487d35c157c1d09139ff7a14c80fec2bb88"} Dec 05 13:18:50 crc kubenswrapper[4763]: I1205 13:18:50.809904 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n46f2" event={"ID":"51dd044c-e4e0-4273-9af1-f30eb7139f03","Type":"ContainerDied","Data":"d3a281e0090bb17de51f18ea3a7cfbe9225a267518cd89521b25d7e1588c8631"} Dec 05 13:18:50 crc kubenswrapper[4763]: I1205 13:18:50.809924 4763 scope.go:117] "RemoveContainer" containerID="d46cc0809d66b25f0c9c4cf146924487d35c157c1d09139ff7a14c80fec2bb88" Dec 05 13:18:50 crc kubenswrapper[4763]: I1205 13:18:50.810078 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n46f2" Dec 05 13:18:50 crc kubenswrapper[4763]: I1205 13:18:50.841993 4763 scope.go:117] "RemoveContainer" containerID="2ea4b6741f8dd49deb5cc1dd1da1ccfa6e81f6e406902cb1aacbcd78decb74e5" Dec 05 13:18:50 crc kubenswrapper[4763]: I1205 13:18:50.849383 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n46f2"] Dec 05 13:18:50 crc kubenswrapper[4763]: I1205 13:18:50.861100 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-n46f2"] Dec 05 13:18:50 crc kubenswrapper[4763]: I1205 13:18:50.875096 4763 scope.go:117] "RemoveContainer" containerID="57f806d8ab44d52f85dd1ce73ccea26e5f3c1c31407dfa3640bb9ae519905541" Dec 05 13:18:50 crc kubenswrapper[4763]: I1205 13:18:50.917896 4763 scope.go:117] "RemoveContainer" containerID="d46cc0809d66b25f0c9c4cf146924487d35c157c1d09139ff7a14c80fec2bb88" Dec 05 13:18:50 crc kubenswrapper[4763]: E1205 13:18:50.918570 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d46cc0809d66b25f0c9c4cf146924487d35c157c1d09139ff7a14c80fec2bb88\": container with ID starting with d46cc0809d66b25f0c9c4cf146924487d35c157c1d09139ff7a14c80fec2bb88 not found: ID does not exist" containerID="d46cc0809d66b25f0c9c4cf146924487d35c157c1d09139ff7a14c80fec2bb88" Dec 05 13:18:50 crc kubenswrapper[4763]: I1205 13:18:50.918633 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d46cc0809d66b25f0c9c4cf146924487d35c157c1d09139ff7a14c80fec2bb88"} err="failed to get container status \"d46cc0809d66b25f0c9c4cf146924487d35c157c1d09139ff7a14c80fec2bb88\": rpc error: code = NotFound desc = could not find container \"d46cc0809d66b25f0c9c4cf146924487d35c157c1d09139ff7a14c80fec2bb88\": container with ID starting with d46cc0809d66b25f0c9c4cf146924487d35c157c1d09139ff7a14c80fec2bb88 not found: ID does not exist" Dec 05 13:18:50 crc kubenswrapper[4763]: I1205 13:18:50.918674 4763 scope.go:117] "RemoveContainer" containerID="2ea4b6741f8dd49deb5cc1dd1da1ccfa6e81f6e406902cb1aacbcd78decb74e5" Dec 05 13:18:50 crc kubenswrapper[4763]: E1205 13:18:50.919150 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ea4b6741f8dd49deb5cc1dd1da1ccfa6e81f6e406902cb1aacbcd78decb74e5\": container with ID starting with 2ea4b6741f8dd49deb5cc1dd1da1ccfa6e81f6e406902cb1aacbcd78decb74e5 not found: ID does not exist" containerID="2ea4b6741f8dd49deb5cc1dd1da1ccfa6e81f6e406902cb1aacbcd78decb74e5" Dec 05 13:18:50 crc kubenswrapper[4763]: I1205 13:18:50.919207 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ea4b6741f8dd49deb5cc1dd1da1ccfa6e81f6e406902cb1aacbcd78decb74e5"} err="failed to get container status \"2ea4b6741f8dd49deb5cc1dd1da1ccfa6e81f6e406902cb1aacbcd78decb74e5\": rpc error: code = NotFound desc = could not find container \"2ea4b6741f8dd49deb5cc1dd1da1ccfa6e81f6e406902cb1aacbcd78decb74e5\": container with ID starting with 2ea4b6741f8dd49deb5cc1dd1da1ccfa6e81f6e406902cb1aacbcd78decb74e5 not found: ID does not exist" Dec 05 13:18:50 crc kubenswrapper[4763]: I1205 13:18:50.919243 4763 scope.go:117] "RemoveContainer" containerID="57f806d8ab44d52f85dd1ce73ccea26e5f3c1c31407dfa3640bb9ae519905541" Dec 05 13:18:50 crc kubenswrapper[4763]: E1205 13:18:50.919631 4763 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"57f806d8ab44d52f85dd1ce73ccea26e5f3c1c31407dfa3640bb9ae519905541\": container with ID starting with 57f806d8ab44d52f85dd1ce73ccea26e5f3c1c31407dfa3640bb9ae519905541 not found: ID does not exist" containerID="57f806d8ab44d52f85dd1ce73ccea26e5f3c1c31407dfa3640bb9ae519905541" Dec 05 13:18:50 crc kubenswrapper[4763]: I1205 13:18:50.919669 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57f806d8ab44d52f85dd1ce73ccea26e5f3c1c31407dfa3640bb9ae519905541"} err="failed to get container status \"57f806d8ab44d52f85dd1ce73ccea26e5f3c1c31407dfa3640bb9ae519905541\": rpc error: code = NotFound desc = could not find container \"57f806d8ab44d52f85dd1ce73ccea26e5f3c1c31407dfa3640bb9ae519905541\": container with ID starting with 57f806d8ab44d52f85dd1ce73ccea26e5f3c1c31407dfa3640bb9ae519905541 not found: ID does not exist" Dec 05 13:18:51 crc kubenswrapper[4763]: I1205 13:18:51.799108 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51dd044c-e4e0-4273-9af1-f30eb7139f03" path="/var/lib/kubelet/pods/51dd044c-e4e0-4273-9af1-f30eb7139f03/volumes" Dec 05 13:18:52 crc kubenswrapper[4763]: I1205 13:18:52.438903 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-s6pzv" Dec 05 13:18:54 crc kubenswrapper[4763]: I1205 13:18:54.704424 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s6pzv"] Dec 05 13:18:54 crc kubenswrapper[4763]: I1205 13:18:54.706099 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-s6pzv" podUID="795850cc-187b-4d22-b8bc-e9a88d2c5807" containerName="registry-server" containerID="cri-o://0eae43a0cc6d0e26b459eb41e74d2cb8b1bdd8dcae10416db7b0405a499a7626" gracePeriod=2 Dec 05 13:18:54 crc kubenswrapper[4763]: I1205 13:18:54.858379 4763 generic.go:334] "Generic (PLEG): container finished" podID="795850cc-187b-4d22-b8bc-e9a88d2c5807" containerID="0eae43a0cc6d0e26b459eb41e74d2cb8b1bdd8dcae10416db7b0405a499a7626" exitCode=0 Dec 05 13:18:54 crc kubenswrapper[4763]: I1205 13:18:54.858590 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6pzv" event={"ID":"795850cc-187b-4d22-b8bc-e9a88d2c5807","Type":"ContainerDied","Data":"0eae43a0cc6d0e26b459eb41e74d2cb8b1bdd8dcae10416db7b0405a499a7626"} Dec 05 13:18:55 crc kubenswrapper[4763]: I1205 13:18:55.199471 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s6pzv" Dec 05 13:18:55 crc kubenswrapper[4763]: I1205 13:18:55.374839 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgp4k\" (UniqueName: \"kubernetes.io/projected/795850cc-187b-4d22-b8bc-e9a88d2c5807-kube-api-access-zgp4k\") pod \"795850cc-187b-4d22-b8bc-e9a88d2c5807\" (UID: \"795850cc-187b-4d22-b8bc-e9a88d2c5807\") " Dec 05 13:18:55 crc kubenswrapper[4763]: I1205 13:18:55.375160 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/795850cc-187b-4d22-b8bc-e9a88d2c5807-catalog-content\") pod \"795850cc-187b-4d22-b8bc-e9a88d2c5807\" (UID: \"795850cc-187b-4d22-b8bc-e9a88d2c5807\") " Dec 05 13:18:55 crc kubenswrapper[4763]: I1205 13:18:55.375363 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/795850cc-187b-4d22-b8bc-e9a88d2c5807-utilities\") pod \"795850cc-187b-4d22-b8bc-e9a88d2c5807\" (UID: \"795850cc-187b-4d22-b8bc-e9a88d2c5807\") " Dec 05 13:18:55 crc kubenswrapper[4763]: I1205 13:18:55.376050 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/795850cc-187b-4d22-b8bc-e9a88d2c5807-utilities" (OuterVolumeSpecName: "utilities") pod "795850cc-187b-4d22-b8bc-e9a88d2c5807" (UID: "795850cc-187b-4d22-b8bc-e9a88d2c5807"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:18:55 crc kubenswrapper[4763]: I1205 13:18:55.380694 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/795850cc-187b-4d22-b8bc-e9a88d2c5807-kube-api-access-zgp4k" (OuterVolumeSpecName: "kube-api-access-zgp4k") pod "795850cc-187b-4d22-b8bc-e9a88d2c5807" (UID: "795850cc-187b-4d22-b8bc-e9a88d2c5807"). InnerVolumeSpecName "kube-api-access-zgp4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:18:55 crc kubenswrapper[4763]: I1205 13:18:55.430276 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/795850cc-187b-4d22-b8bc-e9a88d2c5807-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "795850cc-187b-4d22-b8bc-e9a88d2c5807" (UID: "795850cc-187b-4d22-b8bc-e9a88d2c5807"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:18:55 crc kubenswrapper[4763]: I1205 13:18:55.477568 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgp4k\" (UniqueName: \"kubernetes.io/projected/795850cc-187b-4d22-b8bc-e9a88d2c5807-kube-api-access-zgp4k\") on node \"crc\" DevicePath \"\"" Dec 05 13:18:55 crc kubenswrapper[4763]: I1205 13:18:55.477621 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/795850cc-187b-4d22-b8bc-e9a88d2c5807-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 13:18:55 crc kubenswrapper[4763]: I1205 13:18:55.477636 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/795850cc-187b-4d22-b8bc-e9a88d2c5807-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 13:18:55 crc kubenswrapper[4763]: I1205 13:18:55.870000 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6pzv" event={"ID":"795850cc-187b-4d22-b8bc-e9a88d2c5807","Type":"ContainerDied","Data":"a458f6eb23850259c91e6f572742cbe777a993d999854c1080776ef2cb30b307"} Dec 05 13:18:55 crc kubenswrapper[4763]: I1205 13:18:55.870054 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s6pzv" Dec 05 13:18:55 crc kubenswrapper[4763]: I1205 13:18:55.870058 4763 scope.go:117] "RemoveContainer" containerID="0eae43a0cc6d0e26b459eb41e74d2cb8b1bdd8dcae10416db7b0405a499a7626" Dec 05 13:18:55 crc kubenswrapper[4763]: I1205 13:18:55.893112 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s6pzv"] Dec 05 13:18:55 crc kubenswrapper[4763]: I1205 13:18:55.895807 4763 scope.go:117] "RemoveContainer" containerID="786db9cbd2337232319436f5f8637e6902a0bdbc488d9e2a02c26b04b42a2029" Dec 05 13:18:55 crc kubenswrapper[4763]: I1205 13:18:55.903575 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-s6pzv"] Dec 05 13:18:55 crc kubenswrapper[4763]: I1205 13:18:55.914465 4763 scope.go:117] "RemoveContainer" containerID="b76813082bcc669be34718bbd8d5f7e27c1e7a69d7cb17baf9cf5b63d6001c45" Dec 05 13:18:57 crc kubenswrapper[4763]: I1205 13:18:57.795188 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="795850cc-187b-4d22-b8bc-e9a88d2c5807" path="/var/lib/kubelet/pods/795850cc-187b-4d22-b8bc-e9a88d2c5807/volumes" Dec 05 13:18:58 crc kubenswrapper[4763]: I1205 13:18:58.733664 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b72g6" Dec 05 13:18:59 crc kubenswrapper[4763]: I1205 13:18:59.305546 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b72g6"] Dec 05 13:18:59 crc kubenswrapper[4763]: I1205 13:18:59.305840 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b72g6" podUID="6d61dd45-0ea0-4581-9325-b42539c14249" containerName="registry-server" containerID="cri-o://10a873104d0e5c0653b5ad890c970f21c0124cca6f999df445a726c6b6851a04" gracePeriod=2 Dec 05 13:18:59 crc kubenswrapper[4763]: I1205 13:18:59.911082 4763 generic.go:334] "Generic (PLEG): container finished" podID="6d61dd45-0ea0-4581-9325-b42539c14249" containerID="10a873104d0e5c0653b5ad890c970f21c0124cca6f999df445a726c6b6851a04" exitCode=0 Dec 05 13:18:59 crc kubenswrapper[4763]: 
I1205 13:18:59.911134 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b72g6" event={"ID":"6d61dd45-0ea0-4581-9325-b42539c14249","Type":"ContainerDied","Data":"10a873104d0e5c0653b5ad890c970f21c0124cca6f999df445a726c6b6851a04"} Dec 05 13:19:00 crc kubenswrapper[4763]: I1205 13:19:00.345232 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b72g6" Dec 05 13:19:00 crc kubenswrapper[4763]: I1205 13:19:00.398450 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d61dd45-0ea0-4581-9325-b42539c14249-catalog-content\") pod \"6d61dd45-0ea0-4581-9325-b42539c14249\" (UID: \"6d61dd45-0ea0-4581-9325-b42539c14249\") " Dec 05 13:19:00 crc kubenswrapper[4763]: I1205 13:19:00.398622 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d61dd45-0ea0-4581-9325-b42539c14249-utilities\") pod \"6d61dd45-0ea0-4581-9325-b42539c14249\" (UID: \"6d61dd45-0ea0-4581-9325-b42539c14249\") " Dec 05 13:19:00 crc kubenswrapper[4763]: I1205 13:19:00.398657 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvlgv\" (UniqueName: \"kubernetes.io/projected/6d61dd45-0ea0-4581-9325-b42539c14249-kube-api-access-gvlgv\") pod \"6d61dd45-0ea0-4581-9325-b42539c14249\" (UID: \"6d61dd45-0ea0-4581-9325-b42539c14249\") " Dec 05 13:19:00 crc kubenswrapper[4763]: I1205 13:19:00.403066 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d61dd45-0ea0-4581-9325-b42539c14249-utilities" (OuterVolumeSpecName: "utilities") pod "6d61dd45-0ea0-4581-9325-b42539c14249" (UID: "6d61dd45-0ea0-4581-9325-b42539c14249"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:19:00 crc kubenswrapper[4763]: I1205 13:19:00.406169 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d61dd45-0ea0-4581-9325-b42539c14249-kube-api-access-gvlgv" (OuterVolumeSpecName: "kube-api-access-gvlgv") pod "6d61dd45-0ea0-4581-9325-b42539c14249" (UID: "6d61dd45-0ea0-4581-9325-b42539c14249"). InnerVolumeSpecName "kube-api-access-gvlgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:19:00 crc kubenswrapper[4763]: I1205 13:19:00.471717 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d61dd45-0ea0-4581-9325-b42539c14249-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6d61dd45-0ea0-4581-9325-b42539c14249" (UID: "6d61dd45-0ea0-4581-9325-b42539c14249"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:19:00 crc kubenswrapper[4763]: I1205 13:19:00.501362 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d61dd45-0ea0-4581-9325-b42539c14249-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 13:19:00 crc kubenswrapper[4763]: I1205 13:19:00.501396 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d61dd45-0ea0-4581-9325-b42539c14249-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 13:19:00 crc kubenswrapper[4763]: I1205 13:19:00.501406 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvlgv\" (UniqueName: \"kubernetes.io/projected/6d61dd45-0ea0-4581-9325-b42539c14249-kube-api-access-gvlgv\") on node \"crc\" DevicePath \"\"" Dec 05 13:19:00 crc kubenswrapper[4763]: I1205 13:19:00.924106 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b72g6" event={"ID":"6d61dd45-0ea0-4581-9325-b42539c14249","Type":"ContainerDied","Data":"93690f7e97a0fa9dcf5ed249c434e1b73802f4d5842cb88adedf498f5798b54f"} Dec 05 13:19:00 crc kubenswrapper[4763]: I1205 13:19:00.924183 4763 scope.go:117] "RemoveContainer" containerID="10a873104d0e5c0653b5ad890c970f21c0124cca6f999df445a726c6b6851a04" Dec 05 13:19:00 crc kubenswrapper[4763]: I1205 13:19:00.924194 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b72g6" Dec 05 13:19:00 crc kubenswrapper[4763]: I1205 13:19:00.945558 4763 scope.go:117] "RemoveContainer" containerID="4ac4c8d59fe1ba3752eb306c70b5420ecfbd33fe29e695f69e85f2321702cef6" Dec 05 13:19:00 crc kubenswrapper[4763]: I1205 13:19:00.954694 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b72g6"] Dec 05 13:19:00 crc kubenswrapper[4763]: I1205 13:19:00.967083 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-b72g6"] Dec 05 13:19:00 crc kubenswrapper[4763]: I1205 13:19:00.988074 4763 scope.go:117] "RemoveContainer" containerID="f4c26e861df67186bc59ebfbd94630a44ba8bc4d9a496327cd37d879f1f55330" Dec 05 13:19:01 crc kubenswrapper[4763]: I1205 13:19:01.804254 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d61dd45-0ea0-4581-9325-b42539c14249" path="/var/lib/kubelet/pods/6d61dd45-0ea0-4581-9325-b42539c14249/volumes" Dec 05 13:19:10 crc kubenswrapper[4763]: I1205 13:19:10.480652 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx_262113c6-3029-4c0d-8279-e1454a535c24/util/0.log" Dec 05 13:19:10 crc kubenswrapper[4763]: I1205 13:19:10.628500 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx_262113c6-3029-4c0d-8279-e1454a535c24/util/0.log" Dec 05 13:19:10 crc kubenswrapper[4763]: I1205 13:19:10.668907 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx_262113c6-3029-4c0d-8279-e1454a535c24/pull/0.log" Dec 05 13:19:10 crc kubenswrapper[4763]: I1205 13:19:10.669970 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx_262113c6-3029-4c0d-8279-e1454a535c24/pull/0.log" Dec 05 13:19:10 crc kubenswrapper[4763]: I1205 13:19:10.814615 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx_262113c6-3029-4c0d-8279-e1454a535c24/util/0.log" Dec 05 13:19:10 crc kubenswrapper[4763]: I1205 13:19:10.861818 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx_262113c6-3029-4c0d-8279-e1454a535c24/extract/0.log" Dec 05 13:19:10 crc kubenswrapper[4763]: I1205 13:19:10.880461 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx_262113c6-3029-4c0d-8279-e1454a535c24/pull/0.log" Dec 05 13:19:11 crc kubenswrapper[4763]: I1205 13:19:11.000074 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-f92rg_d7dd9586-7cc5-42f0-87a8-3a8c54557b21/kube-rbac-proxy/0.log" Dec 05 13:19:11 crc kubenswrapper[4763]: I1205 13:19:11.090979 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-f92rg_d7dd9586-7cc5-42f0-87a8-3a8c54557b21/manager/0.log" Dec 05 13:19:11 crc kubenswrapper[4763]: I1205 13:19:11.109752 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-w8w7f_e97d9ee8-0c07-486a-84f1-dabddb037a8b/kube-rbac-proxy/0.log" Dec 05 13:19:11 crc kubenswrapper[4763]: I1205 13:19:11.251340 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-w8w7f_e97d9ee8-0c07-486a-84f1-dabddb037a8b/manager/0.log" Dec 05 13:19:11 crc kubenswrapper[4763]: I1205 13:19:11.318662 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-8gw8h_f9a5212c-2ddb-4e82-818e-5102fb3c5ee2/kube-rbac-proxy/0.log" Dec 05 13:19:11 crc kubenswrapper[4763]: I1205 13:19:11.359155 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-8gw8h_f9a5212c-2ddb-4e82-818e-5102fb3c5ee2/manager/0.log" Dec 05 13:19:11 crc kubenswrapper[4763]: I1205 13:19:11.487184 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-77g97_94ae68ae-93ae-43a6-89fa-5b2301808793/kube-rbac-proxy/0.log" Dec 05 13:19:11 crc kubenswrapper[4763]: I1205 13:19:11.546624 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-77g97_94ae68ae-93ae-43a6-89fa-5b2301808793/manager/0.log" Dec 05 13:19:11 crc kubenswrapper[4763]: I1205 13:19:11.656667 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-mq9f4_42f31714-9ede-4c48-b611-028a79374fad/kube-rbac-proxy/0.log" Dec 05 13:19:11 crc kubenswrapper[4763]: I1205 13:19:11.702730 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-mq9f4_42f31714-9ede-4c48-b611-028a79374fad/manager/0.log" Dec 05 13:19:11 crc kubenswrapper[4763]: I1205 13:19:11.802677 
4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-gbf44_2bed16d5-ec79-4ad7-8984-b965fa568dc6/kube-rbac-proxy/0.log" Dec 05 13:19:11 crc kubenswrapper[4763]: I1205 13:19:11.857417 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-gbf44_2bed16d5-ec79-4ad7-8984-b965fa568dc6/manager/0.log" Dec 05 13:19:11 crc kubenswrapper[4763]: I1205 13:19:11.900703 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-p5jgv_f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6/kube-rbac-proxy/0.log" Dec 05 13:19:12 crc kubenswrapper[4763]: I1205 13:19:12.111886 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-2bgt8_ed1b8d49-d742-4493-bb7e-856b4108fb88/kube-rbac-proxy/0.log" Dec 05 13:19:12 crc kubenswrapper[4763]: I1205 13:19:12.142049 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-2bgt8_ed1b8d49-d742-4493-bb7e-856b4108fb88/manager/0.log" Dec 05 13:19:12 crc kubenswrapper[4763]: I1205 13:19:12.166357 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-p5jgv_f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6/manager/0.log" Dec 05 13:19:12 crc kubenswrapper[4763]: I1205 13:19:12.802375 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-gx4vn_6520d187-9c6c-4b0e-b0c9-27e23db84f4c/kube-rbac-proxy/0.log" Dec 05 13:19:12 crc kubenswrapper[4763]: I1205 13:19:12.911178 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-gx4vn_6520d187-9c6c-4b0e-b0c9-27e23db84f4c/manager/0.log" Dec 05 13:19:13 crc kubenswrapper[4763]: I1205 13:19:13.002860 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-sq9wf_eb3c8b38-a863-42d0-b7d8-03231971e4ce/kube-rbac-proxy/0.log" Dec 05 13:19:13 crc kubenswrapper[4763]: I1205 13:19:13.032145 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-sq9wf_eb3c8b38-a863-42d0-b7d8-03231971e4ce/manager/0.log" Dec 05 13:19:13 crc kubenswrapper[4763]: I1205 13:19:13.120998 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-b9w6n_cd42325e-d26d-4cb6-b8dd-f75dc86e7568/kube-rbac-proxy/0.log" Dec 05 13:19:13 crc kubenswrapper[4763]: I1205 13:19:13.198273 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-b9w6n_cd42325e-d26d-4cb6-b8dd-f75dc86e7568/manager/0.log" Dec 05 13:19:13 crc kubenswrapper[4763]: I1205 13:19:13.292258 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-vqmk4_4faed118-8b9d-4adb-8f86-6a6be8061bce/kube-rbac-proxy/0.log" Dec 05 13:19:13 crc kubenswrapper[4763]: I1205 13:19:13.388492 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-vqmk4_4faed118-8b9d-4adb-8f86-6a6be8061bce/manager/0.log" Dec 05 13:19:13 crc 
kubenswrapper[4763]: I1205 13:19:13.432031 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-mnthh_378cb9d9-8010-4dcf-9297-5e4f0679086e/kube-rbac-proxy/0.log" Dec 05 13:19:13 crc kubenswrapper[4763]: I1205 13:19:13.591334 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-mnthh_378cb9d9-8010-4dcf-9297-5e4f0679086e/manager/0.log" Dec 05 13:19:13 crc kubenswrapper[4763]: I1205 13:19:13.648307 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-6zk6f_fcc46489-05d5-4219-9e45-6ca25f25900f/kube-rbac-proxy/0.log" Dec 05 13:19:13 crc kubenswrapper[4763]: I1205 13:19:13.658563 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-6zk6f_fcc46489-05d5-4219-9e45-6ca25f25900f/manager/0.log" Dec 05 13:19:13 crc kubenswrapper[4763]: I1205 13:19:13.819274 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs_3fdf0ecb-215d-4a02-8053-169fcbfefa50/kube-rbac-proxy/0.log" Dec 05 13:19:13 crc kubenswrapper[4763]: I1205 13:19:13.834071 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs_3fdf0ecb-215d-4a02-8053-169fcbfefa50/manager/0.log" Dec 05 13:19:14 crc kubenswrapper[4763]: I1205 13:19:14.301813 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-654b7bd4cc-79gh5_9b87ef42-73e9-40c4-a64b-381de978398c/operator/0.log" Dec 05 13:19:14 crc kubenswrapper[4763]: I1205 13:19:14.733168 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-jp2ck_522dda98-66e1-4ced-b504-e957eb00cda2/registry-server/0.log" Dec 05 13:19:14 crc kubenswrapper[4763]: I1205 13:19:14.932364 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-mch9f_63e1e64f-8414-4da8-8a32-5f0a0041c5ff/manager/0.log" Dec 05 13:19:14 crc kubenswrapper[4763]: I1205 13:19:14.945715 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-mch9f_63e1e64f-8414-4da8-8a32-5f0a0041c5ff/kube-rbac-proxy/0.log" Dec 05 13:19:15 crc kubenswrapper[4763]: I1205 13:19:15.094651 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-gzx9n_a05e3d8d-f58a-44f0-b3c9-e212cdcec438/kube-rbac-proxy/0.log" Dec 05 13:19:15 crc kubenswrapper[4763]: I1205 13:19:15.200511 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-gzx9n_a05e3d8d-f58a-44f0-b3c9-e212cdcec438/manager/0.log" Dec 05 13:19:15 crc kubenswrapper[4763]: I1205 13:19:15.276101 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-n8fzm_01d1c35a-adc3-4945-92b5-5921600cb826/operator/0.log" Dec 05 13:19:15 crc kubenswrapper[4763]: I1205 13:19:15.399666 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7d6cc4d8dc-wf9nm_43f191ee-e0a3-4d9e-a63a-c9b7a626806f/manager/0.log" Dec 05 
13:19:15 crc kubenswrapper[4763]: I1205 13:19:15.421009 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-phnl7_2024cb36-8175-4993-bd5b-a57a8fb8416c/kube-rbac-proxy/0.log" Dec 05 13:19:15 crc kubenswrapper[4763]: I1205 13:19:15.423278 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-phnl7_2024cb36-8175-4993-bd5b-a57a8fb8416c/manager/0.log" Dec 05 13:19:15 crc kubenswrapper[4763]: I1205 13:19:15.548031 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-xkl6w_b64b19c9-3601-4790-addf-c9a32f6c29fe/kube-rbac-proxy/0.log" Dec 05 13:19:15 crc kubenswrapper[4763]: I1205 13:19:15.627318 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-ht96c_0a36f8ad-7e41-4005-a42e-47b9a30af62f/kube-rbac-proxy/0.log" Dec 05 13:19:15 crc kubenswrapper[4763]: I1205 13:19:15.690973 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-ht96c_0a36f8ad-7e41-4005-a42e-47b9a30af62f/manager/0.log" Dec 05 13:19:15 crc kubenswrapper[4763]: I1205 13:19:15.767548 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-xkl6w_b64b19c9-3601-4790-addf-c9a32f6c29fe/manager/0.log" Dec 05 13:19:15 crc kubenswrapper[4763]: I1205 13:19:15.862095 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-66974974bb-mjwrw_36e19ef2-df0d-43ca-8477-f1cec2182b45/kube-rbac-proxy/0.log" Dec 05 13:19:15 crc kubenswrapper[4763]: I1205 13:19:15.923547 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-66974974bb-mjwrw_36e19ef2-df0d-43ca-8477-f1cec2182b45/manager/0.log" Dec 05 13:19:37 crc kubenswrapper[4763]: I1205 13:19:37.197686 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-7wmpg_f08e9226-6ec5-4854-9780-0b5e2d8a7ded/control-plane-machine-set-operator/0.log" Dec 05 13:19:37 crc kubenswrapper[4763]: I1205 13:19:37.405798 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-j5ztd_0c9b5acf-ef6a-4bdd-ae32-582a80d711b5/machine-api-operator/0.log" Dec 05 13:19:37 crc kubenswrapper[4763]: I1205 13:19:37.643497 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-j5ztd_0c9b5acf-ef6a-4bdd-ae32-582a80d711b5/kube-rbac-proxy/0.log" Dec 05 13:19:52 crc kubenswrapper[4763]: I1205 13:19:52.837410 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-dcg55_96780413-b18b-4d1d-a6c4-2bebb60c99c1/cert-manager-controller/0.log" Dec 05 13:19:53 crc kubenswrapper[4763]: I1205 13:19:53.363367 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-5x8rx_6f795519-6cee-426c-8dda-7f96ef62a9a1/cert-manager-webhook/0.log" Dec 05 13:19:53 crc kubenswrapper[4763]: I1205 13:19:53.374630 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-d5g4s_c47efa3a-fd06-4193-921d-11f8f5fb0eff/cert-manager-cainjector/0.log" Dec 05 
13:20:06 crc kubenswrapper[4763]: I1205 13:20:06.615780 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-76lxj_a3cf5928-0003-41e3-baf7-670a1f186bde/nmstate-console-plugin/0.log" Dec 05 13:20:06 crc kubenswrapper[4763]: I1205 13:20:06.849346 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-p7k9h_3d3db32e-6ad0-4e60-828f-74bdcc4cf6df/nmstate-handler/0.log" Dec 05 13:20:06 crc kubenswrapper[4763]: I1205 13:20:06.870147 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-t4n5p_99db9e57-5946-4f9b-8664-d9a7fbff7042/kube-rbac-proxy/0.log" Dec 05 13:20:06 crc kubenswrapper[4763]: I1205 13:20:06.945454 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-t4n5p_99db9e57-5946-4f9b-8664-d9a7fbff7042/nmstate-metrics/0.log" Dec 05 13:20:07 crc kubenswrapper[4763]: I1205 13:20:07.026883 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-72lb2_a5db489a-42dd-46c0-825d-5dc7065c9f29/nmstate-operator/0.log" Dec 05 13:20:07 crc kubenswrapper[4763]: I1205 13:20:07.149428 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-4rx2l_73c52155-582b-4ea6-8661-c03a3804fe2e/nmstate-webhook/0.log" Dec 05 13:20:23 crc kubenswrapper[4763]: I1205 13:20:23.955739 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ms42j_537c61d5-e548-4c96-b7ed-24fcc061e9ac/cp-frr-files/0.log" Dec 05 13:20:24 crc kubenswrapper[4763]: I1205 13:20:24.118329 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ms42j_537c61d5-e548-4c96-b7ed-24fcc061e9ac/cp-frr-files/0.log" Dec 05 13:20:24 crc kubenswrapper[4763]: I1205 13:20:24.161718 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ms42j_537c61d5-e548-4c96-b7ed-24fcc061e9ac/cp-reloader/0.log" Dec 05 13:20:24 crc kubenswrapper[4763]: I1205 13:20:24.297360 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ms42j_537c61d5-e548-4c96-b7ed-24fcc061e9ac/cp-metrics/0.log" Dec 05 13:20:24 crc kubenswrapper[4763]: I1205 13:20:24.312494 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ms42j_537c61d5-e548-4c96-b7ed-24fcc061e9ac/cp-reloader/0.log" Dec 05 13:20:24 crc kubenswrapper[4763]: I1205 13:20:24.569294 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ms42j_537c61d5-e548-4c96-b7ed-24fcc061e9ac/cp-reloader/0.log" Dec 05 13:20:24 crc kubenswrapper[4763]: I1205 13:20:24.612517 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ms42j_537c61d5-e548-4c96-b7ed-24fcc061e9ac/cp-frr-files/0.log" Dec 05 13:20:24 crc kubenswrapper[4763]: I1205 13:20:24.768870 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ms42j_537c61d5-e548-4c96-b7ed-24fcc061e9ac/cp-metrics/0.log" Dec 05 13:20:24 crc kubenswrapper[4763]: I1205 13:20:24.769067 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ms42j_537c61d5-e548-4c96-b7ed-24fcc061e9ac/cp-metrics/0.log" Dec 05 13:20:24 crc kubenswrapper[4763]: I1205 13:20:24.919022 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ms42j_537c61d5-e548-4c96-b7ed-24fcc061e9ac/cp-reloader/0.log" Dec 05 
13:20:24 crc kubenswrapper[4763]: I1205 13:20:24.965460 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ms42j_537c61d5-e548-4c96-b7ed-24fcc061e9ac/cp-frr-files/0.log" Dec 05 13:20:25 crc kubenswrapper[4763]: I1205 13:20:25.264190 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ms42j_537c61d5-e548-4c96-b7ed-24fcc061e9ac/cp-metrics/0.log" Dec 05 13:20:25 crc kubenswrapper[4763]: I1205 13:20:25.304660 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ms42j_537c61d5-e548-4c96-b7ed-24fcc061e9ac/controller/0.log" Dec 05 13:20:27 crc kubenswrapper[4763]: I1205 13:20:27.025955 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-scmzk_a34bf611-cb4c-44b4-bdf2-45a656edadc9/kube-rbac-proxy/0.log" Dec 05 13:20:27 crc kubenswrapper[4763]: I1205 13:20:27.045978 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ms42j_537c61d5-e548-4c96-b7ed-24fcc061e9ac/frr-metrics/0.log" Dec 05 13:20:27 crc kubenswrapper[4763]: I1205 13:20:27.228161 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ms42j_537c61d5-e548-4c96-b7ed-24fcc061e9ac/kube-rbac-proxy/0.log" Dec 05 13:20:27 crc kubenswrapper[4763]: I1205 13:20:27.259373 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-scmzk_a34bf611-cb4c-44b4-bdf2-45a656edadc9/controller/0.log" Dec 05 13:20:27 crc kubenswrapper[4763]: I1205 13:20:27.309788 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ms42j_537c61d5-e548-4c96-b7ed-24fcc061e9ac/kube-rbac-proxy-frr/0.log" Dec 05 13:20:27 crc kubenswrapper[4763]: I1205 13:20:27.349644 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ms42j_537c61d5-e548-4c96-b7ed-24fcc061e9ac/reloader/0.log" Dec 05 13:20:27 crc kubenswrapper[4763]: I1205 13:20:27.499313 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-qv5hl_cfa0736f-2856-4cfd-810f-d8fcd2bea7f6/frr-k8s-webhook-server/0.log" Dec 05 13:20:27 crc kubenswrapper[4763]: I1205 13:20:27.628337 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-84d94bbc7d-rf87g_d47b8a4e-ccc5-41e4-855b-86fee8fed449/manager/0.log" Dec 05 13:20:27 crc kubenswrapper[4763]: I1205 13:20:27.777091 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-b69b886bc-52sm5_e3b7dc32-b6b1-4087-9518-da66dd2c1839/webhook-server/0.log" Dec 05 13:20:27 crc kubenswrapper[4763]: I1205 13:20:27.875000 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-2k2k4_6af0da26-fcd3-4eb1-97a2-e5beedf81d5b/kube-rbac-proxy/0.log" Dec 05 13:20:29 crc kubenswrapper[4763]: I1205 13:20:29.099051 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-2k2k4_6af0da26-fcd3-4eb1-97a2-e5beedf81d5b/speaker/0.log" Dec 05 13:20:29 crc kubenswrapper[4763]: I1205 13:20:29.456202 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ms42j_537c61d5-e548-4c96-b7ed-24fcc061e9ac/frr/0.log" Dec 05 13:20:37 crc kubenswrapper[4763]: I1205 13:20:37.544900 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 13:20:37 crc kubenswrapper[4763]: I1205 13:20:37.545491 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 13:20:41 crc kubenswrapper[4763]: I1205 13:20:41.435519 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr_b7d19c02-ed03-4e76-951f-2032e0f23c7a/util/0.log" Dec 05 13:20:41 crc kubenswrapper[4763]: I1205 13:20:41.581527 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr_b7d19c02-ed03-4e76-951f-2032e0f23c7a/util/0.log" Dec 05 13:20:41 crc kubenswrapper[4763]: I1205 13:20:41.600450 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr_b7d19c02-ed03-4e76-951f-2032e0f23c7a/pull/0.log" Dec 05 13:20:41 crc kubenswrapper[4763]: I1205 13:20:41.636295 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr_b7d19c02-ed03-4e76-951f-2032e0f23c7a/pull/0.log" Dec 05 13:20:41 crc kubenswrapper[4763]: I1205 13:20:41.800333 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr_b7d19c02-ed03-4e76-951f-2032e0f23c7a/pull/0.log" Dec 05 13:20:41 crc kubenswrapper[4763]: I1205 13:20:41.804228 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr_b7d19c02-ed03-4e76-951f-2032e0f23c7a/util/0.log" Dec 05 13:20:41 crc kubenswrapper[4763]: I1205 13:20:41.825632 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr_b7d19c02-ed03-4e76-951f-2032e0f23c7a/extract/0.log" Dec 05 13:20:41 crc kubenswrapper[4763]: I1205 13:20:41.956259 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9_552de8ea-aa26-40d4-a360-1eda3664ae62/util/0.log" Dec 05 13:20:42 crc kubenswrapper[4763]: I1205 13:20:42.190386 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9_552de8ea-aa26-40d4-a360-1eda3664ae62/pull/0.log" Dec 05 13:20:42 crc kubenswrapper[4763]: I1205 13:20:42.190492 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9_552de8ea-aa26-40d4-a360-1eda3664ae62/pull/0.log" Dec 05 13:20:42 crc kubenswrapper[4763]: I1205 13:20:42.190813 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9_552de8ea-aa26-40d4-a360-1eda3664ae62/util/0.log" Dec 05 13:20:42 crc kubenswrapper[4763]: I1205 13:20:42.358847 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9_552de8ea-aa26-40d4-a360-1eda3664ae62/pull/0.log" Dec 05 13:20:42 crc kubenswrapper[4763]: I1205 13:20:42.374657 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9_552de8ea-aa26-40d4-a360-1eda3664ae62/extract/0.log" Dec 05 13:20:42 crc kubenswrapper[4763]: I1205 13:20:42.415827 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9_552de8ea-aa26-40d4-a360-1eda3664ae62/util/0.log" Dec 05 13:20:42 crc kubenswrapper[4763]: I1205 13:20:42.612246 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw_a6028549-2dc7-44a5-b84c-fb74585f3b85/util/0.log" Dec 05 13:20:42 crc kubenswrapper[4763]: I1205 13:20:42.741769 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw_a6028549-2dc7-44a5-b84c-fb74585f3b85/pull/0.log" Dec 05 13:20:42 crc kubenswrapper[4763]: I1205 13:20:42.757409 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw_a6028549-2dc7-44a5-b84c-fb74585f3b85/pull/0.log" Dec 05 13:20:42 crc kubenswrapper[4763]: I1205 13:20:42.762079 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw_a6028549-2dc7-44a5-b84c-fb74585f3b85/util/0.log" Dec 05 13:20:42 crc kubenswrapper[4763]: I1205 13:20:42.908924 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw_a6028549-2dc7-44a5-b84c-fb74585f3b85/util/0.log" Dec 05 13:20:42 crc kubenswrapper[4763]: I1205 13:20:42.917250 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw_a6028549-2dc7-44a5-b84c-fb74585f3b85/pull/0.log" Dec 05 13:20:42 crc kubenswrapper[4763]: I1205 13:20:42.940092 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw_a6028549-2dc7-44a5-b84c-fb74585f3b85/extract/0.log" Dec 05 13:20:43 crc kubenswrapper[4763]: I1205 13:20:43.072446 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-n95pv_9b919307-32f9-4abf-807f-86ef3b67ff55/extract-utilities/0.log" Dec 05 13:20:43 crc kubenswrapper[4763]: I1205 13:20:43.253965 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-n95pv_9b919307-32f9-4abf-807f-86ef3b67ff55/extract-content/0.log" Dec 05 13:20:43 crc kubenswrapper[4763]: I1205 13:20:43.262753 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-n95pv_9b919307-32f9-4abf-807f-86ef3b67ff55/extract-utilities/0.log" Dec 05 13:20:43 crc kubenswrapper[4763]: I1205 13:20:43.262966 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-n95pv_9b919307-32f9-4abf-807f-86ef3b67ff55/extract-content/0.log" Dec 05 13:20:43 crc kubenswrapper[4763]: I1205 13:20:43.439639 4763 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-n95pv_9b919307-32f9-4abf-807f-86ef3b67ff55/extract-utilities/0.log" Dec 05 13:20:43 crc kubenswrapper[4763]: I1205 13:20:43.453234 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-n95pv_9b919307-32f9-4abf-807f-86ef3b67ff55/extract-content/0.log" Dec 05 13:20:43 crc kubenswrapper[4763]: I1205 13:20:43.673901 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m9dpb_cdc48b79-aeea-4cb0-a97f-4d265bb401f6/extract-utilities/0.log" Dec 05 13:20:43 crc kubenswrapper[4763]: I1205 13:20:43.907676 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m9dpb_cdc48b79-aeea-4cb0-a97f-4d265bb401f6/extract-content/0.log" Dec 05 13:20:43 crc kubenswrapper[4763]: I1205 13:20:43.929086 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m9dpb_cdc48b79-aeea-4cb0-a97f-4d265bb401f6/extract-content/0.log" Dec 05 13:20:43 crc kubenswrapper[4763]: I1205 13:20:43.932701 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m9dpb_cdc48b79-aeea-4cb0-a97f-4d265bb401f6/extract-utilities/0.log" Dec 05 13:20:44 crc kubenswrapper[4763]: I1205 13:20:44.092397 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m9dpb_cdc48b79-aeea-4cb0-a97f-4d265bb401f6/extract-utilities/0.log" Dec 05 13:20:44 crc kubenswrapper[4763]: I1205 13:20:44.189943 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m9dpb_cdc48b79-aeea-4cb0-a97f-4d265bb401f6/extract-content/0.log" Dec 05 13:20:44 crc kubenswrapper[4763]: I1205 13:20:44.198370 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-n95pv_9b919307-32f9-4abf-807f-86ef3b67ff55/registry-server/0.log" Dec 05 13:20:44 crc kubenswrapper[4763]: I1205 13:20:44.369947 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-cqpn4_8fc5438b-109a-4bf8-97a6-d5c49edbc395/marketplace-operator/0.log" Dec 05 13:20:44 crc kubenswrapper[4763]: I1205 13:20:44.593443 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6nvlh_1131b919-ad3a-4a36-a38b-de089ae44458/extract-utilities/0.log" Dec 05 13:20:44 crc kubenswrapper[4763]: I1205 13:20:44.793950 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6nvlh_1131b919-ad3a-4a36-a38b-de089ae44458/extract-utilities/0.log" Dec 05 13:20:44 crc kubenswrapper[4763]: I1205 13:20:44.801669 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6nvlh_1131b919-ad3a-4a36-a38b-de089ae44458/extract-content/0.log" Dec 05 13:20:44 crc kubenswrapper[4763]: I1205 13:20:44.857282 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6nvlh_1131b919-ad3a-4a36-a38b-de089ae44458/extract-content/0.log" Dec 05 13:20:45 crc kubenswrapper[4763]: I1205 13:20:45.018229 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6nvlh_1131b919-ad3a-4a36-a38b-de089ae44458/extract-content/0.log" Dec 05 13:20:45 crc kubenswrapper[4763]: I1205 13:20:45.023936 4763 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6nvlh_1131b919-ad3a-4a36-a38b-de089ae44458/extract-utilities/0.log" Dec 05 13:20:45 crc kubenswrapper[4763]: I1205 13:20:45.137305 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m9dpb_cdc48b79-aeea-4cb0-a97f-4d265bb401f6/registry-server/0.log" Dec 05 13:20:45 crc kubenswrapper[4763]: I1205 13:20:45.213302 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qr5ln_91b8e8ef-dab2-4d38-aeaa-7659945ef17e/extract-utilities/0.log" Dec 05 13:20:45 crc kubenswrapper[4763]: I1205 13:20:45.290161 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6nvlh_1131b919-ad3a-4a36-a38b-de089ae44458/registry-server/0.log" Dec 05 13:20:45 crc kubenswrapper[4763]: I1205 13:20:45.433892 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qr5ln_91b8e8ef-dab2-4d38-aeaa-7659945ef17e/extract-utilities/0.log" Dec 05 13:20:45 crc kubenswrapper[4763]: I1205 13:20:45.447589 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qr5ln_91b8e8ef-dab2-4d38-aeaa-7659945ef17e/extract-content/0.log" Dec 05 13:20:45 crc kubenswrapper[4763]: I1205 13:20:45.485178 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qr5ln_91b8e8ef-dab2-4d38-aeaa-7659945ef17e/extract-content/0.log" Dec 05 13:20:45 crc kubenswrapper[4763]: I1205 13:20:45.623625 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qr5ln_91b8e8ef-dab2-4d38-aeaa-7659945ef17e/extract-utilities/0.log" Dec 05 13:20:45 crc kubenswrapper[4763]: I1205 13:20:45.628968 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qr5ln_91b8e8ef-dab2-4d38-aeaa-7659945ef17e/extract-content/0.log" Dec 05 13:20:46 crc kubenswrapper[4763]: I1205 13:20:46.480373 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qr5ln_91b8e8ef-dab2-4d38-aeaa-7659945ef17e/registry-server/0.log" Dec 05 13:20:58 crc kubenswrapper[4763]: I1205 13:20:58.696515 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-zsfpr_e1ccdc7d-9781-4086-b0a7-7a777c943bcb/prometheus-operator/0.log" Dec 05 13:20:58 crc kubenswrapper[4763]: I1205 13:20:58.906623 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-747bcf56cd-56v8c_e723fc3f-3161-40a0-becd-a17210dbd266/prometheus-operator-admission-webhook/0.log" Dec 05 13:20:58 crc kubenswrapper[4763]: I1205 13:20:58.936459 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-747bcf56cd-c6mzm_e25a7208-54a3-4a23-a355-8bbd34b81ace/prometheus-operator-admission-webhook/0.log" Dec 05 13:20:59 crc kubenswrapper[4763]: I1205 13:20:59.113472 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-djkw9_f8052a23-847b-4419-af86-e56c327c367b/operator/0.log" Dec 05 13:20:59 crc kubenswrapper[4763]: I1205 13:20:59.155398 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-zx5r7_1603ef68-55d9-49dc-bbe4-93b129fe1b29/perses-operator/0.log" 
Dec 05 13:21:07 crc kubenswrapper[4763]: I1205 13:21:07.544435 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 13:21:07 crc kubenswrapper[4763]: I1205 13:21:07.544902 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 13:21:37 crc kubenswrapper[4763]: I1205 13:21:37.545505 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 13:21:37 crc kubenswrapper[4763]: I1205 13:21:37.546287 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 13:21:37 crc kubenswrapper[4763]: I1205 13:21:37.546409 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" Dec 05 13:21:37 crc kubenswrapper[4763]: I1205 13:21:37.549044 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d5a207a9aaaada6da735fff3ca1047890e1ffa0b0b0038e57f43b8011721d068"} pod="openshift-machine-config-operator/machine-config-daemon-xpgln" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 13:21:37 crc kubenswrapper[4763]: I1205 13:21:37.549178 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" containerID="cri-o://d5a207a9aaaada6da735fff3ca1047890e1ffa0b0b0038e57f43b8011721d068" gracePeriod=600 Dec 05 13:21:38 crc kubenswrapper[4763]: I1205 13:21:38.570960 4763 generic.go:334] "Generic (PLEG): container finished" podID="96338136-6831-49d0-9eb9-77d1205c6afb" containerID="d5a207a9aaaada6da735fff3ca1047890e1ffa0b0b0038e57f43b8011721d068" exitCode=0 Dec 05 13:21:38 crc kubenswrapper[4763]: I1205 13:21:38.571184 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerDied","Data":"d5a207a9aaaada6da735fff3ca1047890e1ffa0b0b0038e57f43b8011721d068"} Dec 05 13:21:38 crc kubenswrapper[4763]: I1205 13:21:38.571827 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerStarted","Data":"4fbfcf01e5e6efe52192419ee523afb90c3fb0f297785aa0b70f01b32f38e5b9"} Dec 05 13:21:38 crc kubenswrapper[4763]: I1205 13:21:38.571865 4763 scope.go:117] 
"RemoveContainer" containerID="667fb7d9fd133388c08f9883162f086ec3353ccdc0cda16fb2999b809ed9835f" Dec 05 13:22:52 crc kubenswrapper[4763]: I1205 13:22:52.454069 4763 generic.go:334] "Generic (PLEG): container finished" podID="95c5d8b4-59ac-42ec-971d-efef222bf2ae" containerID="de5e413d6883db2536b370fba6ec2c59716eff2f6647c449caec572e652433b2" exitCode=0 Dec 05 13:22:52 crc kubenswrapper[4763]: I1205 13:22:52.454148 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qnz8t/must-gather-xstqw" event={"ID":"95c5d8b4-59ac-42ec-971d-efef222bf2ae","Type":"ContainerDied","Data":"de5e413d6883db2536b370fba6ec2c59716eff2f6647c449caec572e652433b2"} Dec 05 13:22:52 crc kubenswrapper[4763]: I1205 13:22:52.455156 4763 scope.go:117] "RemoveContainer" containerID="de5e413d6883db2536b370fba6ec2c59716eff2f6647c449caec572e652433b2" Dec 05 13:22:53 crc kubenswrapper[4763]: I1205 13:22:53.082237 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qnz8t_must-gather-xstqw_95c5d8b4-59ac-42ec-971d-efef222bf2ae/gather/0.log" Dec 05 13:23:01 crc kubenswrapper[4763]: I1205 13:23:01.502598 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qnz8t/must-gather-xstqw"] Dec 05 13:23:01 crc kubenswrapper[4763]: I1205 13:23:01.503442 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-qnz8t/must-gather-xstqw" podUID="95c5d8b4-59ac-42ec-971d-efef222bf2ae" containerName="copy" containerID="cri-o://7a47c3b5a864b1ac257c0c38413aae921ce5acd6f3b20631e032e5ef753b493f" gracePeriod=2 Dec 05 13:23:01 crc kubenswrapper[4763]: I1205 13:23:01.515366 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qnz8t/must-gather-xstqw"] Dec 05 13:23:02 crc kubenswrapper[4763]: I1205 13:23:02.121958 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qnz8t_must-gather-xstqw_95c5d8b4-59ac-42ec-971d-efef222bf2ae/copy/0.log" Dec 05 13:23:02 crc kubenswrapper[4763]: I1205 13:23:02.122960 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qnz8t/must-gather-xstqw" Dec 05 13:23:02 crc kubenswrapper[4763]: I1205 13:23:02.192953 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/95c5d8b4-59ac-42ec-971d-efef222bf2ae-must-gather-output\") pod \"95c5d8b4-59ac-42ec-971d-efef222bf2ae\" (UID: \"95c5d8b4-59ac-42ec-971d-efef222bf2ae\") " Dec 05 13:23:02 crc kubenswrapper[4763]: I1205 13:23:02.193040 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljpm6\" (UniqueName: \"kubernetes.io/projected/95c5d8b4-59ac-42ec-971d-efef222bf2ae-kube-api-access-ljpm6\") pod \"95c5d8b4-59ac-42ec-971d-efef222bf2ae\" (UID: \"95c5d8b4-59ac-42ec-971d-efef222bf2ae\") " Dec 05 13:23:02 crc kubenswrapper[4763]: I1205 13:23:02.199228 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95c5d8b4-59ac-42ec-971d-efef222bf2ae-kube-api-access-ljpm6" (OuterVolumeSpecName: "kube-api-access-ljpm6") pod "95c5d8b4-59ac-42ec-971d-efef222bf2ae" (UID: "95c5d8b4-59ac-42ec-971d-efef222bf2ae"). InnerVolumeSpecName "kube-api-access-ljpm6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:23:02 crc kubenswrapper[4763]: I1205 13:23:02.295552 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljpm6\" (UniqueName: \"kubernetes.io/projected/95c5d8b4-59ac-42ec-971d-efef222bf2ae-kube-api-access-ljpm6\") on node \"crc\" DevicePath \"\"" Dec 05 13:23:02 crc kubenswrapper[4763]: I1205 13:23:02.368132 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95c5d8b4-59ac-42ec-971d-efef222bf2ae-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "95c5d8b4-59ac-42ec-971d-efef222bf2ae" (UID: "95c5d8b4-59ac-42ec-971d-efef222bf2ae"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:23:02 crc kubenswrapper[4763]: I1205 13:23:02.397317 4763 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/95c5d8b4-59ac-42ec-971d-efef222bf2ae-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 05 13:23:02 crc kubenswrapper[4763]: I1205 13:23:02.560657 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qnz8t_must-gather-xstqw_95c5d8b4-59ac-42ec-971d-efef222bf2ae/copy/0.log" Dec 05 13:23:02 crc kubenswrapper[4763]: I1205 13:23:02.561136 4763 generic.go:334] "Generic (PLEG): container finished" podID="95c5d8b4-59ac-42ec-971d-efef222bf2ae" containerID="7a47c3b5a864b1ac257c0c38413aae921ce5acd6f3b20631e032e5ef753b493f" exitCode=143 Dec 05 13:23:02 crc kubenswrapper[4763]: I1205 13:23:02.561218 4763 scope.go:117] "RemoveContainer" containerID="7a47c3b5a864b1ac257c0c38413aae921ce5acd6f3b20631e032e5ef753b493f" Dec 05 13:23:02 crc kubenswrapper[4763]: I1205 13:23:02.561431 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qnz8t/must-gather-xstqw" Dec 05 13:23:02 crc kubenswrapper[4763]: I1205 13:23:02.582906 4763 scope.go:117] "RemoveContainer" containerID="de5e413d6883db2536b370fba6ec2c59716eff2f6647c449caec572e652433b2" Dec 05 13:23:02 crc kubenswrapper[4763]: I1205 13:23:02.684703 4763 scope.go:117] "RemoveContainer" containerID="7a47c3b5a864b1ac257c0c38413aae921ce5acd6f3b20631e032e5ef753b493f" Dec 05 13:23:02 crc kubenswrapper[4763]: E1205 13:23:02.685102 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a47c3b5a864b1ac257c0c38413aae921ce5acd6f3b20631e032e5ef753b493f\": container with ID starting with 7a47c3b5a864b1ac257c0c38413aae921ce5acd6f3b20631e032e5ef753b493f not found: ID does not exist" containerID="7a47c3b5a864b1ac257c0c38413aae921ce5acd6f3b20631e032e5ef753b493f" Dec 05 13:23:02 crc kubenswrapper[4763]: I1205 13:23:02.685142 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a47c3b5a864b1ac257c0c38413aae921ce5acd6f3b20631e032e5ef753b493f"} err="failed to get container status \"7a47c3b5a864b1ac257c0c38413aae921ce5acd6f3b20631e032e5ef753b493f\": rpc error: code = NotFound desc = could not find container \"7a47c3b5a864b1ac257c0c38413aae921ce5acd6f3b20631e032e5ef753b493f\": container with ID starting with 7a47c3b5a864b1ac257c0c38413aae921ce5acd6f3b20631e032e5ef753b493f not found: ID does not exist" Dec 05 13:23:02 crc kubenswrapper[4763]: I1205 13:23:02.685169 4763 scope.go:117] "RemoveContainer" containerID="de5e413d6883db2536b370fba6ec2c59716eff2f6647c449caec572e652433b2" Dec 05 13:23:02 crc kubenswrapper[4763]: E1205 13:23:02.685486 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de5e413d6883db2536b370fba6ec2c59716eff2f6647c449caec572e652433b2\": container with ID starting with de5e413d6883db2536b370fba6ec2c59716eff2f6647c449caec572e652433b2 not found: ID does not exist" containerID="de5e413d6883db2536b370fba6ec2c59716eff2f6647c449caec572e652433b2" Dec 05 13:23:02 crc kubenswrapper[4763]: I1205 13:23:02.685529 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de5e413d6883db2536b370fba6ec2c59716eff2f6647c449caec572e652433b2"} err="failed to get container status \"de5e413d6883db2536b370fba6ec2c59716eff2f6647c449caec572e652433b2\": rpc error: code = NotFound desc = could not find container \"de5e413d6883db2536b370fba6ec2c59716eff2f6647c449caec572e652433b2\": container with ID starting with de5e413d6883db2536b370fba6ec2c59716eff2f6647c449caec572e652433b2 not found: ID does not exist" Dec 05 13:23:03 crc kubenswrapper[4763]: I1205 13:23:03.793966 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95c5d8b4-59ac-42ec-971d-efef222bf2ae" path="/var/lib/kubelet/pods/95c5d8b4-59ac-42ec-971d-efef222bf2ae/volumes" Dec 05 13:23:37 crc kubenswrapper[4763]: I1205 13:23:37.544225 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 13:23:37 crc kubenswrapper[4763]: I1205 13:23:37.544684 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" 
podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 13:23:54 crc kubenswrapper[4763]: I1205 13:23:54.369315 4763 scope.go:117] "RemoveContainer" containerID="9970d0d66cd57dbcb398268322374a193ed6615a7278dfc54831cf1ea5f9aa77" Dec 05 13:24:07 crc kubenswrapper[4763]: I1205 13:24:07.544382 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 13:24:07 crc kubenswrapper[4763]: I1205 13:24:07.544967 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 13:24:37 crc kubenswrapper[4763]: I1205 13:24:37.543924 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 13:24:37 crc kubenswrapper[4763]: I1205 13:24:37.544590 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 13:24:37 crc kubenswrapper[4763]: I1205 13:24:37.544648 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" Dec 05 13:24:37 crc kubenswrapper[4763]: I1205 13:24:37.545633 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4fbfcf01e5e6efe52192419ee523afb90c3fb0f297785aa0b70f01b32f38e5b9"} pod="openshift-machine-config-operator/machine-config-daemon-xpgln" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 13:24:37 crc kubenswrapper[4763]: I1205 13:24:37.545708 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" containerID="cri-o://4fbfcf01e5e6efe52192419ee523afb90c3fb0f297785aa0b70f01b32f38e5b9" gracePeriod=600 Dec 05 13:24:37 crc kubenswrapper[4763]: E1205 13:24:37.671175 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:24:38 crc kubenswrapper[4763]: I1205 13:24:38.578164 4763 generic.go:334] "Generic (PLEG): container finished" 
podID="96338136-6831-49d0-9eb9-77d1205c6afb" containerID="4fbfcf01e5e6efe52192419ee523afb90c3fb0f297785aa0b70f01b32f38e5b9" exitCode=0 Dec 05 13:24:38 crc kubenswrapper[4763]: I1205 13:24:38.578219 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerDied","Data":"4fbfcf01e5e6efe52192419ee523afb90c3fb0f297785aa0b70f01b32f38e5b9"} Dec 05 13:24:38 crc kubenswrapper[4763]: I1205 13:24:38.578262 4763 scope.go:117] "RemoveContainer" containerID="d5a207a9aaaada6da735fff3ca1047890e1ffa0b0b0038e57f43b8011721d068" Dec 05 13:24:38 crc kubenswrapper[4763]: I1205 13:24:38.578866 4763 scope.go:117] "RemoveContainer" containerID="4fbfcf01e5e6efe52192419ee523afb90c3fb0f297785aa0b70f01b32f38e5b9" Dec 05 13:24:38 crc kubenswrapper[4763]: E1205 13:24:38.579410 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:24:53 crc kubenswrapper[4763]: I1205 13:24:53.785752 4763 scope.go:117] "RemoveContainer" containerID="4fbfcf01e5e6efe52192419ee523afb90c3fb0f297785aa0b70f01b32f38e5b9" Dec 05 13:24:53 crc kubenswrapper[4763]: E1205 13:24:53.786848 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:25:06 crc kubenswrapper[4763]: I1205 13:25:06.783547 4763 scope.go:117] "RemoveContainer" containerID="4fbfcf01e5e6efe52192419ee523afb90c3fb0f297785aa0b70f01b32f38e5b9" Dec 05 13:25:06 crc kubenswrapper[4763]: E1205 13:25:06.784149 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:25:17 crc kubenswrapper[4763]: I1205 13:25:17.785284 4763 scope.go:117] "RemoveContainer" containerID="4fbfcf01e5e6efe52192419ee523afb90c3fb0f297785aa0b70f01b32f38e5b9" Dec 05 13:25:17 crc kubenswrapper[4763]: E1205 13:25:17.786493 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:25:29 crc kubenswrapper[4763]: I1205 13:25:29.784291 4763 scope.go:117] "RemoveContainer" containerID="4fbfcf01e5e6efe52192419ee523afb90c3fb0f297785aa0b70f01b32f38e5b9" Dec 05 13:25:29 crc 
kubenswrapper[4763]: E1205 13:25:29.784936 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:25:42 crc kubenswrapper[4763]: I1205 13:25:42.784253 4763 scope.go:117] "RemoveContainer" containerID="4fbfcf01e5e6efe52192419ee523afb90c3fb0f297785aa0b70f01b32f38e5b9" Dec 05 13:25:42 crc kubenswrapper[4763]: E1205 13:25:42.785307 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:25:54 crc kubenswrapper[4763]: I1205 13:25:54.787233 4763 scope.go:117] "RemoveContainer" containerID="4fbfcf01e5e6efe52192419ee523afb90c3fb0f297785aa0b70f01b32f38e5b9" Dec 05 13:25:54 crc kubenswrapper[4763]: E1205 13:25:54.788600 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:26:07 crc kubenswrapper[4763]: I1205 13:26:07.585226 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t8jcn"] Dec 05 13:26:07 crc kubenswrapper[4763]: E1205 13:26:07.586466 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51dd044c-e4e0-4273-9af1-f30eb7139f03" containerName="registry-server" Dec 05 13:26:07 crc kubenswrapper[4763]: I1205 13:26:07.586489 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="51dd044c-e4e0-4273-9af1-f30eb7139f03" containerName="registry-server" Dec 05 13:26:07 crc kubenswrapper[4763]: E1205 13:26:07.586524 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d61dd45-0ea0-4581-9325-b42539c14249" containerName="extract-content" Dec 05 13:26:07 crc kubenswrapper[4763]: I1205 13:26:07.586538 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d61dd45-0ea0-4581-9325-b42539c14249" containerName="extract-content" Dec 05 13:26:07 crc kubenswrapper[4763]: E1205 13:26:07.586570 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="795850cc-187b-4d22-b8bc-e9a88d2c5807" containerName="extract-utilities" Dec 05 13:26:07 crc kubenswrapper[4763]: I1205 13:26:07.586584 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="795850cc-187b-4d22-b8bc-e9a88d2c5807" containerName="extract-utilities" Dec 05 13:26:07 crc kubenswrapper[4763]: E1205 13:26:07.586599 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d61dd45-0ea0-4581-9325-b42539c14249" containerName="extract-utilities" Dec 05 13:26:07 crc kubenswrapper[4763]: I1205 13:26:07.586612 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d61dd45-0ea0-4581-9325-b42539c14249" 
containerName="extract-utilities" Dec 05 13:26:07 crc kubenswrapper[4763]: E1205 13:26:07.586645 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95c5d8b4-59ac-42ec-971d-efef222bf2ae" containerName="copy" Dec 05 13:26:07 crc kubenswrapper[4763]: I1205 13:26:07.586657 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="95c5d8b4-59ac-42ec-971d-efef222bf2ae" containerName="copy" Dec 05 13:26:07 crc kubenswrapper[4763]: E1205 13:26:07.586676 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51dd044c-e4e0-4273-9af1-f30eb7139f03" containerName="extract-content" Dec 05 13:26:07 crc kubenswrapper[4763]: I1205 13:26:07.586690 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="51dd044c-e4e0-4273-9af1-f30eb7139f03" containerName="extract-content" Dec 05 13:26:07 crc kubenswrapper[4763]: E1205 13:26:07.586719 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="795850cc-187b-4d22-b8bc-e9a88d2c5807" containerName="extract-content" Dec 05 13:26:07 crc kubenswrapper[4763]: I1205 13:26:07.586733 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="795850cc-187b-4d22-b8bc-e9a88d2c5807" containerName="extract-content" Dec 05 13:26:07 crc kubenswrapper[4763]: E1205 13:26:07.586748 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="795850cc-187b-4d22-b8bc-e9a88d2c5807" containerName="registry-server" Dec 05 13:26:07 crc kubenswrapper[4763]: I1205 13:26:07.586767 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="795850cc-187b-4d22-b8bc-e9a88d2c5807" containerName="registry-server" Dec 05 13:26:07 crc kubenswrapper[4763]: E1205 13:26:07.586837 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51dd044c-e4e0-4273-9af1-f30eb7139f03" containerName="extract-utilities" Dec 05 13:26:07 crc kubenswrapper[4763]: I1205 13:26:07.586853 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="51dd044c-e4e0-4273-9af1-f30eb7139f03" containerName="extract-utilities" Dec 05 13:26:07 crc kubenswrapper[4763]: E1205 13:26:07.586874 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d61dd45-0ea0-4581-9325-b42539c14249" containerName="registry-server" Dec 05 13:26:07 crc kubenswrapper[4763]: I1205 13:26:07.586887 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d61dd45-0ea0-4581-9325-b42539c14249" containerName="registry-server" Dec 05 13:26:07 crc kubenswrapper[4763]: E1205 13:26:07.586911 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95c5d8b4-59ac-42ec-971d-efef222bf2ae" containerName="gather" Dec 05 13:26:07 crc kubenswrapper[4763]: I1205 13:26:07.586924 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="95c5d8b4-59ac-42ec-971d-efef222bf2ae" containerName="gather" Dec 05 13:26:07 crc kubenswrapper[4763]: I1205 13:26:07.587262 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="51dd044c-e4e0-4273-9af1-f30eb7139f03" containerName="registry-server" Dec 05 13:26:07 crc kubenswrapper[4763]: I1205 13:26:07.587289 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="795850cc-187b-4d22-b8bc-e9a88d2c5807" containerName="registry-server" Dec 05 13:26:07 crc kubenswrapper[4763]: I1205 13:26:07.587320 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d61dd45-0ea0-4581-9325-b42539c14249" containerName="registry-server" Dec 05 13:26:07 crc kubenswrapper[4763]: I1205 13:26:07.587356 4763 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="95c5d8b4-59ac-42ec-971d-efef222bf2ae" containerName="copy" Dec 05 13:26:07 crc kubenswrapper[4763]: I1205 13:26:07.587388 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="95c5d8b4-59ac-42ec-971d-efef222bf2ae" containerName="gather" Dec 05 13:26:07 crc kubenswrapper[4763]: I1205 13:26:07.590106 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t8jcn" Dec 05 13:26:07 crc kubenswrapper[4763]: I1205 13:26:07.600729 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t8jcn"] Dec 05 13:26:07 crc kubenswrapper[4763]: I1205 13:26:07.642805 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/860d9191-22d9-44c1-b0ff-de810ebd1394-catalog-content\") pod \"redhat-operators-t8jcn\" (UID: \"860d9191-22d9-44c1-b0ff-de810ebd1394\") " pod="openshift-marketplace/redhat-operators-t8jcn" Dec 05 13:26:07 crc kubenswrapper[4763]: I1205 13:26:07.642933 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lxpg\" (UniqueName: \"kubernetes.io/projected/860d9191-22d9-44c1-b0ff-de810ebd1394-kube-api-access-5lxpg\") pod \"redhat-operators-t8jcn\" (UID: \"860d9191-22d9-44c1-b0ff-de810ebd1394\") " pod="openshift-marketplace/redhat-operators-t8jcn" Dec 05 13:26:07 crc kubenswrapper[4763]: I1205 13:26:07.643029 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/860d9191-22d9-44c1-b0ff-de810ebd1394-utilities\") pod \"redhat-operators-t8jcn\" (UID: \"860d9191-22d9-44c1-b0ff-de810ebd1394\") " pod="openshift-marketplace/redhat-operators-t8jcn" Dec 05 13:26:07 crc kubenswrapper[4763]: I1205 13:26:07.745093 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/860d9191-22d9-44c1-b0ff-de810ebd1394-catalog-content\") pod \"redhat-operators-t8jcn\" (UID: \"860d9191-22d9-44c1-b0ff-de810ebd1394\") " pod="openshift-marketplace/redhat-operators-t8jcn" Dec 05 13:26:07 crc kubenswrapper[4763]: I1205 13:26:07.745189 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lxpg\" (UniqueName: \"kubernetes.io/projected/860d9191-22d9-44c1-b0ff-de810ebd1394-kube-api-access-5lxpg\") pod \"redhat-operators-t8jcn\" (UID: \"860d9191-22d9-44c1-b0ff-de810ebd1394\") " pod="openshift-marketplace/redhat-operators-t8jcn" Dec 05 13:26:07 crc kubenswrapper[4763]: I1205 13:26:07.745261 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/860d9191-22d9-44c1-b0ff-de810ebd1394-utilities\") pod \"redhat-operators-t8jcn\" (UID: \"860d9191-22d9-44c1-b0ff-de810ebd1394\") " pod="openshift-marketplace/redhat-operators-t8jcn" Dec 05 13:26:07 crc kubenswrapper[4763]: I1205 13:26:07.745678 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/860d9191-22d9-44c1-b0ff-de810ebd1394-catalog-content\") pod \"redhat-operators-t8jcn\" (UID: \"860d9191-22d9-44c1-b0ff-de810ebd1394\") " pod="openshift-marketplace/redhat-operators-t8jcn" Dec 05 13:26:07 crc kubenswrapper[4763]: I1205 13:26:07.745705 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/860d9191-22d9-44c1-b0ff-de810ebd1394-utilities\") pod \"redhat-operators-t8jcn\" (UID: \"860d9191-22d9-44c1-b0ff-de810ebd1394\") " pod="openshift-marketplace/redhat-operators-t8jcn" Dec 05 13:26:07 crc kubenswrapper[4763]: I1205 13:26:07.777485 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lxpg\" (UniqueName: \"kubernetes.io/projected/860d9191-22d9-44c1-b0ff-de810ebd1394-kube-api-access-5lxpg\") pod \"redhat-operators-t8jcn\" (UID: \"860d9191-22d9-44c1-b0ff-de810ebd1394\") " pod="openshift-marketplace/redhat-operators-t8jcn" Dec 05 13:26:07 crc kubenswrapper[4763]: I1205 13:26:07.924198 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t8jcn" Dec 05 13:26:08 crc kubenswrapper[4763]: I1205 13:26:08.390663 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t8jcn"] Dec 05 13:26:08 crc kubenswrapper[4763]: W1205 13:26:08.395532 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod860d9191_22d9_44c1_b0ff_de810ebd1394.slice/crio-ae03fc9775e74b021f48f9e85d1408645a92b257b43812b67487463418498cf6 WatchSource:0}: Error finding container ae03fc9775e74b021f48f9e85d1408645a92b257b43812b67487463418498cf6: Status 404 returned error can't find the container with id ae03fc9775e74b021f48f9e85d1408645a92b257b43812b67487463418498cf6 Dec 05 13:26:08 crc kubenswrapper[4763]: I1205 13:26:08.609999 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t8jcn" event={"ID":"860d9191-22d9-44c1-b0ff-de810ebd1394","Type":"ContainerStarted","Data":"ae03fc9775e74b021f48f9e85d1408645a92b257b43812b67487463418498cf6"} Dec 05 13:26:08 crc kubenswrapper[4763]: I1205 13:26:08.783856 4763 scope.go:117] "RemoveContainer" containerID="4fbfcf01e5e6efe52192419ee523afb90c3fb0f297785aa0b70f01b32f38e5b9" Dec 05 13:26:08 crc kubenswrapper[4763]: E1205 13:26:08.784274 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:26:09 crc kubenswrapper[4763]: I1205 13:26:09.624949 4763 generic.go:334] "Generic (PLEG): container finished" podID="860d9191-22d9-44c1-b0ff-de810ebd1394" containerID="e4f50d13b38c8b6df208316ff72de9c2885ca2a9c913a8088257475b9208a153" exitCode=0 Dec 05 13:26:09 crc kubenswrapper[4763]: I1205 13:26:09.625011 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t8jcn" event={"ID":"860d9191-22d9-44c1-b0ff-de810ebd1394","Type":"ContainerDied","Data":"e4f50d13b38c8b6df208316ff72de9c2885ca2a9c913a8088257475b9208a153"} Dec 05 13:26:09 crc kubenswrapper[4763]: I1205 13:26:09.628059 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 13:26:11 crc kubenswrapper[4763]: I1205 13:26:11.651674 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t8jcn" 
event={"ID":"860d9191-22d9-44c1-b0ff-de810ebd1394","Type":"ContainerStarted","Data":"fa3b9880cfb534677a2df60181ae919b83e22b729dce95f57f36448a747c8d37"} Dec 05 13:26:13 crc kubenswrapper[4763]: I1205 13:26:13.685067 4763 generic.go:334] "Generic (PLEG): container finished" podID="860d9191-22d9-44c1-b0ff-de810ebd1394" containerID="fa3b9880cfb534677a2df60181ae919b83e22b729dce95f57f36448a747c8d37" exitCode=0 Dec 05 13:26:13 crc kubenswrapper[4763]: I1205 13:26:13.685232 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t8jcn" event={"ID":"860d9191-22d9-44c1-b0ff-de810ebd1394","Type":"ContainerDied","Data":"fa3b9880cfb534677a2df60181ae919b83e22b729dce95f57f36448a747c8d37"} Dec 05 13:26:23 crc kubenswrapper[4763]: I1205 13:26:23.784718 4763 scope.go:117] "RemoveContainer" containerID="4fbfcf01e5e6efe52192419ee523afb90c3fb0f297785aa0b70f01b32f38e5b9" Dec 05 13:26:23 crc kubenswrapper[4763]: E1205 13:26:23.786035 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:26:27 crc kubenswrapper[4763]: I1205 13:26:27.647643 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hksp6/must-gather-ncx8z"] Dec 05 13:26:27 crc kubenswrapper[4763]: I1205 13:26:27.649905 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hksp6/must-gather-ncx8z" Dec 05 13:26:27 crc kubenswrapper[4763]: I1205 13:26:27.651896 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-hksp6"/"openshift-service-ca.crt" Dec 05 13:26:27 crc kubenswrapper[4763]: I1205 13:26:27.652228 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-hksp6"/"kube-root-ca.crt" Dec 05 13:26:27 crc kubenswrapper[4763]: I1205 13:26:27.652622 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-hksp6"/"default-dockercfg-2h2ws" Dec 05 13:26:27 crc kubenswrapper[4763]: I1205 13:26:27.667228 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hksp6/must-gather-ncx8z"] Dec 05 13:26:27 crc kubenswrapper[4763]: I1205 13:26:27.690517 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ffffebf6-452b-4be3-bfbf-8b1d1a6c4533-must-gather-output\") pod \"must-gather-ncx8z\" (UID: \"ffffebf6-452b-4be3-bfbf-8b1d1a6c4533\") " pod="openshift-must-gather-hksp6/must-gather-ncx8z" Dec 05 13:26:27 crc kubenswrapper[4763]: I1205 13:26:27.690923 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zchqh\" (UniqueName: \"kubernetes.io/projected/ffffebf6-452b-4be3-bfbf-8b1d1a6c4533-kube-api-access-zchqh\") pod \"must-gather-ncx8z\" (UID: \"ffffebf6-452b-4be3-bfbf-8b1d1a6c4533\") " pod="openshift-must-gather-hksp6/must-gather-ncx8z" Dec 05 13:26:27 crc kubenswrapper[4763]: I1205 13:26:27.793170 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/ffffebf6-452b-4be3-bfbf-8b1d1a6c4533-must-gather-output\") pod \"must-gather-ncx8z\" (UID: \"ffffebf6-452b-4be3-bfbf-8b1d1a6c4533\") " pod="openshift-must-gather-hksp6/must-gather-ncx8z" Dec 05 13:26:27 crc kubenswrapper[4763]: I1205 13:26:27.793386 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zchqh\" (UniqueName: \"kubernetes.io/projected/ffffebf6-452b-4be3-bfbf-8b1d1a6c4533-kube-api-access-zchqh\") pod \"must-gather-ncx8z\" (UID: \"ffffebf6-452b-4be3-bfbf-8b1d1a6c4533\") " pod="openshift-must-gather-hksp6/must-gather-ncx8z" Dec 05 13:26:27 crc kubenswrapper[4763]: I1205 13:26:27.793655 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ffffebf6-452b-4be3-bfbf-8b1d1a6c4533-must-gather-output\") pod \"must-gather-ncx8z\" (UID: \"ffffebf6-452b-4be3-bfbf-8b1d1a6c4533\") " pod="openshift-must-gather-hksp6/must-gather-ncx8z" Dec 05 13:26:27 crc kubenswrapper[4763]: I1205 13:26:27.825390 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zchqh\" (UniqueName: \"kubernetes.io/projected/ffffebf6-452b-4be3-bfbf-8b1d1a6c4533-kube-api-access-zchqh\") pod \"must-gather-ncx8z\" (UID: \"ffffebf6-452b-4be3-bfbf-8b1d1a6c4533\") " pod="openshift-must-gather-hksp6/must-gather-ncx8z" Dec 05 13:26:27 crc kubenswrapper[4763]: I1205 13:26:27.976501 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hksp6/must-gather-ncx8z" Dec 05 13:26:28 crc kubenswrapper[4763]: I1205 13:26:28.455247 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hksp6/must-gather-ncx8z"] Dec 05 13:26:28 crc kubenswrapper[4763]: I1205 13:26:28.895658 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hksp6/must-gather-ncx8z" event={"ID":"ffffebf6-452b-4be3-bfbf-8b1d1a6c4533","Type":"ContainerStarted","Data":"e96f69543577c842b0bab40c02e67d083c0beeeada751c42eecb08e358c1b4b3"} Dec 05 13:26:30 crc kubenswrapper[4763]: I1205 13:26:30.956231 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hksp6/must-gather-ncx8z" event={"ID":"ffffebf6-452b-4be3-bfbf-8b1d1a6c4533","Type":"ContainerStarted","Data":"07fda1d92b4d191beb7a0fc53089db00bb92eaa09ba7c94bc8db6b73d809dbfc"} Dec 05 13:26:32 crc kubenswrapper[4763]: I1205 13:26:32.978014 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hksp6/must-gather-ncx8z" event={"ID":"ffffebf6-452b-4be3-bfbf-8b1d1a6c4533","Type":"ContainerStarted","Data":"f5fc28e02ad48f2b95fc4e8158479b1fb8d1cadfd4d9803f6d39432f9aafd761"} Dec 05 13:26:32 crc kubenswrapper[4763]: I1205 13:26:32.981917 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t8jcn" event={"ID":"860d9191-22d9-44c1-b0ff-de810ebd1394","Type":"ContainerStarted","Data":"d7dc8d916d1e036794e2d0afac0044c40ba4fc13f42af7aad18dabb6c2e2d1f2"} Dec 05 13:26:33 crc kubenswrapper[4763]: I1205 13:26:33.002226 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t8jcn" podStartSLOduration=3.926362999 podStartE2EDuration="26.002210153s" podCreationTimestamp="2025-12-05 13:26:07 +0000 UTC" firstStartedPulling="2025-12-05 13:26:09.627707078 +0000 UTC m=+5854.120421811" lastFinishedPulling="2025-12-05 13:26:31.703554242 +0000 UTC m=+5876.196268965" observedRunningTime="2025-12-05 
13:26:32.997490197 +0000 UTC m=+5877.490204920" watchObservedRunningTime="2025-12-05 13:26:33.002210153 +0000 UTC m=+5877.494924876" Dec 05 13:26:33 crc kubenswrapper[4763]: I1205 13:26:33.985274 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hksp6/crc-debug-tkkr2"] Dec 05 13:26:33 crc kubenswrapper[4763]: I1205 13:26:33.987548 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hksp6/crc-debug-tkkr2" Dec 05 13:26:34 crc kubenswrapper[4763]: I1205 13:26:34.026652 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a861dce-28f8-4a68-9215-3ec9dc2b0727-host\") pod \"crc-debug-tkkr2\" (UID: \"8a861dce-28f8-4a68-9215-3ec9dc2b0727\") " pod="openshift-must-gather-hksp6/crc-debug-tkkr2" Dec 05 13:26:34 crc kubenswrapper[4763]: I1205 13:26:34.027142 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vsd6\" (UniqueName: \"kubernetes.io/projected/8a861dce-28f8-4a68-9215-3ec9dc2b0727-kube-api-access-2vsd6\") pod \"crc-debug-tkkr2\" (UID: \"8a861dce-28f8-4a68-9215-3ec9dc2b0727\") " pod="openshift-must-gather-hksp6/crc-debug-tkkr2" Dec 05 13:26:34 crc kubenswrapper[4763]: I1205 13:26:34.129372 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vsd6\" (UniqueName: \"kubernetes.io/projected/8a861dce-28f8-4a68-9215-3ec9dc2b0727-kube-api-access-2vsd6\") pod \"crc-debug-tkkr2\" (UID: \"8a861dce-28f8-4a68-9215-3ec9dc2b0727\") " pod="openshift-must-gather-hksp6/crc-debug-tkkr2" Dec 05 13:26:34 crc kubenswrapper[4763]: I1205 13:26:34.129623 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a861dce-28f8-4a68-9215-3ec9dc2b0727-host\") pod \"crc-debug-tkkr2\" (UID: \"8a861dce-28f8-4a68-9215-3ec9dc2b0727\") " pod="openshift-must-gather-hksp6/crc-debug-tkkr2" Dec 05 13:26:34 crc kubenswrapper[4763]: I1205 13:26:34.129761 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a861dce-28f8-4a68-9215-3ec9dc2b0727-host\") pod \"crc-debug-tkkr2\" (UID: \"8a861dce-28f8-4a68-9215-3ec9dc2b0727\") " pod="openshift-must-gather-hksp6/crc-debug-tkkr2" Dec 05 13:26:34 crc kubenswrapper[4763]: I1205 13:26:34.154621 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vsd6\" (UniqueName: \"kubernetes.io/projected/8a861dce-28f8-4a68-9215-3ec9dc2b0727-kube-api-access-2vsd6\") pod \"crc-debug-tkkr2\" (UID: \"8a861dce-28f8-4a68-9215-3ec9dc2b0727\") " pod="openshift-must-gather-hksp6/crc-debug-tkkr2" Dec 05 13:26:34 crc kubenswrapper[4763]: I1205 13:26:34.305054 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hksp6/crc-debug-tkkr2" Dec 05 13:26:35 crc kubenswrapper[4763]: I1205 13:26:35.005293 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hksp6/crc-debug-tkkr2" event={"ID":"8a861dce-28f8-4a68-9215-3ec9dc2b0727","Type":"ContainerStarted","Data":"d06054db88ee6e91310ac385615ba56df10973b9e7f567c9da4f85cdcc0c2299"} Dec 05 13:26:35 crc kubenswrapper[4763]: I1205 13:26:35.005838 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hksp6/crc-debug-tkkr2" event={"ID":"8a861dce-28f8-4a68-9215-3ec9dc2b0727","Type":"ContainerStarted","Data":"873c5089110f4f850e7668b42931f703c0570321c0fb44885f3fad63c2732a9a"} Dec 05 13:26:35 crc kubenswrapper[4763]: I1205 13:26:35.019291 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hksp6/must-gather-ncx8z" podStartSLOduration=8.01927952 podStartE2EDuration="8.01927952s" podCreationTimestamp="2025-12-05 13:26:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:26:34.029230604 +0000 UTC m=+5878.521945327" watchObservedRunningTime="2025-12-05 13:26:35.01927952 +0000 UTC m=+5879.511994243" Dec 05 13:26:35 crc kubenswrapper[4763]: I1205 13:26:35.022333 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hksp6/crc-debug-tkkr2" podStartSLOduration=2.022326752 podStartE2EDuration="2.022326752s" podCreationTimestamp="2025-12-05 13:26:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:26:35.015586701 +0000 UTC m=+5879.508301424" watchObservedRunningTime="2025-12-05 13:26:35.022326752 +0000 UTC m=+5879.515041475" Dec 05 13:26:37 crc kubenswrapper[4763]: I1205 13:26:37.925359 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t8jcn" Dec 05 13:26:37 crc kubenswrapper[4763]: I1205 13:26:37.927003 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t8jcn" Dec 05 13:26:37 crc kubenswrapper[4763]: I1205 13:26:37.977861 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t8jcn" Dec 05 13:26:38 crc kubenswrapper[4763]: I1205 13:26:38.086331 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t8jcn" Dec 05 13:26:38 crc kubenswrapper[4763]: I1205 13:26:38.785106 4763 scope.go:117] "RemoveContainer" containerID="4fbfcf01e5e6efe52192419ee523afb90c3fb0f297785aa0b70f01b32f38e5b9" Dec 05 13:26:38 crc kubenswrapper[4763]: E1205 13:26:38.785821 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:26:38 crc kubenswrapper[4763]: I1205 13:26:38.786185 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t8jcn"] Dec 05 13:26:40 crc kubenswrapper[4763]: I1205 13:26:40.055667 4763 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t8jcn" podUID="860d9191-22d9-44c1-b0ff-de810ebd1394" containerName="registry-server" containerID="cri-o://d7dc8d916d1e036794e2d0afac0044c40ba4fc13f42af7aad18dabb6c2e2d1f2" gracePeriod=2 Dec 05 13:26:40 crc kubenswrapper[4763]: I1205 13:26:40.570629 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t8jcn" Dec 05 13:26:40 crc kubenswrapper[4763]: I1205 13:26:40.754083 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/860d9191-22d9-44c1-b0ff-de810ebd1394-utilities\") pod \"860d9191-22d9-44c1-b0ff-de810ebd1394\" (UID: \"860d9191-22d9-44c1-b0ff-de810ebd1394\") " Dec 05 13:26:40 crc kubenswrapper[4763]: I1205 13:26:40.754184 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/860d9191-22d9-44c1-b0ff-de810ebd1394-catalog-content\") pod \"860d9191-22d9-44c1-b0ff-de810ebd1394\" (UID: \"860d9191-22d9-44c1-b0ff-de810ebd1394\") " Dec 05 13:26:40 crc kubenswrapper[4763]: I1205 13:26:40.754292 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lxpg\" (UniqueName: \"kubernetes.io/projected/860d9191-22d9-44c1-b0ff-de810ebd1394-kube-api-access-5lxpg\") pod \"860d9191-22d9-44c1-b0ff-de810ebd1394\" (UID: \"860d9191-22d9-44c1-b0ff-de810ebd1394\") " Dec 05 13:26:40 crc kubenswrapper[4763]: I1205 13:26:40.755121 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/860d9191-22d9-44c1-b0ff-de810ebd1394-utilities" (OuterVolumeSpecName: "utilities") pod "860d9191-22d9-44c1-b0ff-de810ebd1394" (UID: "860d9191-22d9-44c1-b0ff-de810ebd1394"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:26:40 crc kubenswrapper[4763]: I1205 13:26:40.765004 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/860d9191-22d9-44c1-b0ff-de810ebd1394-kube-api-access-5lxpg" (OuterVolumeSpecName: "kube-api-access-5lxpg") pod "860d9191-22d9-44c1-b0ff-de810ebd1394" (UID: "860d9191-22d9-44c1-b0ff-de810ebd1394"). InnerVolumeSpecName "kube-api-access-5lxpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:26:40 crc kubenswrapper[4763]: I1205 13:26:40.856476 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lxpg\" (UniqueName: \"kubernetes.io/projected/860d9191-22d9-44c1-b0ff-de810ebd1394-kube-api-access-5lxpg\") on node \"crc\" DevicePath \"\"" Dec 05 13:26:40 crc kubenswrapper[4763]: I1205 13:26:40.856515 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/860d9191-22d9-44c1-b0ff-de810ebd1394-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 13:26:40 crc kubenswrapper[4763]: I1205 13:26:40.864154 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/860d9191-22d9-44c1-b0ff-de810ebd1394-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "860d9191-22d9-44c1-b0ff-de810ebd1394" (UID: "860d9191-22d9-44c1-b0ff-de810ebd1394"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:26:40 crc kubenswrapper[4763]: I1205 13:26:40.958085 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/860d9191-22d9-44c1-b0ff-de810ebd1394-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 13:26:41 crc kubenswrapper[4763]: I1205 13:26:41.065349 4763 generic.go:334] "Generic (PLEG): container finished" podID="860d9191-22d9-44c1-b0ff-de810ebd1394" containerID="d7dc8d916d1e036794e2d0afac0044c40ba4fc13f42af7aad18dabb6c2e2d1f2" exitCode=0 Dec 05 13:26:41 crc kubenswrapper[4763]: I1205 13:26:41.065390 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t8jcn" event={"ID":"860d9191-22d9-44c1-b0ff-de810ebd1394","Type":"ContainerDied","Data":"d7dc8d916d1e036794e2d0afac0044c40ba4fc13f42af7aad18dabb6c2e2d1f2"} Dec 05 13:26:41 crc kubenswrapper[4763]: I1205 13:26:41.065440 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t8jcn" Dec 05 13:26:41 crc kubenswrapper[4763]: I1205 13:26:41.065452 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t8jcn" event={"ID":"860d9191-22d9-44c1-b0ff-de810ebd1394","Type":"ContainerDied","Data":"ae03fc9775e74b021f48f9e85d1408645a92b257b43812b67487463418498cf6"} Dec 05 13:26:41 crc kubenswrapper[4763]: I1205 13:26:41.065473 4763 scope.go:117] "RemoveContainer" containerID="d7dc8d916d1e036794e2d0afac0044c40ba4fc13f42af7aad18dabb6c2e2d1f2" Dec 05 13:26:41 crc kubenswrapper[4763]: I1205 13:26:41.089444 4763 scope.go:117] "RemoveContainer" containerID="fa3b9880cfb534677a2df60181ae919b83e22b729dce95f57f36448a747c8d37" Dec 05 13:26:41 crc kubenswrapper[4763]: I1205 13:26:41.112316 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t8jcn"] Dec 05 13:26:41 crc kubenswrapper[4763]: I1205 13:26:41.124685 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t8jcn"] Dec 05 13:26:41 crc kubenswrapper[4763]: I1205 13:26:41.133619 4763 scope.go:117] "RemoveContainer" containerID="e4f50d13b38c8b6df208316ff72de9c2885ca2a9c913a8088257475b9208a153" Dec 05 13:26:41 crc kubenswrapper[4763]: I1205 13:26:41.172214 4763 scope.go:117] "RemoveContainer" containerID="d7dc8d916d1e036794e2d0afac0044c40ba4fc13f42af7aad18dabb6c2e2d1f2" Dec 05 13:26:41 crc kubenswrapper[4763]: E1205 13:26:41.172657 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7dc8d916d1e036794e2d0afac0044c40ba4fc13f42af7aad18dabb6c2e2d1f2\": container with ID starting with d7dc8d916d1e036794e2d0afac0044c40ba4fc13f42af7aad18dabb6c2e2d1f2 not found: ID does not exist" containerID="d7dc8d916d1e036794e2d0afac0044c40ba4fc13f42af7aad18dabb6c2e2d1f2" Dec 05 13:26:41 crc kubenswrapper[4763]: I1205 13:26:41.172705 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7dc8d916d1e036794e2d0afac0044c40ba4fc13f42af7aad18dabb6c2e2d1f2"} err="failed to get container status \"d7dc8d916d1e036794e2d0afac0044c40ba4fc13f42af7aad18dabb6c2e2d1f2\": rpc error: code = NotFound desc = could not find container \"d7dc8d916d1e036794e2d0afac0044c40ba4fc13f42af7aad18dabb6c2e2d1f2\": container with ID starting with d7dc8d916d1e036794e2d0afac0044c40ba4fc13f42af7aad18dabb6c2e2d1f2 not found: ID does not exist" Dec 05 13:26:41 crc 
kubenswrapper[4763]: I1205 13:26:41.172738 4763 scope.go:117] "RemoveContainer" containerID="fa3b9880cfb534677a2df60181ae919b83e22b729dce95f57f36448a747c8d37" Dec 05 13:26:41 crc kubenswrapper[4763]: E1205 13:26:41.173120 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa3b9880cfb534677a2df60181ae919b83e22b729dce95f57f36448a747c8d37\": container with ID starting with fa3b9880cfb534677a2df60181ae919b83e22b729dce95f57f36448a747c8d37 not found: ID does not exist" containerID="fa3b9880cfb534677a2df60181ae919b83e22b729dce95f57f36448a747c8d37" Dec 05 13:26:41 crc kubenswrapper[4763]: I1205 13:26:41.173145 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa3b9880cfb534677a2df60181ae919b83e22b729dce95f57f36448a747c8d37"} err="failed to get container status \"fa3b9880cfb534677a2df60181ae919b83e22b729dce95f57f36448a747c8d37\": rpc error: code = NotFound desc = could not find container \"fa3b9880cfb534677a2df60181ae919b83e22b729dce95f57f36448a747c8d37\": container with ID starting with fa3b9880cfb534677a2df60181ae919b83e22b729dce95f57f36448a747c8d37 not found: ID does not exist" Dec 05 13:26:41 crc kubenswrapper[4763]: I1205 13:26:41.173164 4763 scope.go:117] "RemoveContainer" containerID="e4f50d13b38c8b6df208316ff72de9c2885ca2a9c913a8088257475b9208a153" Dec 05 13:26:41 crc kubenswrapper[4763]: E1205 13:26:41.173390 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4f50d13b38c8b6df208316ff72de9c2885ca2a9c913a8088257475b9208a153\": container with ID starting with e4f50d13b38c8b6df208316ff72de9c2885ca2a9c913a8088257475b9208a153 not found: ID does not exist" containerID="e4f50d13b38c8b6df208316ff72de9c2885ca2a9c913a8088257475b9208a153" Dec 05 13:26:41 crc kubenswrapper[4763]: I1205 13:26:41.173416 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4f50d13b38c8b6df208316ff72de9c2885ca2a9c913a8088257475b9208a153"} err="failed to get container status \"e4f50d13b38c8b6df208316ff72de9c2885ca2a9c913a8088257475b9208a153\": rpc error: code = NotFound desc = could not find container \"e4f50d13b38c8b6df208316ff72de9c2885ca2a9c913a8088257475b9208a153\": container with ID starting with e4f50d13b38c8b6df208316ff72de9c2885ca2a9c913a8088257475b9208a153 not found: ID does not exist" Dec 05 13:26:41 crc kubenswrapper[4763]: I1205 13:26:41.795746 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="860d9191-22d9-44c1-b0ff-de810ebd1394" path="/var/lib/kubelet/pods/860d9191-22d9-44c1-b0ff-de810ebd1394/volumes" Dec 05 13:26:49 crc kubenswrapper[4763]: I1205 13:26:49.783739 4763 scope.go:117] "RemoveContainer" containerID="4fbfcf01e5e6efe52192419ee523afb90c3fb0f297785aa0b70f01b32f38e5b9" Dec 05 13:26:49 crc kubenswrapper[4763]: E1205 13:26:49.784437 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:27:01 crc kubenswrapper[4763]: I1205 13:27:01.784014 4763 scope.go:117] "RemoveContainer" containerID="4fbfcf01e5e6efe52192419ee523afb90c3fb0f297785aa0b70f01b32f38e5b9" 
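[Editor's note] The repeated "CrashLoopBackOff ... back-off 5m0s restarting failed container=machine-config-daemon" entries threaded through this journal show the kubelet retrying a failed container at its maximum restart backoff. As a rough illustration only — the constants below are assumptions based on the kubelet's documented behavior (roughly a 10s initial delay, doubling after each failed restart, capped at 5m, which matches the "back-off 5m0s" printed above), not kubelet source:

    package main

    import (
        "fmt"
        "time"
    )

    // backoffSchedule approximates the kubelet's crash-loop restart delays:
    // start at an initial delay, double after each failed restart, and cap
    // at a maximum. initialDelay and maxDelay are assumed values for this
    // sketch, chosen to match the "back-off 5m0s" seen in the log above.
    func backoffSchedule(restarts int) []time.Duration {
        const (
            initialDelay = 10 * time.Second
            maxDelay     = 5 * time.Minute
        )
        delays := make([]time.Duration, 0, restarts)
        d := initialDelay
        for i := 0; i < restarts; i++ {
            delays = append(delays, d)
            d *= 2
            if d > maxDelay {
                d = maxDelay
            }
        }
        return delays
    }

    func main() {
        // After a handful of crashes the delay pins at the cap. Note the
        // "Error syncing pod, skipping" line is re-logged on every pod sync
        // (every ~10-15s in this journal), while the container itself is not
        // restarted until the full backoff window elapses.
        fmt.Println(backoffSchedule(8)) // [10s 20s 40s 1m20s 2m40s 5m0s 5m0s 5m0s]
    }

In a situation like this one, a typical first step would be to read the previous container's logs, e.g. `oc logs --previous -n openshift-machine-config-operator machine-config-daemon-xpgln -c machine-config-daemon`.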
Dec 05 13:27:01 crc kubenswrapper[4763]: E1205 13:27:01.784851 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:27:14 crc kubenswrapper[4763]: I1205 13:27:14.784939 4763 scope.go:117] "RemoveContainer" containerID="4fbfcf01e5e6efe52192419ee523afb90c3fb0f297785aa0b70f01b32f38e5b9" Dec 05 13:27:14 crc kubenswrapper[4763]: E1205 13:27:14.785867 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:27:22 crc kubenswrapper[4763]: I1205 13:27:22.707011 4763 generic.go:334] "Generic (PLEG): container finished" podID="8a861dce-28f8-4a68-9215-3ec9dc2b0727" containerID="d06054db88ee6e91310ac385615ba56df10973b9e7f567c9da4f85cdcc0c2299" exitCode=0 Dec 05 13:27:22 crc kubenswrapper[4763]: I1205 13:27:22.707143 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hksp6/crc-debug-tkkr2" event={"ID":"8a861dce-28f8-4a68-9215-3ec9dc2b0727","Type":"ContainerDied","Data":"d06054db88ee6e91310ac385615ba56df10973b9e7f567c9da4f85cdcc0c2299"} Dec 05 13:27:23 crc kubenswrapper[4763]: I1205 13:27:23.869947 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hksp6/crc-debug-tkkr2" Dec 05 13:27:23 crc kubenswrapper[4763]: I1205 13:27:23.878744 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a861dce-28f8-4a68-9215-3ec9dc2b0727-host\") pod \"8a861dce-28f8-4a68-9215-3ec9dc2b0727\" (UID: \"8a861dce-28f8-4a68-9215-3ec9dc2b0727\") " Dec 05 13:27:23 crc kubenswrapper[4763]: I1205 13:27:23.879011 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a861dce-28f8-4a68-9215-3ec9dc2b0727-host" (OuterVolumeSpecName: "host") pod "8a861dce-28f8-4a68-9215-3ec9dc2b0727" (UID: "8a861dce-28f8-4a68-9215-3ec9dc2b0727"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 13:27:23 crc kubenswrapper[4763]: I1205 13:27:23.879066 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vsd6\" (UniqueName: \"kubernetes.io/projected/8a861dce-28f8-4a68-9215-3ec9dc2b0727-kube-api-access-2vsd6\") pod \"8a861dce-28f8-4a68-9215-3ec9dc2b0727\" (UID: \"8a861dce-28f8-4a68-9215-3ec9dc2b0727\") " Dec 05 13:27:23 crc kubenswrapper[4763]: I1205 13:27:23.879659 4763 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a861dce-28f8-4a68-9215-3ec9dc2b0727-host\") on node \"crc\" DevicePath \"\"" Dec 05 13:27:23 crc kubenswrapper[4763]: I1205 13:27:23.899246 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a861dce-28f8-4a68-9215-3ec9dc2b0727-kube-api-access-2vsd6" (OuterVolumeSpecName: "kube-api-access-2vsd6") pod "8a861dce-28f8-4a68-9215-3ec9dc2b0727" (UID: "8a861dce-28f8-4a68-9215-3ec9dc2b0727"). InnerVolumeSpecName "kube-api-access-2vsd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:27:23 crc kubenswrapper[4763]: I1205 13:27:23.913013 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hksp6/crc-debug-tkkr2"] Dec 05 13:27:23 crc kubenswrapper[4763]: I1205 13:27:23.925110 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hksp6/crc-debug-tkkr2"] Dec 05 13:27:23 crc kubenswrapper[4763]: I1205 13:27:23.981417 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vsd6\" (UniqueName: \"kubernetes.io/projected/8a861dce-28f8-4a68-9215-3ec9dc2b0727-kube-api-access-2vsd6\") on node \"crc\" DevicePath \"\"" Dec 05 13:27:24 crc kubenswrapper[4763]: I1205 13:27:24.764129 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="873c5089110f4f850e7668b42931f703c0570321c0fb44885f3fad63c2732a9a" Dec 05 13:27:24 crc kubenswrapper[4763]: I1205 13:27:24.764183 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hksp6/crc-debug-tkkr2" Dec 05 13:27:25 crc kubenswrapper[4763]: I1205 13:27:25.098121 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hksp6/crc-debug-stqn5"] Dec 05 13:27:25 crc kubenswrapper[4763]: E1205 13:27:25.098635 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a861dce-28f8-4a68-9215-3ec9dc2b0727" containerName="container-00" Dec 05 13:27:25 crc kubenswrapper[4763]: I1205 13:27:25.098654 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a861dce-28f8-4a68-9215-3ec9dc2b0727" containerName="container-00" Dec 05 13:27:25 crc kubenswrapper[4763]: E1205 13:27:25.098673 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="860d9191-22d9-44c1-b0ff-de810ebd1394" containerName="extract-utilities" Dec 05 13:27:25 crc kubenswrapper[4763]: I1205 13:27:25.098682 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="860d9191-22d9-44c1-b0ff-de810ebd1394" containerName="extract-utilities" Dec 05 13:27:25 crc kubenswrapper[4763]: E1205 13:27:25.098696 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="860d9191-22d9-44c1-b0ff-de810ebd1394" containerName="registry-server" Dec 05 13:27:25 crc kubenswrapper[4763]: I1205 13:27:25.098705 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="860d9191-22d9-44c1-b0ff-de810ebd1394" containerName="registry-server" Dec 05 13:27:25 crc kubenswrapper[4763]: E1205 13:27:25.098715 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="860d9191-22d9-44c1-b0ff-de810ebd1394" containerName="extract-content" Dec 05 13:27:25 crc kubenswrapper[4763]: I1205 13:27:25.098723 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="860d9191-22d9-44c1-b0ff-de810ebd1394" containerName="extract-content" Dec 05 13:27:25 crc kubenswrapper[4763]: I1205 13:27:25.099037 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a861dce-28f8-4a68-9215-3ec9dc2b0727" containerName="container-00" Dec 05 13:27:25 crc kubenswrapper[4763]: I1205 13:27:25.099065 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="860d9191-22d9-44c1-b0ff-de810ebd1394" containerName="registry-server" Dec 05 13:27:25 crc kubenswrapper[4763]: I1205 13:27:25.099870 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hksp6/crc-debug-stqn5" Dec 05 13:27:25 crc kubenswrapper[4763]: I1205 13:27:25.104839 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bb1ddc2b-588b-412b-9761-3bfce82ca0d3-host\") pod \"crc-debug-stqn5\" (UID: \"bb1ddc2b-588b-412b-9761-3bfce82ca0d3\") " pod="openshift-must-gather-hksp6/crc-debug-stqn5" Dec 05 13:27:25 crc kubenswrapper[4763]: I1205 13:27:25.104908 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwgwv\" (UniqueName: \"kubernetes.io/projected/bb1ddc2b-588b-412b-9761-3bfce82ca0d3-kube-api-access-hwgwv\") pod \"crc-debug-stqn5\" (UID: \"bb1ddc2b-588b-412b-9761-3bfce82ca0d3\") " pod="openshift-must-gather-hksp6/crc-debug-stqn5" Dec 05 13:27:25 crc kubenswrapper[4763]: I1205 13:27:25.206712 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bb1ddc2b-588b-412b-9761-3bfce82ca0d3-host\") pod \"crc-debug-stqn5\" (UID: \"bb1ddc2b-588b-412b-9761-3bfce82ca0d3\") " pod="openshift-must-gather-hksp6/crc-debug-stqn5" Dec 05 13:27:25 crc kubenswrapper[4763]: I1205 13:27:25.206774 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwgwv\" (UniqueName: \"kubernetes.io/projected/bb1ddc2b-588b-412b-9761-3bfce82ca0d3-kube-api-access-hwgwv\") pod \"crc-debug-stqn5\" (UID: \"bb1ddc2b-588b-412b-9761-3bfce82ca0d3\") " pod="openshift-must-gather-hksp6/crc-debug-stqn5" Dec 05 13:27:25 crc kubenswrapper[4763]: I1205 13:27:25.208254 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bb1ddc2b-588b-412b-9761-3bfce82ca0d3-host\") pod \"crc-debug-stqn5\" (UID: \"bb1ddc2b-588b-412b-9761-3bfce82ca0d3\") " pod="openshift-must-gather-hksp6/crc-debug-stqn5" Dec 05 13:27:25 crc kubenswrapper[4763]: I1205 13:27:25.232301 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwgwv\" (UniqueName: \"kubernetes.io/projected/bb1ddc2b-588b-412b-9761-3bfce82ca0d3-kube-api-access-hwgwv\") pod \"crc-debug-stqn5\" (UID: \"bb1ddc2b-588b-412b-9761-3bfce82ca0d3\") " pod="openshift-must-gather-hksp6/crc-debug-stqn5" Dec 05 13:27:25 crc kubenswrapper[4763]: I1205 13:27:25.424660 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hksp6/crc-debug-stqn5" Dec 05 13:27:25 crc kubenswrapper[4763]: W1205 13:27:25.462121 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb1ddc2b_588b_412b_9761_3bfce82ca0d3.slice/crio-ab973981a52191bca43ff558572fcaff81719758a0e42c419a027b7471c27f1f WatchSource:0}: Error finding container ab973981a52191bca43ff558572fcaff81719758a0e42c419a027b7471c27f1f: Status 404 returned error can't find the container with id ab973981a52191bca43ff558572fcaff81719758a0e42c419a027b7471c27f1f Dec 05 13:27:25 crc kubenswrapper[4763]: I1205 13:27:25.776319 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hksp6/crc-debug-stqn5" event={"ID":"bb1ddc2b-588b-412b-9761-3bfce82ca0d3","Type":"ContainerStarted","Data":"69cc92063efa3216ef9d633331d008ac9a9b441475d3ba941292c828e1948c6b"} Dec 05 13:27:25 crc kubenswrapper[4763]: I1205 13:27:25.776655 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hksp6/crc-debug-stqn5" event={"ID":"bb1ddc2b-588b-412b-9761-3bfce82ca0d3","Type":"ContainerStarted","Data":"ab973981a52191bca43ff558572fcaff81719758a0e42c419a027b7471c27f1f"} Dec 05 13:27:25 crc kubenswrapper[4763]: I1205 13:27:25.798405 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hksp6/crc-debug-stqn5" podStartSLOduration=0.798390203 podStartE2EDuration="798.390203ms" podCreationTimestamp="2025-12-05 13:27:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:27:25.797123929 +0000 UTC m=+5930.289838692" watchObservedRunningTime="2025-12-05 13:27:25.798390203 +0000 UTC m=+5930.291104926" Dec 05 13:27:25 crc kubenswrapper[4763]: I1205 13:27:25.821701 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a861dce-28f8-4a68-9215-3ec9dc2b0727" path="/var/lib/kubelet/pods/8a861dce-28f8-4a68-9215-3ec9dc2b0727/volumes" Dec 05 13:27:26 crc kubenswrapper[4763]: I1205 13:27:26.804482 4763 generic.go:334] "Generic (PLEG): container finished" podID="bb1ddc2b-588b-412b-9761-3bfce82ca0d3" containerID="69cc92063efa3216ef9d633331d008ac9a9b441475d3ba941292c828e1948c6b" exitCode=0 Dec 05 13:27:26 crc kubenswrapper[4763]: I1205 13:27:26.804716 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hksp6/crc-debug-stqn5" event={"ID":"bb1ddc2b-588b-412b-9761-3bfce82ca0d3","Type":"ContainerDied","Data":"69cc92063efa3216ef9d633331d008ac9a9b441475d3ba941292c828e1948c6b"} Dec 05 13:27:27 crc kubenswrapper[4763]: I1205 13:27:27.948600 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hksp6/crc-debug-stqn5" Dec 05 13:27:28 crc kubenswrapper[4763]: I1205 13:27:28.053335 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bb1ddc2b-588b-412b-9761-3bfce82ca0d3-host\") pod \"bb1ddc2b-588b-412b-9761-3bfce82ca0d3\" (UID: \"bb1ddc2b-588b-412b-9761-3bfce82ca0d3\") " Dec 05 13:27:28 crc kubenswrapper[4763]: I1205 13:27:28.053426 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwgwv\" (UniqueName: \"kubernetes.io/projected/bb1ddc2b-588b-412b-9761-3bfce82ca0d3-kube-api-access-hwgwv\") pod \"bb1ddc2b-588b-412b-9761-3bfce82ca0d3\" (UID: \"bb1ddc2b-588b-412b-9761-3bfce82ca0d3\") " Dec 05 13:27:28 crc kubenswrapper[4763]: I1205 13:27:28.053470 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bb1ddc2b-588b-412b-9761-3bfce82ca0d3-host" (OuterVolumeSpecName: "host") pod "bb1ddc2b-588b-412b-9761-3bfce82ca0d3" (UID: "bb1ddc2b-588b-412b-9761-3bfce82ca0d3"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 13:27:28 crc kubenswrapper[4763]: I1205 13:27:28.053869 4763 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bb1ddc2b-588b-412b-9761-3bfce82ca0d3-host\") on node \"crc\" DevicePath \"\"" Dec 05 13:27:28 crc kubenswrapper[4763]: I1205 13:27:28.060382 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb1ddc2b-588b-412b-9761-3bfce82ca0d3-kube-api-access-hwgwv" (OuterVolumeSpecName: "kube-api-access-hwgwv") pod "bb1ddc2b-588b-412b-9761-3bfce82ca0d3" (UID: "bb1ddc2b-588b-412b-9761-3bfce82ca0d3"). InnerVolumeSpecName "kube-api-access-hwgwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:27:28 crc kubenswrapper[4763]: I1205 13:27:28.155333 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwgwv\" (UniqueName: \"kubernetes.io/projected/bb1ddc2b-588b-412b-9761-3bfce82ca0d3-kube-api-access-hwgwv\") on node \"crc\" DevicePath \"\"" Dec 05 13:27:28 crc kubenswrapper[4763]: I1205 13:27:28.202322 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hksp6/crc-debug-stqn5"] Dec 05 13:27:28 crc kubenswrapper[4763]: I1205 13:27:28.214088 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hksp6/crc-debug-stqn5"] Dec 05 13:27:28 crc kubenswrapper[4763]: I1205 13:27:28.784309 4763 scope.go:117] "RemoveContainer" containerID="4fbfcf01e5e6efe52192419ee523afb90c3fb0f297785aa0b70f01b32f38e5b9" Dec 05 13:27:28 crc kubenswrapper[4763]: E1205 13:27:28.785181 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:27:28 crc kubenswrapper[4763]: I1205 13:27:28.825562 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab973981a52191bca43ff558572fcaff81719758a0e42c419a027b7471c27f1f" Dec 05 13:27:28 crc kubenswrapper[4763]: I1205 13:27:28.825616 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hksp6/crc-debug-stqn5" Dec 05 13:27:29 crc kubenswrapper[4763]: I1205 13:27:29.407026 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hksp6/crc-debug-gj476"] Dec 05 13:27:29 crc kubenswrapper[4763]: E1205 13:27:29.407462 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb1ddc2b-588b-412b-9761-3bfce82ca0d3" containerName="container-00" Dec 05 13:27:29 crc kubenswrapper[4763]: I1205 13:27:29.407478 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb1ddc2b-588b-412b-9761-3bfce82ca0d3" containerName="container-00" Dec 05 13:27:29 crc kubenswrapper[4763]: I1205 13:27:29.407717 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb1ddc2b-588b-412b-9761-3bfce82ca0d3" containerName="container-00" Dec 05 13:27:29 crc kubenswrapper[4763]: I1205 13:27:29.408344 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hksp6/crc-debug-gj476" Dec 05 13:27:29 crc kubenswrapper[4763]: I1205 13:27:29.579719 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/39d00354-d26f-40e1-8426-97fc5cb77cd1-host\") pod \"crc-debug-gj476\" (UID: \"39d00354-d26f-40e1-8426-97fc5cb77cd1\") " pod="openshift-must-gather-hksp6/crc-debug-gj476" Dec 05 13:27:29 crc kubenswrapper[4763]: I1205 13:27:29.580260 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nmgq\" (UniqueName: \"kubernetes.io/projected/39d00354-d26f-40e1-8426-97fc5cb77cd1-kube-api-access-6nmgq\") pod \"crc-debug-gj476\" (UID: \"39d00354-d26f-40e1-8426-97fc5cb77cd1\") " pod="openshift-must-gather-hksp6/crc-debug-gj476" Dec 05 13:27:29 crc kubenswrapper[4763]: I1205 13:27:29.682791 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/39d00354-d26f-40e1-8426-97fc5cb77cd1-host\") pod \"crc-debug-gj476\" (UID: \"39d00354-d26f-40e1-8426-97fc5cb77cd1\") " pod="openshift-must-gather-hksp6/crc-debug-gj476" Dec 05 13:27:29 crc kubenswrapper[4763]: I1205 13:27:29.682941 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/39d00354-d26f-40e1-8426-97fc5cb77cd1-host\") pod \"crc-debug-gj476\" (UID: \"39d00354-d26f-40e1-8426-97fc5cb77cd1\") " pod="openshift-must-gather-hksp6/crc-debug-gj476" Dec 05 13:27:29 crc kubenswrapper[4763]: I1205 13:27:29.682960 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nmgq\" (UniqueName: \"kubernetes.io/projected/39d00354-d26f-40e1-8426-97fc5cb77cd1-kube-api-access-6nmgq\") pod \"crc-debug-gj476\" (UID: \"39d00354-d26f-40e1-8426-97fc5cb77cd1\") " pod="openshift-must-gather-hksp6/crc-debug-gj476" Dec 05 13:27:29 crc kubenswrapper[4763]: I1205 13:27:29.707809 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nmgq\" (UniqueName: \"kubernetes.io/projected/39d00354-d26f-40e1-8426-97fc5cb77cd1-kube-api-access-6nmgq\") pod \"crc-debug-gj476\" (UID: \"39d00354-d26f-40e1-8426-97fc5cb77cd1\") " pod="openshift-must-gather-hksp6/crc-debug-gj476" Dec 05 13:27:29 crc kubenswrapper[4763]: I1205 13:27:29.729959 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hksp6/crc-debug-gj476" Dec 05 13:27:29 crc kubenswrapper[4763]: W1205 13:27:29.773098 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39d00354_d26f_40e1_8426_97fc5cb77cd1.slice/crio-bc21ae39e9244bcb54504efa6c768ab9a5a1a6de848f3b761474c9f1196e5b80 WatchSource:0}: Error finding container bc21ae39e9244bcb54504efa6c768ab9a5a1a6de848f3b761474c9f1196e5b80: Status 404 returned error can't find the container with id bc21ae39e9244bcb54504efa6c768ab9a5a1a6de848f3b761474c9f1196e5b80 Dec 05 13:27:29 crc kubenswrapper[4763]: I1205 13:27:29.811798 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb1ddc2b-588b-412b-9761-3bfce82ca0d3" path="/var/lib/kubelet/pods/bb1ddc2b-588b-412b-9761-3bfce82ca0d3/volumes" Dec 05 13:27:29 crc kubenswrapper[4763]: I1205 13:27:29.837665 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hksp6/crc-debug-gj476" event={"ID":"39d00354-d26f-40e1-8426-97fc5cb77cd1","Type":"ContainerStarted","Data":"bc21ae39e9244bcb54504efa6c768ab9a5a1a6de848f3b761474c9f1196e5b80"} Dec 05 13:27:30 crc kubenswrapper[4763]: I1205 13:27:30.850630 4763 generic.go:334] "Generic (PLEG): container finished" podID="39d00354-d26f-40e1-8426-97fc5cb77cd1" containerID="42cca8ad201d6736a373c46b83fcbf9062090283e8d697f0d72de928838943ce" exitCode=0 Dec 05 13:27:30 crc kubenswrapper[4763]: I1205 13:27:30.851054 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hksp6/crc-debug-gj476" event={"ID":"39d00354-d26f-40e1-8426-97fc5cb77cd1","Type":"ContainerDied","Data":"42cca8ad201d6736a373c46b83fcbf9062090283e8d697f0d72de928838943ce"} Dec 05 13:27:30 crc kubenswrapper[4763]: I1205 13:27:30.914748 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hksp6/crc-debug-gj476"] Dec 05 13:27:30 crc kubenswrapper[4763]: I1205 13:27:30.925334 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hksp6/crc-debug-gj476"] Dec 05 13:27:31 crc kubenswrapper[4763]: I1205 13:27:31.978061 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hksp6/crc-debug-gj476" Dec 05 13:27:32 crc kubenswrapper[4763]: I1205 13:27:32.143482 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nmgq\" (UniqueName: \"kubernetes.io/projected/39d00354-d26f-40e1-8426-97fc5cb77cd1-kube-api-access-6nmgq\") pod \"39d00354-d26f-40e1-8426-97fc5cb77cd1\" (UID: \"39d00354-d26f-40e1-8426-97fc5cb77cd1\") " Dec 05 13:27:32 crc kubenswrapper[4763]: I1205 13:27:32.143624 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/39d00354-d26f-40e1-8426-97fc5cb77cd1-host\") pod \"39d00354-d26f-40e1-8426-97fc5cb77cd1\" (UID: \"39d00354-d26f-40e1-8426-97fc5cb77cd1\") " Dec 05 13:27:32 crc kubenswrapper[4763]: I1205 13:27:32.144416 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/39d00354-d26f-40e1-8426-97fc5cb77cd1-host" (OuterVolumeSpecName: "host") pod "39d00354-d26f-40e1-8426-97fc5cb77cd1" (UID: "39d00354-d26f-40e1-8426-97fc5cb77cd1"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 13:27:32 crc kubenswrapper[4763]: I1205 13:27:32.158972 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39d00354-d26f-40e1-8426-97fc5cb77cd1-kube-api-access-6nmgq" (OuterVolumeSpecName: "kube-api-access-6nmgq") pod "39d00354-d26f-40e1-8426-97fc5cb77cd1" (UID: "39d00354-d26f-40e1-8426-97fc5cb77cd1"). InnerVolumeSpecName "kube-api-access-6nmgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:27:32 crc kubenswrapper[4763]: I1205 13:27:32.245735 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nmgq\" (UniqueName: \"kubernetes.io/projected/39d00354-d26f-40e1-8426-97fc5cb77cd1-kube-api-access-6nmgq\") on node \"crc\" DevicePath \"\"" Dec 05 13:27:32 crc kubenswrapper[4763]: I1205 13:27:32.246016 4763 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/39d00354-d26f-40e1-8426-97fc5cb77cd1-host\") on node \"crc\" DevicePath \"\"" Dec 05 13:27:32 crc kubenswrapper[4763]: I1205 13:27:32.875487 4763 scope.go:117] "RemoveContainer" containerID="42cca8ad201d6736a373c46b83fcbf9062090283e8d697f0d72de928838943ce" Dec 05 13:27:32 crc kubenswrapper[4763]: I1205 13:27:32.875718 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hksp6/crc-debug-gj476" Dec 05 13:27:33 crc kubenswrapper[4763]: I1205 13:27:33.795738 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39d00354-d26f-40e1-8426-97fc5cb77cd1" path="/var/lib/kubelet/pods/39d00354-d26f-40e1-8426-97fc5cb77cd1/volumes" Dec 05 13:27:43 crc kubenswrapper[4763]: I1205 13:27:43.784646 4763 scope.go:117] "RemoveContainer" containerID="4fbfcf01e5e6efe52192419ee523afb90c3fb0f297785aa0b70f01b32f38e5b9" Dec 05 13:27:43 crc kubenswrapper[4763]: E1205 13:27:43.785295 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:27:58 crc kubenswrapper[4763]: I1205 13:27:58.783788 4763 scope.go:117] "RemoveContainer" containerID="4fbfcf01e5e6efe52192419ee523afb90c3fb0f297785aa0b70f01b32f38e5b9" Dec 05 13:27:58 crc kubenswrapper[4763]: E1205 13:27:58.784638 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:27:58 crc kubenswrapper[4763]: I1205 13:27:58.961362 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7f846788f8-4gznp_80c384c8-1d13-47af-b978-f724e40e99af/barbican-api/0.log" Dec 05 13:27:59 crc kubenswrapper[4763]: I1205 13:27:59.298394 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7f846788f8-4gznp_80c384c8-1d13-47af-b978-f724e40e99af/barbican-api-log/0.log" Dec 05 13:27:59 crc kubenswrapper[4763]: I1205 13:27:59.406293 4763 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-94d865894-tqt5m_76cf4acb-9763-4dac-9a2f-eba4a98314f0/barbican-keystone-listener/0.log" Dec 05 13:27:59 crc kubenswrapper[4763]: I1205 13:27:59.464198 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-94d865894-tqt5m_76cf4acb-9763-4dac-9a2f-eba4a98314f0/barbican-keystone-listener-log/0.log" Dec 05 13:27:59 crc kubenswrapper[4763]: I1205 13:27:59.548678 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5446b6d8dc-p784q_d23af5c6-295f-4c65-90a1-02e66a41f325/barbican-worker/0.log" Dec 05 13:27:59 crc kubenswrapper[4763]: I1205 13:27:59.582766 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5446b6d8dc-p784q_d23af5c6-295f-4c65-90a1-02e66a41f325/barbican-worker-log/0.log" Dec 05 13:27:59 crc kubenswrapper[4763]: I1205 13:27:59.749522 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-f5cp2_a28052ee-43d2-4618-a981-ef115a2c3a00/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 13:27:59 crc kubenswrapper[4763]: I1205 13:27:59.800965 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_948c2855-16a9-47e2-96a4-70fe90181d9e/ceilometer-central-agent/0.log" Dec 05 13:27:59 crc kubenswrapper[4763]: I1205 13:27:59.884542 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_948c2855-16a9-47e2-96a4-70fe90181d9e/ceilometer-notification-agent/0.log" Dec 05 13:27:59 crc kubenswrapper[4763]: I1205 13:27:59.935573 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_948c2855-16a9-47e2-96a4-70fe90181d9e/proxy-httpd/0.log" Dec 05 13:27:59 crc kubenswrapper[4763]: I1205 13:27:59.998364 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_948c2855-16a9-47e2-96a4-70fe90181d9e/sg-core/0.log" Dec 05 13:28:00 crc kubenswrapper[4763]: I1205 13:28:00.148849 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_4027bf13-4c83-4281-8a0d-d18c6032e0af/cinder-api-log/0.log" Dec 05 13:28:00 crc kubenswrapper[4763]: I1205 13:28:00.208136 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_4027bf13-4c83-4281-8a0d-d18c6032e0af/cinder-api/0.log" Dec 05 13:28:00 crc kubenswrapper[4763]: I1205 13:28:00.361583 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_fd14a5e3-8def-4bc7-b375-8ae87dd75838/cinder-scheduler/0.log" Dec 05 13:28:00 crc kubenswrapper[4763]: I1205 13:28:00.420201 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_fd14a5e3-8def-4bc7-b375-8ae87dd75838/probe/0.log" Dec 05 13:28:00 crc kubenswrapper[4763]: I1205 13:28:00.475971 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-rl4hr_e3bafcb4-8ef9-4670-8202-f5c61d6d4c33/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 13:28:00 crc kubenswrapper[4763]: I1205 13:28:00.657653 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-gzt69_9169edb0-a8a3-4953-8472-6e496fced2e6/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 13:28:00 crc kubenswrapper[4763]: I1205 13:28:00.699647 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-6b6dc74c5-qtlzv_dd604323-58e1-439a-b0a4-66ad626de5a6/init/0.log" Dec 05 13:28:00 crc kubenswrapper[4763]: I1205 13:28:00.871098 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b6dc74c5-qtlzv_dd604323-58e1-439a-b0a4-66ad626de5a6/init/0.log" Dec 05 13:28:00 crc kubenswrapper[4763]: I1205 13:28:00.916278 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-qlwfh_8c7e581d-5684-4557-96f8-5502a00e1da1/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 13:28:01 crc kubenswrapper[4763]: I1205 13:28:01.001546 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b6dc74c5-qtlzv_dd604323-58e1-439a-b0a4-66ad626de5a6/dnsmasq-dns/0.log" Dec 05 13:28:01 crc kubenswrapper[4763]: I1205 13:28:01.193977 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_29573883-0e6d-40b3-9a6f-39308d6db246/glance-log/0.log" Dec 05 13:28:01 crc kubenswrapper[4763]: I1205 13:28:01.197689 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_29573883-0e6d-40b3-9a6f-39308d6db246/glance-httpd/0.log" Dec 05 13:28:01 crc kubenswrapper[4763]: I1205 13:28:01.331699 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_cdedbb4d-c325-420f-946f-942359580cfe/glance-httpd/0.log" Dec 05 13:28:01 crc kubenswrapper[4763]: I1205 13:28:01.408911 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_cdedbb4d-c325-420f-946f-942359580cfe/glance-log/0.log" Dec 05 13:28:01 crc kubenswrapper[4763]: I1205 13:28:01.631543 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-77dcd5c496-hs7bj_b34428a2-5423-401a-b7d3-aebd1d070945/horizon/0.log" Dec 05 13:28:01 crc kubenswrapper[4763]: I1205 13:28:01.737118 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-7lnl9_5d234676-63b0-4c1c-804f-93d938e0ed84/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 13:28:01 crc kubenswrapper[4763]: I1205 13:28:01.909740 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-k85hd_4b7e81d2-79b9-4368-bb1f-fa3fda3f5f0b/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 13:28:02 crc kubenswrapper[4763]: I1205 13:28:02.159403 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-77dcd5c496-hs7bj_b34428a2-5423-401a-b7d3-aebd1d070945/horizon-log/0.log" Dec 05 13:28:02 crc kubenswrapper[4763]: I1205 13:28:02.256924 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29415661-qdvsb_9c1231bc-00fe-4fb3-9fb7-7121743e17c9/keystone-cron/0.log" Dec 05 13:28:02 crc kubenswrapper[4763]: I1205 13:28:02.485229 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_4521fb51-39ad-4717-8239-8d2a759d4a30/kube-state-metrics/0.log" Dec 05 13:28:02 crc kubenswrapper[4763]: I1205 13:28:02.523740 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7fc67b9475-mqldq_2d728472-3cda-480b-b5dc-065969434f7d/keystone-api/0.log" Dec 05 13:28:02 crc kubenswrapper[4763]: I1205 13:28:02.681346 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-cr9r4_4355ed47-63c1-47e1-81e6-33d33f89b5a7/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 13:28:03 crc kubenswrapper[4763]: I1205 13:28:03.065900 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-nmkr5_d790dbae-6bb4-4b37-b9bd-0ba454c8fa83/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 13:28:03 crc kubenswrapper[4763]: I1205 13:28:03.108910 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-b55c974d9-brgnw_6d6c980e-688d-41b3-a7ad-0061b07b9494/neutron-httpd/0.log" Dec 05 13:28:03 crc kubenswrapper[4763]: I1205 13:28:03.143221 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-b55c974d9-brgnw_6d6c980e-688d-41b3-a7ad-0061b07b9494/neutron-api/0.log" Dec 05 13:28:03 crc kubenswrapper[4763]: I1205 13:28:03.838529 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_4319bf0f-65c6-401b-96dd-53e10a73c011/nova-cell0-conductor-conductor/0.log" Dec 05 13:28:04 crc kubenswrapper[4763]: I1205 13:28:04.161037 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_0f31e12e-2f94-40a3-a522-1aa44cb1cdbf/nova-cell1-conductor-conductor/0.log" Dec 05 13:28:04 crc kubenswrapper[4763]: I1205 13:28:04.429456 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_c3577198-f0dd-4145-a9c5-d29a0d18d212/nova-cell1-novncproxy-novncproxy/0.log" Dec 05 13:28:04 crc kubenswrapper[4763]: I1205 13:28:04.443204 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2ad0e748-bb1a-4b4f-bc70-f059e4fc3614/nova-api-log/0.log" Dec 05 13:28:04 crc kubenswrapper[4763]: I1205 13:28:04.648054 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-mrrcd_e3335529-4636-46d2-b949-1d02a4c43ee0/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 13:28:04 crc kubenswrapper[4763]: I1205 13:28:04.733103 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_cb8fec18-c1b6-47de-91cc-7ef68caceb0e/nova-metadata-log/0.log" Dec 05 13:28:05 crc kubenswrapper[4763]: I1205 13:28:05.109885 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2ad0e748-bb1a-4b4f-bc70-f059e4fc3614/nova-api-api/0.log" Dec 05 13:28:05 crc kubenswrapper[4763]: I1205 13:28:05.330573 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a066fad3-20a3-41d6-852d-7196f8445e2a/mysql-bootstrap/0.log" Dec 05 13:28:05 crc kubenswrapper[4763]: I1205 13:28:05.349188 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_e011a0f4-fec9-4c12-a229-2e63ef03037d/nova-scheduler-scheduler/0.log" Dec 05 13:28:05 crc kubenswrapper[4763]: I1205 13:28:05.522895 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a066fad3-20a3-41d6-852d-7196f8445e2a/mysql-bootstrap/0.log" Dec 05 13:28:05 crc kubenswrapper[4763]: I1205 13:28:05.580235 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a066fad3-20a3-41d6-852d-7196f8445e2a/galera/0.log" Dec 05 13:28:05 crc kubenswrapper[4763]: I1205 13:28:05.753234 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_d5f22311-2f88-40cb-a35d-e0609433db1a/mysql-bootstrap/0.log" Dec 05 13:28:05 crc kubenswrapper[4763]: I1205 13:28:05.931582 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d5f22311-2f88-40cb-a35d-e0609433db1a/galera/0.log" Dec 05 13:28:05 crc kubenswrapper[4763]: I1205 13:28:05.952113 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d5f22311-2f88-40cb-a35d-e0609433db1a/mysql-bootstrap/0.log" Dec 05 13:28:06 crc kubenswrapper[4763]: I1205 13:28:06.113629 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_9e894c53-51db-4ede-9730-b8c68ad6fc15/openstackclient/0.log" Dec 05 13:28:06 crc kubenswrapper[4763]: I1205 13:28:06.220109 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-6gw4w_c9acbf99-ec01-4de6-9d45-418664511586/ovn-controller/0.log" Dec 05 13:28:06 crc kubenswrapper[4763]: I1205 13:28:06.433267 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-87zsg_08357e6b-d21e-4b50-8af2-22ddc7398fbc/openstack-network-exporter/0.log" Dec 05 13:28:06 crc kubenswrapper[4763]: I1205 13:28:06.584521 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dzkm7_80ec3b73-a380-499b-b4d1-054a6b2ab4a6/ovsdb-server-init/0.log" Dec 05 13:28:06 crc kubenswrapper[4763]: I1205 13:28:06.811818 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dzkm7_80ec3b73-a380-499b-b4d1-054a6b2ab4a6/ovsdb-server-init/0.log" Dec 05 13:28:06 crc kubenswrapper[4763]: I1205 13:28:06.835522 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dzkm7_80ec3b73-a380-499b-b4d1-054a6b2ab4a6/ovs-vswitchd/0.log" Dec 05 13:28:06 crc kubenswrapper[4763]: I1205 13:28:06.844502 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dzkm7_80ec3b73-a380-499b-b4d1-054a6b2ab4a6/ovsdb-server/0.log" Dec 05 13:28:07 crc kubenswrapper[4763]: I1205 13:28:07.105078 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-sl4g2_688c0399-83be-44e3-adc0-4288525a9f4b/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 13:28:07 crc kubenswrapper[4763]: I1205 13:28:07.287994 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c/openstack-network-exporter/0.log" Dec 05 13:28:07 crc kubenswrapper[4763]: I1205 13:28:07.292310 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_f0842eda-d10a-4bb6-9f7d-a2d83fd62e2c/ovn-northd/0.log" Dec 05 13:28:07 crc kubenswrapper[4763]: I1205 13:28:07.297938 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_cb8fec18-c1b6-47de-91cc-7ef68caceb0e/nova-metadata-metadata/0.log" Dec 05 13:28:07 crc kubenswrapper[4763]: I1205 13:28:07.458290 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9680c542-fe6f-42cb-b48d-e17b80916e50/openstack-network-exporter/0.log" Dec 05 13:28:07 crc kubenswrapper[4763]: I1205 13:28:07.505953 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9680c542-fe6f-42cb-b48d-e17b80916e50/ovsdbserver-nb/0.log" Dec 05 13:28:07 crc kubenswrapper[4763]: I1205 13:28:07.573748 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-0_a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1/openstack-network-exporter/0.log" Dec 05 13:28:07 crc kubenswrapper[4763]: I1205 13:28:07.693618 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_a8ef0812-ab9d-4b43-8eb2-0543a4c6c8a1/ovsdbserver-sb/0.log" Dec 05 13:28:07 crc kubenswrapper[4763]: I1205 13:28:07.941477 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_aac1b695-9685-4f3f-bc5d-d1262bb44992/init-config-reloader/0.log" Dec 05 13:28:08 crc kubenswrapper[4763]: I1205 13:28:08.026216 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-76d66464d-r24j6_7e5e41f7-ee3c-4587-bae0-5716c12c84b6/placement-api/0.log" Dec 05 13:28:08 crc kubenswrapper[4763]: I1205 13:28:08.065052 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-76d66464d-r24j6_7e5e41f7-ee3c-4587-bae0-5716c12c84b6/placement-log/0.log" Dec 05 13:28:08 crc kubenswrapper[4763]: I1205 13:28:08.241391 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_aac1b695-9685-4f3f-bc5d-d1262bb44992/init-config-reloader/0.log" Dec 05 13:28:08 crc kubenswrapper[4763]: I1205 13:28:08.272066 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_aac1b695-9685-4f3f-bc5d-d1262bb44992/config-reloader/0.log" Dec 05 13:28:08 crc kubenswrapper[4763]: I1205 13:28:08.276036 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_aac1b695-9685-4f3f-bc5d-d1262bb44992/thanos-sidecar/0.log" Dec 05 13:28:08 crc kubenswrapper[4763]: I1205 13:28:08.360925 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_aac1b695-9685-4f3f-bc5d-d1262bb44992/prometheus/0.log" Dec 05 13:28:08 crc kubenswrapper[4763]: I1205 13:28:08.505542 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_0c59087a-448f-41c2-a85b-6ccd0ddbecc1/setup-container/0.log" Dec 05 13:28:08 crc kubenswrapper[4763]: I1205 13:28:08.769821 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_0c59087a-448f-41c2-a85b-6ccd0ddbecc1/setup-container/0.log" Dec 05 13:28:08 crc kubenswrapper[4763]: I1205 13:28:08.803111 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_0c59087a-448f-41c2-a85b-6ccd0ddbecc1/rabbitmq/0.log" Dec 05 13:28:08 crc kubenswrapper[4763]: I1205 13:28:08.867063 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c2041e23-d29c-4a1a-9787-aa0e19c9f764/setup-container/0.log" Dec 05 13:28:09 crc kubenswrapper[4763]: I1205 13:28:09.069783 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c2041e23-d29c-4a1a-9787-aa0e19c9f764/setup-container/0.log" Dec 05 13:28:09 crc kubenswrapper[4763]: I1205 13:28:09.092088 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c2041e23-d29c-4a1a-9787-aa0e19c9f764/rabbitmq/0.log" Dec 05 13:28:09 crc kubenswrapper[4763]: I1205 13:28:09.155342 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-mfww4_576fa469-1138-4580-b637-66ec5a5e101e/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 13:28:09 crc kubenswrapper[4763]: I1205 13:28:09.386438 4763 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-56jk4_783891d5-537e-4f2f-b3ee-326588a913f6/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 13:28:09 crc kubenswrapper[4763]: I1205 13:28:09.396488 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-4ghbz_fd34c478-732e-49a0-ab4a-c35fdf054b3c/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 13:28:09 crc kubenswrapper[4763]: I1205 13:28:09.784664 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-7r8pv_b73d98c2-daca-4632-9c2e-1ab408ec4ac5/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 13:28:09 crc kubenswrapper[4763]: I1205 13:28:09.793656 4763 scope.go:117] "RemoveContainer" containerID="4fbfcf01e5e6efe52192419ee523afb90c3fb0f297785aa0b70f01b32f38e5b9" Dec 05 13:28:09 crc kubenswrapper[4763]: E1205 13:28:09.793929 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:28:09 crc kubenswrapper[4763]: I1205 13:28:09.929002 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-gtkd8_23d776eb-9b6f-439e-8938-2aea4708e154/ssh-known-hosts-edpm-deployment/0.log" Dec 05 13:28:10 crc kubenswrapper[4763]: I1205 13:28:10.048126 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5c4499b47f-s4mh4_fe2a82f8-601f-42ea-a495-4d1a03084267/proxy-server/0.log" Dec 05 13:28:10 crc kubenswrapper[4763]: I1205 13:28:10.205498 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5c4499b47f-s4mh4_fe2a82f8-601f-42ea-a495-4d1a03084267/proxy-httpd/0.log" Dec 05 13:28:10 crc kubenswrapper[4763]: I1205 13:28:10.286781 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-pzxb5_a38e41f6-6247-4c91-abba-0bc65d1c2127/swift-ring-rebalance/0.log" Dec 05 13:28:10 crc kubenswrapper[4763]: I1205 13:28:10.430655 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1851124e-2722-4628-8e5b-63edb828d64a/account-reaper/0.log" Dec 05 13:28:10 crc kubenswrapper[4763]: I1205 13:28:10.438942 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1851124e-2722-4628-8e5b-63edb828d64a/account-auditor/0.log" Dec 05 13:28:10 crc kubenswrapper[4763]: I1205 13:28:10.553100 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1851124e-2722-4628-8e5b-63edb828d64a/account-replicator/0.log" Dec 05 13:28:10 crc kubenswrapper[4763]: I1205 13:28:10.627257 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1851124e-2722-4628-8e5b-63edb828d64a/account-server/0.log" Dec 05 13:28:10 crc kubenswrapper[4763]: I1205 13:28:10.721969 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1851124e-2722-4628-8e5b-63edb828d64a/container-auditor/0.log" Dec 05 13:28:10 crc kubenswrapper[4763]: I1205 13:28:10.743670 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_1851124e-2722-4628-8e5b-63edb828d64a/container-replicator/0.log" Dec 05 13:28:10 crc kubenswrapper[4763]: I1205 13:28:10.751649 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1851124e-2722-4628-8e5b-63edb828d64a/container-server/0.log" Dec 05 13:28:10 crc kubenswrapper[4763]: I1205 13:28:10.873832 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1851124e-2722-4628-8e5b-63edb828d64a/container-updater/0.log" Dec 05 13:28:10 crc kubenswrapper[4763]: I1205 13:28:10.974715 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1851124e-2722-4628-8e5b-63edb828d64a/object-replicator/0.log" Dec 05 13:28:11 crc kubenswrapper[4763]: I1205 13:28:11.001653 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1851124e-2722-4628-8e5b-63edb828d64a/object-expirer/0.log" Dec 05 13:28:11 crc kubenswrapper[4763]: I1205 13:28:11.018184 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1851124e-2722-4628-8e5b-63edb828d64a/object-auditor/0.log" Dec 05 13:28:11 crc kubenswrapper[4763]: I1205 13:28:11.146580 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1851124e-2722-4628-8e5b-63edb828d64a/object-server/0.log" Dec 05 13:28:11 crc kubenswrapper[4763]: I1205 13:28:11.169125 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1851124e-2722-4628-8e5b-63edb828d64a/object-updater/0.log" Dec 05 13:28:11 crc kubenswrapper[4763]: I1205 13:28:11.178544 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1851124e-2722-4628-8e5b-63edb828d64a/rsync/0.log" Dec 05 13:28:11 crc kubenswrapper[4763]: I1205 13:28:11.278842 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1851124e-2722-4628-8e5b-63edb828d64a/swift-recon-cron/0.log" Dec 05 13:28:11 crc kubenswrapper[4763]: I1205 13:28:11.442023 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-mk2sm_f5d27328-7e5a-4664-9c0a-ae5c063ec8b9/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 13:28:11 crc kubenswrapper[4763]: I1205 13:28:11.528969 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_295e994b-9be5-4486-beb7-6be00576c5c3/tempest-tests-tempest-tests-runner/0.log" Dec 05 13:28:11 crc kubenswrapper[4763]: I1205 13:28:11.692013 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_9a56e3c4-fdad-4c05-b4f4-9a155afc3239/test-operator-logs-container/0.log" Dec 05 13:28:11 crc kubenswrapper[4763]: I1205 13:28:11.719371 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-6d8v9_1ed4f328-73dd-4e34-91c4-b68898c59d74/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 13:28:12 crc kubenswrapper[4763]: I1205 13:28:12.556124 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_8bdc880b-2a69-4a8c-920f-6b0b04c7ecfe/watcher-applier/0.log" Dec 05 13:28:12 crc kubenswrapper[4763]: I1205 13:28:12.776357 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_37a4b06b-53bd-4f53-89b7-4d5a53554510/watcher-api-log/0.log" Dec 05 13:28:13 crc 
kubenswrapper[4763]: I1205 13:28:13.557586 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_819c6a72-3e2a-4445-8abf-1a10f8eaab9b/watcher-decision-engine/0.log" Dec 05 13:28:16 crc kubenswrapper[4763]: I1205 13:28:16.205278 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_37a4b06b-53bd-4f53-89b7-4d5a53554510/watcher-api/0.log" Dec 05 13:28:17 crc kubenswrapper[4763]: I1205 13:28:17.184899 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_cc4cab9d-2172-424c-88ca-962ec052d0c3/memcached/0.log" Dec 05 13:28:22 crc kubenswrapper[4763]: I1205 13:28:22.784172 4763 scope.go:117] "RemoveContainer" containerID="4fbfcf01e5e6efe52192419ee523afb90c3fb0f297785aa0b70f01b32f38e5b9" Dec 05 13:28:22 crc kubenswrapper[4763]: E1205 13:28:22.784942 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:28:34 crc kubenswrapper[4763]: I1205 13:28:34.783808 4763 scope.go:117] "RemoveContainer" containerID="4fbfcf01e5e6efe52192419ee523afb90c3fb0f297785aa0b70f01b32f38e5b9" Dec 05 13:28:34 crc kubenswrapper[4763]: E1205 13:28:34.784670 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:28:38 crc kubenswrapper[4763]: I1205 13:28:38.835020 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7zb56"] Dec 05 13:28:38 crc kubenswrapper[4763]: E1205 13:28:38.836699 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39d00354-d26f-40e1-8426-97fc5cb77cd1" containerName="container-00" Dec 05 13:28:38 crc kubenswrapper[4763]: I1205 13:28:38.836795 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="39d00354-d26f-40e1-8426-97fc5cb77cd1" containerName="container-00" Dec 05 13:28:38 crc kubenswrapper[4763]: I1205 13:28:38.837187 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="39d00354-d26f-40e1-8426-97fc5cb77cd1" containerName="container-00" Dec 05 13:28:38 crc kubenswrapper[4763]: I1205 13:28:38.852908 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7zb56" Dec 05 13:28:38 crc kubenswrapper[4763]: I1205 13:28:38.856227 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7zb56"] Dec 05 13:28:38 crc kubenswrapper[4763]: I1205 13:28:38.902974 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx_262113c6-3029-4c0d-8279-e1454a535c24/util/0.log" Dec 05 13:28:38 crc kubenswrapper[4763]: I1205 13:28:38.938033 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9a7bb7d-286a-4046-a017-371a0d0d4789-utilities\") pod \"redhat-marketplace-7zb56\" (UID: \"a9a7bb7d-286a-4046-a017-371a0d0d4789\") " pod="openshift-marketplace/redhat-marketplace-7zb56" Dec 05 13:28:38 crc kubenswrapper[4763]: I1205 13:28:38.938127 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9a7bb7d-286a-4046-a017-371a0d0d4789-catalog-content\") pod \"redhat-marketplace-7zb56\" (UID: \"a9a7bb7d-286a-4046-a017-371a0d0d4789\") " pod="openshift-marketplace/redhat-marketplace-7zb56" Dec 05 13:28:38 crc kubenswrapper[4763]: I1205 13:28:38.938199 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnblg\" (UniqueName: \"kubernetes.io/projected/a9a7bb7d-286a-4046-a017-371a0d0d4789-kube-api-access-jnblg\") pod \"redhat-marketplace-7zb56\" (UID: \"a9a7bb7d-286a-4046-a017-371a0d0d4789\") " pod="openshift-marketplace/redhat-marketplace-7zb56" Dec 05 13:28:39 crc kubenswrapper[4763]: I1205 13:28:39.039973 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9a7bb7d-286a-4046-a017-371a0d0d4789-catalog-content\") pod \"redhat-marketplace-7zb56\" (UID: \"a9a7bb7d-286a-4046-a017-371a0d0d4789\") " pod="openshift-marketplace/redhat-marketplace-7zb56" Dec 05 13:28:39 crc kubenswrapper[4763]: I1205 13:28:39.040083 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnblg\" (UniqueName: \"kubernetes.io/projected/a9a7bb7d-286a-4046-a017-371a0d0d4789-kube-api-access-jnblg\") pod \"redhat-marketplace-7zb56\" (UID: \"a9a7bb7d-286a-4046-a017-371a0d0d4789\") " pod="openshift-marketplace/redhat-marketplace-7zb56" Dec 05 13:28:39 crc kubenswrapper[4763]: I1205 13:28:39.040172 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9a7bb7d-286a-4046-a017-371a0d0d4789-utilities\") pod \"redhat-marketplace-7zb56\" (UID: \"a9a7bb7d-286a-4046-a017-371a0d0d4789\") " pod="openshift-marketplace/redhat-marketplace-7zb56" Dec 05 13:28:39 crc kubenswrapper[4763]: I1205 13:28:39.040464 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9a7bb7d-286a-4046-a017-371a0d0d4789-utilities\") pod \"redhat-marketplace-7zb56\" (UID: \"a9a7bb7d-286a-4046-a017-371a0d0d4789\") " pod="openshift-marketplace/redhat-marketplace-7zb56" Dec 05 13:28:39 crc kubenswrapper[4763]: I1205 13:28:39.040522 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a9a7bb7d-286a-4046-a017-371a0d0d4789-catalog-content\") pod \"redhat-marketplace-7zb56\" (UID: \"a9a7bb7d-286a-4046-a017-371a0d0d4789\") " pod="openshift-marketplace/redhat-marketplace-7zb56" Dec 05 13:28:39 crc kubenswrapper[4763]: I1205 13:28:39.072182 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnblg\" (UniqueName: \"kubernetes.io/projected/a9a7bb7d-286a-4046-a017-371a0d0d4789-kube-api-access-jnblg\") pod \"redhat-marketplace-7zb56\" (UID: \"a9a7bb7d-286a-4046-a017-371a0d0d4789\") " pod="openshift-marketplace/redhat-marketplace-7zb56" Dec 05 13:28:39 crc kubenswrapper[4763]: I1205 13:28:39.180047 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx_262113c6-3029-4c0d-8279-e1454a535c24/util/0.log" Dec 05 13:28:39 crc kubenswrapper[4763]: I1205 13:28:39.182663 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7zb56" Dec 05 13:28:39 crc kubenswrapper[4763]: I1205 13:28:39.184584 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx_262113c6-3029-4c0d-8279-e1454a535c24/pull/0.log" Dec 05 13:28:39 crc kubenswrapper[4763]: I1205 13:28:39.292999 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx_262113c6-3029-4c0d-8279-e1454a535c24/pull/0.log" Dec 05 13:28:39 crc kubenswrapper[4763]: I1205 13:28:39.472341 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx_262113c6-3029-4c0d-8279-e1454a535c24/pull/0.log" Dec 05 13:28:39 crc kubenswrapper[4763]: I1205 13:28:39.479168 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx_262113c6-3029-4c0d-8279-e1454a535c24/util/0.log" Dec 05 13:28:39 crc kubenswrapper[4763]: I1205 13:28:39.531646 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_10518271d11356773275a244d183abb577b9dea821837b29079f1397f0xq6xx_262113c6-3029-4c0d-8279-e1454a535c24/extract/0.log" Dec 05 13:28:39 crc kubenswrapper[4763]: I1205 13:28:39.679896 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7zb56"] Dec 05 13:28:39 crc kubenswrapper[4763]: I1205 13:28:39.684332 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-f92rg_d7dd9586-7cc5-42f0-87a8-3a8c54557b21/kube-rbac-proxy/0.log" Dec 05 13:28:39 crc kubenswrapper[4763]: I1205 13:28:39.718247 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-f92rg_d7dd9586-7cc5-42f0-87a8-3a8c54557b21/manager/0.log" Dec 05 13:28:39 crc kubenswrapper[4763]: I1205 13:28:39.797253 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-w8w7f_e97d9ee8-0c07-486a-84f1-dabddb037a8b/kube-rbac-proxy/0.log" Dec 05 13:28:39 crc kubenswrapper[4763]: I1205 13:28:39.978288 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-w8w7f_e97d9ee8-0c07-486a-84f1-dabddb037a8b/manager/0.log" Dec 05 13:28:40 crc kubenswrapper[4763]: I1205 13:28:40.232511 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4mdln"] Dec 05 13:28:40 crc kubenswrapper[4763]: I1205 13:28:40.235118 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4mdln" Dec 05 13:28:40 crc kubenswrapper[4763]: I1205 13:28:40.244672 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4mdln"] Dec 05 13:28:40 crc kubenswrapper[4763]: I1205 13:28:40.307595 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-8gw8h_f9a5212c-2ddb-4e82-818e-5102fb3c5ee2/manager/0.log" Dec 05 13:28:40 crc kubenswrapper[4763]: I1205 13:28:40.373236 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-8gw8h_f9a5212c-2ddb-4e82-818e-5102fb3c5ee2/kube-rbac-proxy/0.log" Dec 05 13:28:40 crc kubenswrapper[4763]: I1205 13:28:40.375860 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk4fp\" (UniqueName: \"kubernetes.io/projected/f992e4d2-b6e9-4949-92d4-98ad36e37e1f-kube-api-access-qk4fp\") pod \"community-operators-4mdln\" (UID: \"f992e4d2-b6e9-4949-92d4-98ad36e37e1f\") " pod="openshift-marketplace/community-operators-4mdln" Dec 05 13:28:40 crc kubenswrapper[4763]: I1205 13:28:40.375926 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f992e4d2-b6e9-4949-92d4-98ad36e37e1f-utilities\") pod \"community-operators-4mdln\" (UID: \"f992e4d2-b6e9-4949-92d4-98ad36e37e1f\") " pod="openshift-marketplace/community-operators-4mdln" Dec 05 13:28:40 crc kubenswrapper[4763]: I1205 13:28:40.375974 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f992e4d2-b6e9-4949-92d4-98ad36e37e1f-catalog-content\") pod \"community-operators-4mdln\" (UID: \"f992e4d2-b6e9-4949-92d4-98ad36e37e1f\") " pod="openshift-marketplace/community-operators-4mdln" Dec 05 13:28:40 crc kubenswrapper[4763]: I1205 13:28:40.478690 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk4fp\" (UniqueName: \"kubernetes.io/projected/f992e4d2-b6e9-4949-92d4-98ad36e37e1f-kube-api-access-qk4fp\") pod \"community-operators-4mdln\" (UID: \"f992e4d2-b6e9-4949-92d4-98ad36e37e1f\") " pod="openshift-marketplace/community-operators-4mdln" Dec 05 13:28:40 crc kubenswrapper[4763]: I1205 13:28:40.479025 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f992e4d2-b6e9-4949-92d4-98ad36e37e1f-utilities\") pod \"community-operators-4mdln\" (UID: \"f992e4d2-b6e9-4949-92d4-98ad36e37e1f\") " pod="openshift-marketplace/community-operators-4mdln" Dec 05 13:28:40 crc kubenswrapper[4763]: I1205 13:28:40.479050 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f992e4d2-b6e9-4949-92d4-98ad36e37e1f-catalog-content\") pod \"community-operators-4mdln\" (UID: 
\"f992e4d2-b6e9-4949-92d4-98ad36e37e1f\") " pod="openshift-marketplace/community-operators-4mdln" Dec 05 13:28:40 crc kubenswrapper[4763]: I1205 13:28:40.479610 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f992e4d2-b6e9-4949-92d4-98ad36e37e1f-utilities\") pod \"community-operators-4mdln\" (UID: \"f992e4d2-b6e9-4949-92d4-98ad36e37e1f\") " pod="openshift-marketplace/community-operators-4mdln" Dec 05 13:28:40 crc kubenswrapper[4763]: I1205 13:28:40.479636 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f992e4d2-b6e9-4949-92d4-98ad36e37e1f-catalog-content\") pod \"community-operators-4mdln\" (UID: \"f992e4d2-b6e9-4949-92d4-98ad36e37e1f\") " pod="openshift-marketplace/community-operators-4mdln" Dec 05 13:28:40 crc kubenswrapper[4763]: I1205 13:28:40.505347 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk4fp\" (UniqueName: \"kubernetes.io/projected/f992e4d2-b6e9-4949-92d4-98ad36e37e1f-kube-api-access-qk4fp\") pod \"community-operators-4mdln\" (UID: \"f992e4d2-b6e9-4949-92d4-98ad36e37e1f\") " pod="openshift-marketplace/community-operators-4mdln" Dec 05 13:28:40 crc kubenswrapper[4763]: I1205 13:28:40.555321 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-77g97_94ae68ae-93ae-43a6-89fa-5b2301808793/kube-rbac-proxy/0.log" Dec 05 13:28:40 crc kubenswrapper[4763]: I1205 13:28:40.565470 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4mdln" Dec 05 13:28:40 crc kubenswrapper[4763]: I1205 13:28:40.704380 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-77g97_94ae68ae-93ae-43a6-89fa-5b2301808793/manager/0.log" Dec 05 13:28:40 crc kubenswrapper[4763]: I1205 13:28:40.724308 4763 generic.go:334] "Generic (PLEG): container finished" podID="a9a7bb7d-286a-4046-a017-371a0d0d4789" containerID="89312462ebb8ffb469dd3db574a45fd2c678618098d12126efee1fe68e4e3dea" exitCode=0 Dec 05 13:28:40 crc kubenswrapper[4763]: I1205 13:28:40.724351 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7zb56" event={"ID":"a9a7bb7d-286a-4046-a017-371a0d0d4789","Type":"ContainerDied","Data":"89312462ebb8ffb469dd3db574a45fd2c678618098d12126efee1fe68e4e3dea"} Dec 05 13:28:40 crc kubenswrapper[4763]: I1205 13:28:40.724375 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7zb56" event={"ID":"a9a7bb7d-286a-4046-a017-371a0d0d4789","Type":"ContainerStarted","Data":"e0b99d9d9cab6b631b96944a5fb34d406683623e82fe37ea338c7741eb35a2a2"} Dec 05 13:28:40 crc kubenswrapper[4763]: I1205 13:28:40.955735 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-mq9f4_42f31714-9ede-4c48-b611-028a79374fad/kube-rbac-proxy/0.log" Dec 05 13:28:41 crc kubenswrapper[4763]: I1205 13:28:41.109351 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-mq9f4_42f31714-9ede-4c48-b611-028a79374fad/manager/0.log" Dec 05 13:28:41 crc kubenswrapper[4763]: I1205 13:28:41.158824 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-gbf44_2bed16d5-ec79-4ad7-8984-b965fa568dc6/kube-rbac-proxy/0.log" Dec 05 13:28:41 crc kubenswrapper[4763]: I1205 13:28:41.185380 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-gbf44_2bed16d5-ec79-4ad7-8984-b965fa568dc6/manager/0.log" Dec 05 13:28:41 crc kubenswrapper[4763]: I1205 13:28:41.256899 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4mdln"] Dec 05 13:28:41 crc kubenswrapper[4763]: W1205 13:28:41.276600 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf992e4d2_b6e9_4949_92d4_98ad36e37e1f.slice/crio-cf5b6dd5a93d492dfd5cdbd41135d4efaa759120c1ebe32979c3345ceed64294 WatchSource:0}: Error finding container cf5b6dd5a93d492dfd5cdbd41135d4efaa759120c1ebe32979c3345ceed64294: Status 404 returned error can't find the container with id cf5b6dd5a93d492dfd5cdbd41135d4efaa759120c1ebe32979c3345ceed64294 Dec 05 13:28:41 crc kubenswrapper[4763]: I1205 13:28:41.330835 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-p5jgv_f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6/kube-rbac-proxy/0.log" Dec 05 13:28:41 crc kubenswrapper[4763]: I1205 13:28:41.525723 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-2bgt8_ed1b8d49-d742-4493-bb7e-856b4108fb88/kube-rbac-proxy/0.log" Dec 05 13:28:41 crc kubenswrapper[4763]: I1205 13:28:41.548090 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-p5jgv_f1fb13e0-bcd9-4cfe-be8a-a33f0e452fb6/manager/0.log" Dec 05 13:28:41 crc kubenswrapper[4763]: I1205 13:28:41.572263 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-2bgt8_ed1b8d49-d742-4493-bb7e-856b4108fb88/manager/0.log" Dec 05 13:28:41 crc kubenswrapper[4763]: I1205 13:28:41.737957 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7zb56" event={"ID":"a9a7bb7d-286a-4046-a017-371a0d0d4789","Type":"ContainerStarted","Data":"3076b046a2ff4e29ba0952643df344b93dbb5a1a7d41bc7f75a72967861552d8"} Dec 05 13:28:41 crc kubenswrapper[4763]: I1205 13:28:41.739937 4763 generic.go:334] "Generic (PLEG): container finished" podID="f992e4d2-b6e9-4949-92d4-98ad36e37e1f" containerID="e87b2c42e127940979c0877333ed554f61776159be751787082f2bdad5aecdfa" exitCode=0 Dec 05 13:28:41 crc kubenswrapper[4763]: I1205 13:28:41.740037 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4mdln" event={"ID":"f992e4d2-b6e9-4949-92d4-98ad36e37e1f","Type":"ContainerDied","Data":"e87b2c42e127940979c0877333ed554f61776159be751787082f2bdad5aecdfa"} Dec 05 13:28:41 crc kubenswrapper[4763]: I1205 13:28:41.740280 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4mdln" event={"ID":"f992e4d2-b6e9-4949-92d4-98ad36e37e1f","Type":"ContainerStarted","Data":"cf5b6dd5a93d492dfd5cdbd41135d4efaa759120c1ebe32979c3345ceed64294"} Dec 05 13:28:41 crc kubenswrapper[4763]: I1205 13:28:41.771793 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-gx4vn_6520d187-9c6c-4b0e-b0c9-27e23db84f4c/kube-rbac-proxy/0.log" Dec 05 13:28:41 crc kubenswrapper[4763]: I1205 13:28:41.846439 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-gx4vn_6520d187-9c6c-4b0e-b0c9-27e23db84f4c/manager/0.log" Dec 05 13:28:41 crc kubenswrapper[4763]: I1205 13:28:41.977612 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-sq9wf_eb3c8b38-a863-42d0-b7d8-03231971e4ce/kube-rbac-proxy/0.log" Dec 05 13:28:41 crc kubenswrapper[4763]: I1205 13:28:41.993986 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-sq9wf_eb3c8b38-a863-42d0-b7d8-03231971e4ce/manager/0.log" Dec 05 13:28:42 crc kubenswrapper[4763]: I1205 13:28:42.047405 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-b9w6n_cd42325e-d26d-4cb6-b8dd-f75dc86e7568/kube-rbac-proxy/0.log" Dec 05 13:28:42 crc kubenswrapper[4763]: I1205 13:28:42.164279 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-b9w6n_cd42325e-d26d-4cb6-b8dd-f75dc86e7568/manager/0.log" Dec 05 13:28:42 crc kubenswrapper[4763]: I1205 13:28:42.266700 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-vqmk4_4faed118-8b9d-4adb-8f86-6a6be8061bce/kube-rbac-proxy/0.log" Dec 05 13:28:42 crc kubenswrapper[4763]: I1205 13:28:42.356836 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-vqmk4_4faed118-8b9d-4adb-8f86-6a6be8061bce/manager/0.log" Dec 05 13:28:42 crc kubenswrapper[4763]: I1205 13:28:42.445457 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-mnthh_378cb9d9-8010-4dcf-9297-5e4f0679086e/kube-rbac-proxy/0.log" Dec 05 13:28:42 crc kubenswrapper[4763]: I1205 13:28:42.579046 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-mnthh_378cb9d9-8010-4dcf-9297-5e4f0679086e/manager/0.log" Dec 05 13:28:42 crc kubenswrapper[4763]: I1205 13:28:42.599436 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-6zk6f_fcc46489-05d5-4219-9e45-6ca25f25900f/kube-rbac-proxy/0.log" Dec 05 13:28:42 crc kubenswrapper[4763]: I1205 13:28:42.642532 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-6zk6f_fcc46489-05d5-4219-9e45-6ca25f25900f/manager/0.log" Dec 05 13:28:42 crc kubenswrapper[4763]: I1205 13:28:42.749728 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs_3fdf0ecb-215d-4a02-8053-169fcbfefa50/kube-rbac-proxy/0.log" Dec 05 13:28:42 crc kubenswrapper[4763]: I1205 13:28:42.750619 4763 generic.go:334] "Generic (PLEG): container finished" podID="a9a7bb7d-286a-4046-a017-371a0d0d4789" containerID="3076b046a2ff4e29ba0952643df344b93dbb5a1a7d41bc7f75a72967861552d8" exitCode=0 Dec 05 13:28:42 crc kubenswrapper[4763]: I1205 13:28:42.750673 4763 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7zb56" event={"ID":"a9a7bb7d-286a-4046-a017-371a0d0d4789","Type":"ContainerDied","Data":"3076b046a2ff4e29ba0952643df344b93dbb5a1a7d41bc7f75a72967861552d8"} Dec 05 13:28:42 crc kubenswrapper[4763]: I1205 13:28:42.828449 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4cvzcs_3fdf0ecb-215d-4a02-8053-169fcbfefa50/manager/0.log" Dec 05 13:28:43 crc kubenswrapper[4763]: I1205 13:28:43.214291 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-654b7bd4cc-79gh5_9b87ef42-73e9-40c4-a64b-381de978398c/operator/0.log" Dec 05 13:28:43 crc kubenswrapper[4763]: I1205 13:28:43.275407 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-jp2ck_522dda98-66e1-4ced-b504-e957eb00cda2/registry-server/0.log" Dec 05 13:28:43 crc kubenswrapper[4763]: I1205 13:28:43.524610 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-mch9f_63e1e64f-8414-4da8-8a32-5f0a0041c5ff/kube-rbac-proxy/0.log" Dec 05 13:28:43 crc kubenswrapper[4763]: I1205 13:28:43.541606 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-mch9f_63e1e64f-8414-4da8-8a32-5f0a0041c5ff/manager/0.log" Dec 05 13:28:43 crc kubenswrapper[4763]: I1205 13:28:43.602935 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-gzx9n_a05e3d8d-f58a-44f0-b3c9-e212cdcec438/kube-rbac-proxy/0.log" Dec 05 13:28:43 crc kubenswrapper[4763]: I1205 13:28:43.764798 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7zb56" event={"ID":"a9a7bb7d-286a-4046-a017-371a0d0d4789","Type":"ContainerStarted","Data":"d75fa20c0cb6e95e59bf57bf2f53945a86b2468c16e76b46959e9232a6c180a6"} Dec 05 13:28:43 crc kubenswrapper[4763]: I1205 13:28:43.772545 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4mdln" event={"ID":"f992e4d2-b6e9-4949-92d4-98ad36e37e1f","Type":"ContainerStarted","Data":"cecf14154ce0942434f3b74924498c95df90a1343c4f31769d89360036087be1"} Dec 05 13:28:43 crc kubenswrapper[4763]: I1205 13:28:43.780097 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-gzx9n_a05e3d8d-f58a-44f0-b3c9-e212cdcec438/manager/0.log" Dec 05 13:28:43 crc kubenswrapper[4763]: I1205 13:28:43.794254 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7zb56" podStartSLOduration=3.315191933 podStartE2EDuration="5.794233791s" podCreationTimestamp="2025-12-05 13:28:38 +0000 UTC" firstStartedPulling="2025-12-05 13:28:40.72869688 +0000 UTC m=+6005.221411603" lastFinishedPulling="2025-12-05 13:28:43.207738738 +0000 UTC m=+6007.700453461" observedRunningTime="2025-12-05 13:28:43.791691953 +0000 UTC m=+6008.284406676" watchObservedRunningTime="2025-12-05 13:28:43.794233791 +0000 UTC m=+6008.286948524" Dec 05 13:28:43 crc kubenswrapper[4763]: I1205 13:28:43.886937 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-n8fzm_01d1c35a-adc3-4945-92b5-5921600cb826/operator/0.log" Dec 05 13:28:44 crc kubenswrapper[4763]: I1205 13:28:44.062169 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-phnl7_2024cb36-8175-4993-bd5b-a57a8fb8416c/kube-rbac-proxy/0.log" Dec 05 13:28:44 crc kubenswrapper[4763]: I1205 13:28:44.129474 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-phnl7_2024cb36-8175-4993-bd5b-a57a8fb8416c/manager/0.log" Dec 05 13:28:44 crc kubenswrapper[4763]: I1205 13:28:44.145080 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7d6cc4d8dc-wf9nm_43f191ee-e0a3-4d9e-a63a-c9b7a626806f/manager/0.log" Dec 05 13:28:44 crc kubenswrapper[4763]: I1205 13:28:44.187867 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-xkl6w_b64b19c9-3601-4790-addf-c9a32f6c29fe/kube-rbac-proxy/0.log" Dec 05 13:28:44 crc kubenswrapper[4763]: I1205 13:28:44.361589 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-ht96c_0a36f8ad-7e41-4005-a42e-47b9a30af62f/kube-rbac-proxy/0.log" Dec 05 13:28:44 crc kubenswrapper[4763]: I1205 13:28:44.429463 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-ht96c_0a36f8ad-7e41-4005-a42e-47b9a30af62f/manager/0.log" Dec 05 13:28:44 crc kubenswrapper[4763]: I1205 13:28:44.491439 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-xkl6w_b64b19c9-3601-4790-addf-c9a32f6c29fe/manager/0.log" Dec 05 13:28:44 crc kubenswrapper[4763]: I1205 13:28:44.606934 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-66974974bb-mjwrw_36e19ef2-df0d-43ca-8477-f1cec2182b45/kube-rbac-proxy/0.log" Dec 05 13:28:44 crc kubenswrapper[4763]: I1205 13:28:44.668584 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-66974974bb-mjwrw_36e19ef2-df0d-43ca-8477-f1cec2182b45/manager/0.log" Dec 05 13:28:44 crc kubenswrapper[4763]: I1205 13:28:44.783397 4763 generic.go:334] "Generic (PLEG): container finished" podID="f992e4d2-b6e9-4949-92d4-98ad36e37e1f" containerID="cecf14154ce0942434f3b74924498c95df90a1343c4f31769d89360036087be1" exitCode=0 Dec 05 13:28:44 crc kubenswrapper[4763]: I1205 13:28:44.783461 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4mdln" event={"ID":"f992e4d2-b6e9-4949-92d4-98ad36e37e1f","Type":"ContainerDied","Data":"cecf14154ce0942434f3b74924498c95df90a1343c4f31769d89360036087be1"} Dec 05 13:28:46 crc kubenswrapper[4763]: I1205 13:28:46.784870 4763 scope.go:117] "RemoveContainer" containerID="4fbfcf01e5e6efe52192419ee523afb90c3fb0f297785aa0b70f01b32f38e5b9" Dec 05 13:28:46 crc kubenswrapper[4763]: E1205 13:28:46.785393 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:28:46 crc kubenswrapper[4763]: I1205 13:28:46.803688 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4mdln" event={"ID":"f992e4d2-b6e9-4949-92d4-98ad36e37e1f","Type":"ContainerStarted","Data":"1341310edd834a6df2cd0158c98bd02c09fb7bfe98a7f3ce46eb9b2dfdcfe358"} Dec 05 13:28:46 crc kubenswrapper[4763]: I1205 13:28:46.822548 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4mdln" podStartSLOduration=2.613046434 podStartE2EDuration="6.822516313s" podCreationTimestamp="2025-12-05 13:28:40 +0000 UTC" firstStartedPulling="2025-12-05 13:28:41.741820096 +0000 UTC m=+6006.234534819" lastFinishedPulling="2025-12-05 13:28:45.951289935 +0000 UTC m=+6010.444004698" observedRunningTime="2025-12-05 13:28:46.820289433 +0000 UTC m=+6011.313004156" watchObservedRunningTime="2025-12-05 13:28:46.822516313 +0000 UTC m=+6011.315231026" Dec 05 13:28:49 crc kubenswrapper[4763]: I1205 13:28:49.182951 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7zb56" Dec 05 13:28:49 crc kubenswrapper[4763]: I1205 13:28:49.183510 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7zb56" Dec 05 13:28:49 crc kubenswrapper[4763]: I1205 13:28:49.236543 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7zb56" Dec 05 13:28:49 crc kubenswrapper[4763]: I1205 13:28:49.881327 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7zb56" Dec 05 13:28:50 crc kubenswrapper[4763]: I1205 13:28:50.566168 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4mdln" Dec 05 13:28:50 crc kubenswrapper[4763]: I1205 13:28:50.566232 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4mdln" Dec 05 13:28:50 crc kubenswrapper[4763]: I1205 13:28:50.620322 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4mdln" Dec 05 13:28:51 crc kubenswrapper[4763]: I1205 13:28:51.630270 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hpptj"] Dec 05 13:28:51 crc kubenswrapper[4763]: I1205 13:28:51.632290 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hpptj" Dec 05 13:28:51 crc kubenswrapper[4763]: I1205 13:28:51.653180 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hpptj"] Dec 05 13:28:51 crc kubenswrapper[4763]: I1205 13:28:51.797823 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b579ff1-5985-4ccf-9e6a-4af23b482be4-catalog-content\") pod \"certified-operators-hpptj\" (UID: \"5b579ff1-5985-4ccf-9e6a-4af23b482be4\") " pod="openshift-marketplace/certified-operators-hpptj" Dec 05 13:28:51 crc kubenswrapper[4763]: I1205 13:28:51.797917 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b579ff1-5985-4ccf-9e6a-4af23b482be4-utilities\") pod \"certified-operators-hpptj\" (UID: \"5b579ff1-5985-4ccf-9e6a-4af23b482be4\") " pod="openshift-marketplace/certified-operators-hpptj" Dec 05 13:28:51 crc kubenswrapper[4763]: I1205 13:28:51.797976 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2lm5\" (UniqueName: \"kubernetes.io/projected/5b579ff1-5985-4ccf-9e6a-4af23b482be4-kube-api-access-f2lm5\") pod \"certified-operators-hpptj\" (UID: \"5b579ff1-5985-4ccf-9e6a-4af23b482be4\") " pod="openshift-marketplace/certified-operators-hpptj" Dec 05 13:28:51 crc kubenswrapper[4763]: I1205 13:28:51.899430 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b579ff1-5985-4ccf-9e6a-4af23b482be4-utilities\") pod \"certified-operators-hpptj\" (UID: \"5b579ff1-5985-4ccf-9e6a-4af23b482be4\") " pod="openshift-marketplace/certified-operators-hpptj" Dec 05 13:28:51 crc kubenswrapper[4763]: I1205 13:28:51.899781 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2lm5\" (UniqueName: \"kubernetes.io/projected/5b579ff1-5985-4ccf-9e6a-4af23b482be4-kube-api-access-f2lm5\") pod \"certified-operators-hpptj\" (UID: \"5b579ff1-5985-4ccf-9e6a-4af23b482be4\") " pod="openshift-marketplace/certified-operators-hpptj" Dec 05 13:28:51 crc kubenswrapper[4763]: I1205 13:28:51.899921 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b579ff1-5985-4ccf-9e6a-4af23b482be4-catalog-content\") pod \"certified-operators-hpptj\" (UID: \"5b579ff1-5985-4ccf-9e6a-4af23b482be4\") " pod="openshift-marketplace/certified-operators-hpptj" Dec 05 13:28:51 crc kubenswrapper[4763]: I1205 13:28:51.900665 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b579ff1-5985-4ccf-9e6a-4af23b482be4-utilities\") pod \"certified-operators-hpptj\" (UID: \"5b579ff1-5985-4ccf-9e6a-4af23b482be4\") " pod="openshift-marketplace/certified-operators-hpptj" Dec 05 13:28:51 crc kubenswrapper[4763]: I1205 13:28:51.900828 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b579ff1-5985-4ccf-9e6a-4af23b482be4-catalog-content\") pod \"certified-operators-hpptj\" (UID: \"5b579ff1-5985-4ccf-9e6a-4af23b482be4\") " pod="openshift-marketplace/certified-operators-hpptj" Dec 05 13:28:51 crc kubenswrapper[4763]: I1205 13:28:51.929839 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-f2lm5\" (UniqueName: \"kubernetes.io/projected/5b579ff1-5985-4ccf-9e6a-4af23b482be4-kube-api-access-f2lm5\") pod \"certified-operators-hpptj\" (UID: \"5b579ff1-5985-4ccf-9e6a-4af23b482be4\") " pod="openshift-marketplace/certified-operators-hpptj" Dec 05 13:28:51 crc kubenswrapper[4763]: I1205 13:28:51.949165 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hpptj" Dec 05 13:28:52 crc kubenswrapper[4763]: I1205 13:28:52.506986 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hpptj"] Dec 05 13:28:52 crc kubenswrapper[4763]: I1205 13:28:52.870380 4763 generic.go:334] "Generic (PLEG): container finished" podID="5b579ff1-5985-4ccf-9e6a-4af23b482be4" containerID="68b1506911fe03425d66e14c08b878eba5208ba3ac28c867f9dd2f8693dd3988" exitCode=0 Dec 05 13:28:52 crc kubenswrapper[4763]: I1205 13:28:52.870424 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpptj" event={"ID":"5b579ff1-5985-4ccf-9e6a-4af23b482be4","Type":"ContainerDied","Data":"68b1506911fe03425d66e14c08b878eba5208ba3ac28c867f9dd2f8693dd3988"} Dec 05 13:28:52 crc kubenswrapper[4763]: I1205 13:28:52.870652 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpptj" event={"ID":"5b579ff1-5985-4ccf-9e6a-4af23b482be4","Type":"ContainerStarted","Data":"cab56e7821e778df940c8f2e01dcb2ff5985134affd1a939920d630c57f01a18"} Dec 05 13:28:54 crc kubenswrapper[4763]: I1205 13:28:54.420933 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7zb56"] Dec 05 13:28:54 crc kubenswrapper[4763]: I1205 13:28:54.421392 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7zb56" podUID="a9a7bb7d-286a-4046-a017-371a0d0d4789" containerName="registry-server" containerID="cri-o://d75fa20c0cb6e95e59bf57bf2f53945a86b2468c16e76b46959e9232a6c180a6" gracePeriod=2 Dec 05 13:28:54 crc kubenswrapper[4763]: I1205 13:28:54.888450 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpptj" event={"ID":"5b579ff1-5985-4ccf-9e6a-4af23b482be4","Type":"ContainerStarted","Data":"a0f165bb649c47ee345db9b8197dee747a0e0a5ffe9f5d0af0b90072b0859ea3"} Dec 05 13:28:56 crc kubenswrapper[4763]: I1205 13:28:56.909259 4763 generic.go:334] "Generic (PLEG): container finished" podID="a9a7bb7d-286a-4046-a017-371a0d0d4789" containerID="d75fa20c0cb6e95e59bf57bf2f53945a86b2468c16e76b46959e9232a6c180a6" exitCode=0 Dec 05 13:28:56 crc kubenswrapper[4763]: I1205 13:28:56.909337 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7zb56" event={"ID":"a9a7bb7d-286a-4046-a017-371a0d0d4789","Type":"ContainerDied","Data":"d75fa20c0cb6e95e59bf57bf2f53945a86b2468c16e76b46959e9232a6c180a6"} Dec 05 13:28:56 crc kubenswrapper[4763]: I1205 13:28:56.911587 4763 generic.go:334] "Generic (PLEG): container finished" podID="5b579ff1-5985-4ccf-9e6a-4af23b482be4" containerID="a0f165bb649c47ee345db9b8197dee747a0e0a5ffe9f5d0af0b90072b0859ea3" exitCode=0 Dec 05 13:28:56 crc kubenswrapper[4763]: I1205 13:28:56.911633 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpptj" 
event={"ID":"5b579ff1-5985-4ccf-9e6a-4af23b482be4","Type":"ContainerDied","Data":"a0f165bb649c47ee345db9b8197dee747a0e0a5ffe9f5d0af0b90072b0859ea3"} Dec 05 13:28:57 crc kubenswrapper[4763]: I1205 13:28:57.607031 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7zb56" Dec 05 13:28:57 crc kubenswrapper[4763]: I1205 13:28:57.716646 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9a7bb7d-286a-4046-a017-371a0d0d4789-catalog-content\") pod \"a9a7bb7d-286a-4046-a017-371a0d0d4789\" (UID: \"a9a7bb7d-286a-4046-a017-371a0d0d4789\") " Dec 05 13:28:57 crc kubenswrapper[4763]: I1205 13:28:57.716774 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9a7bb7d-286a-4046-a017-371a0d0d4789-utilities\") pod \"a9a7bb7d-286a-4046-a017-371a0d0d4789\" (UID: \"a9a7bb7d-286a-4046-a017-371a0d0d4789\") " Dec 05 13:28:57 crc kubenswrapper[4763]: I1205 13:28:57.716848 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnblg\" (UniqueName: \"kubernetes.io/projected/a9a7bb7d-286a-4046-a017-371a0d0d4789-kube-api-access-jnblg\") pod \"a9a7bb7d-286a-4046-a017-371a0d0d4789\" (UID: \"a9a7bb7d-286a-4046-a017-371a0d0d4789\") " Dec 05 13:28:57 crc kubenswrapper[4763]: I1205 13:28:57.717461 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9a7bb7d-286a-4046-a017-371a0d0d4789-utilities" (OuterVolumeSpecName: "utilities") pod "a9a7bb7d-286a-4046-a017-371a0d0d4789" (UID: "a9a7bb7d-286a-4046-a017-371a0d0d4789"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:28:57 crc kubenswrapper[4763]: I1205 13:28:57.724470 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9a7bb7d-286a-4046-a017-371a0d0d4789-kube-api-access-jnblg" (OuterVolumeSpecName: "kube-api-access-jnblg") pod "a9a7bb7d-286a-4046-a017-371a0d0d4789" (UID: "a9a7bb7d-286a-4046-a017-371a0d0d4789"). InnerVolumeSpecName "kube-api-access-jnblg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:28:57 crc kubenswrapper[4763]: I1205 13:28:57.729670 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9a7bb7d-286a-4046-a017-371a0d0d4789-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a9a7bb7d-286a-4046-a017-371a0d0d4789" (UID: "a9a7bb7d-286a-4046-a017-371a0d0d4789"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:28:57 crc kubenswrapper[4763]: I1205 13:28:57.818612 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9a7bb7d-286a-4046-a017-371a0d0d4789-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 13:28:57 crc kubenswrapper[4763]: I1205 13:28:57.818857 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9a7bb7d-286a-4046-a017-371a0d0d4789-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 13:28:57 crc kubenswrapper[4763]: I1205 13:28:57.818867 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnblg\" (UniqueName: \"kubernetes.io/projected/a9a7bb7d-286a-4046-a017-371a0d0d4789-kube-api-access-jnblg\") on node \"crc\" DevicePath \"\"" Dec 05 13:28:57 crc kubenswrapper[4763]: I1205 13:28:57.923851 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7zb56" event={"ID":"a9a7bb7d-286a-4046-a017-371a0d0d4789","Type":"ContainerDied","Data":"e0b99d9d9cab6b631b96944a5fb34d406683623e82fe37ea338c7741eb35a2a2"} Dec 05 13:28:57 crc kubenswrapper[4763]: I1205 13:28:57.924213 4763 scope.go:117] "RemoveContainer" containerID="d75fa20c0cb6e95e59bf57bf2f53945a86b2468c16e76b46959e9232a6c180a6" Dec 05 13:28:57 crc kubenswrapper[4763]: I1205 13:28:57.923922 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7zb56" Dec 05 13:28:57 crc kubenswrapper[4763]: I1205 13:28:57.926188 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpptj" event={"ID":"5b579ff1-5985-4ccf-9e6a-4af23b482be4","Type":"ContainerStarted","Data":"41731ed9e8b1afbd04d151e18cb2ac6030f969c6655ccbeac8c0faaa71f2d646"} Dec 05 13:28:57 crc kubenswrapper[4763]: I1205 13:28:57.945149 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hpptj" podStartSLOduration=2.377564977 podStartE2EDuration="6.945124885s" podCreationTimestamp="2025-12-05 13:28:51 +0000 UTC" firstStartedPulling="2025-12-05 13:28:52.871928896 +0000 UTC m=+6017.364643619" lastFinishedPulling="2025-12-05 13:28:57.439488804 +0000 UTC m=+6021.932203527" observedRunningTime="2025-12-05 13:28:57.942462853 +0000 UTC m=+6022.435177586" watchObservedRunningTime="2025-12-05 13:28:57.945124885 +0000 UTC m=+6022.437839608" Dec 05 13:28:57 crc kubenswrapper[4763]: I1205 13:28:57.957783 4763 scope.go:117] "RemoveContainer" containerID="3076b046a2ff4e29ba0952643df344b93dbb5a1a7d41bc7f75a72967861552d8" Dec 05 13:28:57 crc kubenswrapper[4763]: I1205 13:28:57.979832 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7zb56"] Dec 05 13:28:57 crc kubenswrapper[4763]: I1205 13:28:57.988559 4763 scope.go:117] "RemoveContainer" containerID="89312462ebb8ffb469dd3db574a45fd2c678618098d12126efee1fe68e4e3dea" Dec 05 13:28:57 crc kubenswrapper[4763]: I1205 13:28:57.992220 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7zb56"] Dec 05 13:28:59 crc kubenswrapper[4763]: I1205 13:28:59.796570 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9a7bb7d-286a-4046-a017-371a0d0d4789" path="/var/lib/kubelet/pods/a9a7bb7d-286a-4046-a017-371a0d0d4789/volumes" Dec 05 13:29:00 crc kubenswrapper[4763]: I1205 13:29:00.607376 4763 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4mdln" Dec 05 13:29:00 crc kubenswrapper[4763]: I1205 13:29:00.783881 4763 scope.go:117] "RemoveContainer" containerID="4fbfcf01e5e6efe52192419ee523afb90c3fb0f297785aa0b70f01b32f38e5b9" Dec 05 13:29:00 crc kubenswrapper[4763]: E1205 13:29:00.784121 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:29:01 crc kubenswrapper[4763]: I1205 13:29:01.950214 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hpptj" Dec 05 13:29:01 crc kubenswrapper[4763]: I1205 13:29:01.953997 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hpptj" Dec 05 13:29:02 crc kubenswrapper[4763]: I1205 13:29:02.019288 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hpptj" Dec 05 13:29:03 crc kubenswrapper[4763]: I1205 13:29:03.044111 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hpptj" Dec 05 13:29:03 crc kubenswrapper[4763]: I1205 13:29:03.418680 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4mdln"] Dec 05 13:29:03 crc kubenswrapper[4763]: I1205 13:29:03.418917 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4mdln" podUID="f992e4d2-b6e9-4949-92d4-98ad36e37e1f" containerName="registry-server" containerID="cri-o://1341310edd834a6df2cd0158c98bd02c09fb7bfe98a7f3ce46eb9b2dfdcfe358" gracePeriod=2 Dec 05 13:29:03 crc kubenswrapper[4763]: I1205 13:29:03.889560 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4mdln" Dec 05 13:29:03 crc kubenswrapper[4763]: I1205 13:29:03.983330 4763 generic.go:334] "Generic (PLEG): container finished" podID="f992e4d2-b6e9-4949-92d4-98ad36e37e1f" containerID="1341310edd834a6df2cd0158c98bd02c09fb7bfe98a7f3ce46eb9b2dfdcfe358" exitCode=0 Dec 05 13:29:03 crc kubenswrapper[4763]: I1205 13:29:03.983447 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4mdln" event={"ID":"f992e4d2-b6e9-4949-92d4-98ad36e37e1f","Type":"ContainerDied","Data":"1341310edd834a6df2cd0158c98bd02c09fb7bfe98a7f3ce46eb9b2dfdcfe358"} Dec 05 13:29:03 crc kubenswrapper[4763]: I1205 13:29:03.983503 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4mdln" event={"ID":"f992e4d2-b6e9-4949-92d4-98ad36e37e1f","Type":"ContainerDied","Data":"cf5b6dd5a93d492dfd5cdbd41135d4efaa759120c1ebe32979c3345ceed64294"} Dec 05 13:29:03 crc kubenswrapper[4763]: I1205 13:29:03.983523 4763 scope.go:117] "RemoveContainer" containerID="1341310edd834a6df2cd0158c98bd02c09fb7bfe98a7f3ce46eb9b2dfdcfe358" Dec 05 13:29:03 crc kubenswrapper[4763]: I1205 13:29:03.983992 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4mdln" Dec 05 13:29:04 crc kubenswrapper[4763]: I1205 13:29:04.003155 4763 scope.go:117] "RemoveContainer" containerID="cecf14154ce0942434f3b74924498c95df90a1343c4f31769d89360036087be1" Dec 05 13:29:04 crc kubenswrapper[4763]: I1205 13:29:04.030615 4763 scope.go:117] "RemoveContainer" containerID="e87b2c42e127940979c0877333ed554f61776159be751787082f2bdad5aecdfa" Dec 05 13:29:04 crc kubenswrapper[4763]: I1205 13:29:04.037716 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qk4fp\" (UniqueName: \"kubernetes.io/projected/f992e4d2-b6e9-4949-92d4-98ad36e37e1f-kube-api-access-qk4fp\") pod \"f992e4d2-b6e9-4949-92d4-98ad36e37e1f\" (UID: \"f992e4d2-b6e9-4949-92d4-98ad36e37e1f\") " Dec 05 13:29:04 crc kubenswrapper[4763]: I1205 13:29:04.037996 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f992e4d2-b6e9-4949-92d4-98ad36e37e1f-utilities\") pod \"f992e4d2-b6e9-4949-92d4-98ad36e37e1f\" (UID: \"f992e4d2-b6e9-4949-92d4-98ad36e37e1f\") " Dec 05 13:29:04 crc kubenswrapper[4763]: I1205 13:29:04.038030 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f992e4d2-b6e9-4949-92d4-98ad36e37e1f-catalog-content\") pod \"f992e4d2-b6e9-4949-92d4-98ad36e37e1f\" (UID: \"f992e4d2-b6e9-4949-92d4-98ad36e37e1f\") " Dec 05 13:29:04 crc kubenswrapper[4763]: I1205 13:29:04.038587 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f992e4d2-b6e9-4949-92d4-98ad36e37e1f-utilities" (OuterVolumeSpecName: "utilities") pod "f992e4d2-b6e9-4949-92d4-98ad36e37e1f" (UID: "f992e4d2-b6e9-4949-92d4-98ad36e37e1f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:29:04 crc kubenswrapper[4763]: I1205 13:29:04.044498 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f992e4d2-b6e9-4949-92d4-98ad36e37e1f-kube-api-access-qk4fp" (OuterVolumeSpecName: "kube-api-access-qk4fp") pod "f992e4d2-b6e9-4949-92d4-98ad36e37e1f" (UID: "f992e4d2-b6e9-4949-92d4-98ad36e37e1f"). InnerVolumeSpecName "kube-api-access-qk4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:29:04 crc kubenswrapper[4763]: I1205 13:29:04.085383 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f992e4d2-b6e9-4949-92d4-98ad36e37e1f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f992e4d2-b6e9-4949-92d4-98ad36e37e1f" (UID: "f992e4d2-b6e9-4949-92d4-98ad36e37e1f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:29:04 crc kubenswrapper[4763]: I1205 13:29:04.135274 4763 scope.go:117] "RemoveContainer" containerID="1341310edd834a6df2cd0158c98bd02c09fb7bfe98a7f3ce46eb9b2dfdcfe358" Dec 05 13:29:04 crc kubenswrapper[4763]: E1205 13:29:04.135829 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1341310edd834a6df2cd0158c98bd02c09fb7bfe98a7f3ce46eb9b2dfdcfe358\": container with ID starting with 1341310edd834a6df2cd0158c98bd02c09fb7bfe98a7f3ce46eb9b2dfdcfe358 not found: ID does not exist" containerID="1341310edd834a6df2cd0158c98bd02c09fb7bfe98a7f3ce46eb9b2dfdcfe358" Dec 05 13:29:04 crc kubenswrapper[4763]: I1205 13:29:04.135873 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1341310edd834a6df2cd0158c98bd02c09fb7bfe98a7f3ce46eb9b2dfdcfe358"} err="failed to get container status \"1341310edd834a6df2cd0158c98bd02c09fb7bfe98a7f3ce46eb9b2dfdcfe358\": rpc error: code = NotFound desc = could not find container \"1341310edd834a6df2cd0158c98bd02c09fb7bfe98a7f3ce46eb9b2dfdcfe358\": container with ID starting with 1341310edd834a6df2cd0158c98bd02c09fb7bfe98a7f3ce46eb9b2dfdcfe358 not found: ID does not exist" Dec 05 13:29:04 crc kubenswrapper[4763]: I1205 13:29:04.135902 4763 scope.go:117] "RemoveContainer" containerID="cecf14154ce0942434f3b74924498c95df90a1343c4f31769d89360036087be1" Dec 05 13:29:04 crc kubenswrapper[4763]: E1205 13:29:04.136289 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cecf14154ce0942434f3b74924498c95df90a1343c4f31769d89360036087be1\": container with ID starting with cecf14154ce0942434f3b74924498c95df90a1343c4f31769d89360036087be1 not found: ID does not exist" containerID="cecf14154ce0942434f3b74924498c95df90a1343c4f31769d89360036087be1" Dec 05 13:29:04 crc kubenswrapper[4763]: I1205 13:29:04.136355 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cecf14154ce0942434f3b74924498c95df90a1343c4f31769d89360036087be1"} err="failed to get container status \"cecf14154ce0942434f3b74924498c95df90a1343c4f31769d89360036087be1\": rpc error: code = NotFound desc = could not find container \"cecf14154ce0942434f3b74924498c95df90a1343c4f31769d89360036087be1\": container with ID starting with cecf14154ce0942434f3b74924498c95df90a1343c4f31769d89360036087be1 not found: ID does not exist" Dec 05 13:29:04 crc kubenswrapper[4763]: I1205 13:29:04.136387 4763 scope.go:117] "RemoveContainer" containerID="e87b2c42e127940979c0877333ed554f61776159be751787082f2bdad5aecdfa" Dec 05 13:29:04 crc kubenswrapper[4763]: E1205 13:29:04.136786 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e87b2c42e127940979c0877333ed554f61776159be751787082f2bdad5aecdfa\": container with ID starting with e87b2c42e127940979c0877333ed554f61776159be751787082f2bdad5aecdfa not found: ID does not exist" containerID="e87b2c42e127940979c0877333ed554f61776159be751787082f2bdad5aecdfa" Dec 05 13:29:04 crc kubenswrapper[4763]: I1205 13:29:04.136832 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e87b2c42e127940979c0877333ed554f61776159be751787082f2bdad5aecdfa"} err="failed to get container status \"e87b2c42e127940979c0877333ed554f61776159be751787082f2bdad5aecdfa\": rpc error: code = NotFound desc = could not 
find container \"e87b2c42e127940979c0877333ed554f61776159be751787082f2bdad5aecdfa\": container with ID starting with e87b2c42e127940979c0877333ed554f61776159be751787082f2bdad5aecdfa not found: ID does not exist" Dec 05 13:29:04 crc kubenswrapper[4763]: I1205 13:29:04.140464 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f992e4d2-b6e9-4949-92d4-98ad36e37e1f-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 13:29:04 crc kubenswrapper[4763]: I1205 13:29:04.140490 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f992e4d2-b6e9-4949-92d4-98ad36e37e1f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 13:29:04 crc kubenswrapper[4763]: I1205 13:29:04.140501 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qk4fp\" (UniqueName: \"kubernetes.io/projected/f992e4d2-b6e9-4949-92d4-98ad36e37e1f-kube-api-access-qk4fp\") on node \"crc\" DevicePath \"\"" Dec 05 13:29:04 crc kubenswrapper[4763]: I1205 13:29:04.322677 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4mdln"] Dec 05 13:29:04 crc kubenswrapper[4763]: I1205 13:29:04.335757 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4mdln"] Dec 05 13:29:04 crc kubenswrapper[4763]: I1205 13:29:04.424974 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hpptj"] Dec 05 13:29:04 crc kubenswrapper[4763]: I1205 13:29:04.995504 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hpptj" podUID="5b579ff1-5985-4ccf-9e6a-4af23b482be4" containerName="registry-server" containerID="cri-o://41731ed9e8b1afbd04d151e18cb2ac6030f969c6655ccbeac8c0faaa71f2d646" gracePeriod=2 Dec 05 13:29:05 crc kubenswrapper[4763]: I1205 13:29:05.306258 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-7wmpg_f08e9226-6ec5-4854-9780-0b5e2d8a7ded/control-plane-machine-set-operator/0.log" Dec 05 13:29:05 crc kubenswrapper[4763]: I1205 13:29:05.505047 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-j5ztd_0c9b5acf-ef6a-4bdd-ae32-582a80d711b5/machine-api-operator/0.log" Dec 05 13:29:05 crc kubenswrapper[4763]: I1205 13:29:05.545442 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-j5ztd_0c9b5acf-ef6a-4bdd-ae32-582a80d711b5/kube-rbac-proxy/0.log" Dec 05 13:29:05 crc kubenswrapper[4763]: I1205 13:29:05.800375 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f992e4d2-b6e9-4949-92d4-98ad36e37e1f" path="/var/lib/kubelet/pods/f992e4d2-b6e9-4949-92d4-98ad36e37e1f/volumes" Dec 05 13:29:05 crc kubenswrapper[4763]: I1205 13:29:05.835524 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hpptj" Dec 05 13:29:05 crc kubenswrapper[4763]: I1205 13:29:05.977444 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b579ff1-5985-4ccf-9e6a-4af23b482be4-utilities\") pod \"5b579ff1-5985-4ccf-9e6a-4af23b482be4\" (UID: \"5b579ff1-5985-4ccf-9e6a-4af23b482be4\") " Dec 05 13:29:05 crc kubenswrapper[4763]: I1205 13:29:05.977495 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2lm5\" (UniqueName: \"kubernetes.io/projected/5b579ff1-5985-4ccf-9e6a-4af23b482be4-kube-api-access-f2lm5\") pod \"5b579ff1-5985-4ccf-9e6a-4af23b482be4\" (UID: \"5b579ff1-5985-4ccf-9e6a-4af23b482be4\") " Dec 05 13:29:05 crc kubenswrapper[4763]: I1205 13:29:05.977638 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b579ff1-5985-4ccf-9e6a-4af23b482be4-catalog-content\") pod \"5b579ff1-5985-4ccf-9e6a-4af23b482be4\" (UID: \"5b579ff1-5985-4ccf-9e6a-4af23b482be4\") " Dec 05 13:29:05 crc kubenswrapper[4763]: I1205 13:29:05.979084 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b579ff1-5985-4ccf-9e6a-4af23b482be4-utilities" (OuterVolumeSpecName: "utilities") pod "5b579ff1-5985-4ccf-9e6a-4af23b482be4" (UID: "5b579ff1-5985-4ccf-9e6a-4af23b482be4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:29:05 crc kubenswrapper[4763]: I1205 13:29:05.984536 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b579ff1-5985-4ccf-9e6a-4af23b482be4-kube-api-access-f2lm5" (OuterVolumeSpecName: "kube-api-access-f2lm5") pod "5b579ff1-5985-4ccf-9e6a-4af23b482be4" (UID: "5b579ff1-5985-4ccf-9e6a-4af23b482be4"). InnerVolumeSpecName "kube-api-access-f2lm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:29:06 crc kubenswrapper[4763]: I1205 13:29:06.008566 4763 generic.go:334] "Generic (PLEG): container finished" podID="5b579ff1-5985-4ccf-9e6a-4af23b482be4" containerID="41731ed9e8b1afbd04d151e18cb2ac6030f969c6655ccbeac8c0faaa71f2d646" exitCode=0 Dec 05 13:29:06 crc kubenswrapper[4763]: I1205 13:29:06.008609 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpptj" event={"ID":"5b579ff1-5985-4ccf-9e6a-4af23b482be4","Type":"ContainerDied","Data":"41731ed9e8b1afbd04d151e18cb2ac6030f969c6655ccbeac8c0faaa71f2d646"} Dec 05 13:29:06 crc kubenswrapper[4763]: I1205 13:29:06.008638 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpptj" event={"ID":"5b579ff1-5985-4ccf-9e6a-4af23b482be4","Type":"ContainerDied","Data":"cab56e7821e778df940c8f2e01dcb2ff5985134affd1a939920d630c57f01a18"} Dec 05 13:29:06 crc kubenswrapper[4763]: I1205 13:29:06.008659 4763 scope.go:117] "RemoveContainer" containerID="41731ed9e8b1afbd04d151e18cb2ac6030f969c6655ccbeac8c0faaa71f2d646" Dec 05 13:29:06 crc kubenswrapper[4763]: I1205 13:29:06.008792 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hpptj" Dec 05 13:29:06 crc kubenswrapper[4763]: I1205 13:29:06.045718 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b579ff1-5985-4ccf-9e6a-4af23b482be4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5b579ff1-5985-4ccf-9e6a-4af23b482be4" (UID: "5b579ff1-5985-4ccf-9e6a-4af23b482be4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:29:06 crc kubenswrapper[4763]: I1205 13:29:06.070451 4763 scope.go:117] "RemoveContainer" containerID="a0f165bb649c47ee345db9b8197dee747a0e0a5ffe9f5d0af0b90072b0859ea3" Dec 05 13:29:06 crc kubenswrapper[4763]: I1205 13:29:06.079635 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b579ff1-5985-4ccf-9e6a-4af23b482be4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 13:29:06 crc kubenswrapper[4763]: I1205 13:29:06.079855 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b579ff1-5985-4ccf-9e6a-4af23b482be4-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 13:29:06 crc kubenswrapper[4763]: I1205 13:29:06.079959 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2lm5\" (UniqueName: \"kubernetes.io/projected/5b579ff1-5985-4ccf-9e6a-4af23b482be4-kube-api-access-f2lm5\") on node \"crc\" DevicePath \"\"" Dec 05 13:29:06 crc kubenswrapper[4763]: I1205 13:29:06.102442 4763 scope.go:117] "RemoveContainer" containerID="68b1506911fe03425d66e14c08b878eba5208ba3ac28c867f9dd2f8693dd3988" Dec 05 13:29:06 crc kubenswrapper[4763]: I1205 13:29:06.143646 4763 scope.go:117] "RemoveContainer" containerID="41731ed9e8b1afbd04d151e18cb2ac6030f969c6655ccbeac8c0faaa71f2d646" Dec 05 13:29:06 crc kubenswrapper[4763]: E1205 13:29:06.145164 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41731ed9e8b1afbd04d151e18cb2ac6030f969c6655ccbeac8c0faaa71f2d646\": container with ID starting with 41731ed9e8b1afbd04d151e18cb2ac6030f969c6655ccbeac8c0faaa71f2d646 not found: ID does not exist" containerID="41731ed9e8b1afbd04d151e18cb2ac6030f969c6655ccbeac8c0faaa71f2d646" Dec 05 13:29:06 crc kubenswrapper[4763]: I1205 13:29:06.145206 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41731ed9e8b1afbd04d151e18cb2ac6030f969c6655ccbeac8c0faaa71f2d646"} err="failed to get container status \"41731ed9e8b1afbd04d151e18cb2ac6030f969c6655ccbeac8c0faaa71f2d646\": rpc error: code = NotFound desc = could not find container \"41731ed9e8b1afbd04d151e18cb2ac6030f969c6655ccbeac8c0faaa71f2d646\": container with ID starting with 41731ed9e8b1afbd04d151e18cb2ac6030f969c6655ccbeac8c0faaa71f2d646 not found: ID does not exist" Dec 05 13:29:06 crc kubenswrapper[4763]: I1205 13:29:06.145237 4763 scope.go:117] "RemoveContainer" containerID="a0f165bb649c47ee345db9b8197dee747a0e0a5ffe9f5d0af0b90072b0859ea3" Dec 05 13:29:06 crc kubenswrapper[4763]: E1205 13:29:06.146932 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0f165bb649c47ee345db9b8197dee747a0e0a5ffe9f5d0af0b90072b0859ea3\": container with ID starting with a0f165bb649c47ee345db9b8197dee747a0e0a5ffe9f5d0af0b90072b0859ea3 not found: ID does not exist" 
containerID="a0f165bb649c47ee345db9b8197dee747a0e0a5ffe9f5d0af0b90072b0859ea3" Dec 05 13:29:06 crc kubenswrapper[4763]: I1205 13:29:06.146955 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0f165bb649c47ee345db9b8197dee747a0e0a5ffe9f5d0af0b90072b0859ea3"} err="failed to get container status \"a0f165bb649c47ee345db9b8197dee747a0e0a5ffe9f5d0af0b90072b0859ea3\": rpc error: code = NotFound desc = could not find container \"a0f165bb649c47ee345db9b8197dee747a0e0a5ffe9f5d0af0b90072b0859ea3\": container with ID starting with a0f165bb649c47ee345db9b8197dee747a0e0a5ffe9f5d0af0b90072b0859ea3 not found: ID does not exist" Dec 05 13:29:06 crc kubenswrapper[4763]: I1205 13:29:06.146975 4763 scope.go:117] "RemoveContainer" containerID="68b1506911fe03425d66e14c08b878eba5208ba3ac28c867f9dd2f8693dd3988" Dec 05 13:29:06 crc kubenswrapper[4763]: E1205 13:29:06.150346 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68b1506911fe03425d66e14c08b878eba5208ba3ac28c867f9dd2f8693dd3988\": container with ID starting with 68b1506911fe03425d66e14c08b878eba5208ba3ac28c867f9dd2f8693dd3988 not found: ID does not exist" containerID="68b1506911fe03425d66e14c08b878eba5208ba3ac28c867f9dd2f8693dd3988" Dec 05 13:29:06 crc kubenswrapper[4763]: I1205 13:29:06.150380 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68b1506911fe03425d66e14c08b878eba5208ba3ac28c867f9dd2f8693dd3988"} err="failed to get container status \"68b1506911fe03425d66e14c08b878eba5208ba3ac28c867f9dd2f8693dd3988\": rpc error: code = NotFound desc = could not find container \"68b1506911fe03425d66e14c08b878eba5208ba3ac28c867f9dd2f8693dd3988\": container with ID starting with 68b1506911fe03425d66e14c08b878eba5208ba3ac28c867f9dd2f8693dd3988 not found: ID does not exist" Dec 05 13:29:06 crc kubenswrapper[4763]: I1205 13:29:06.344594 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hpptj"] Dec 05 13:29:06 crc kubenswrapper[4763]: I1205 13:29:06.354114 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hpptj"] Dec 05 13:29:07 crc kubenswrapper[4763]: I1205 13:29:07.802833 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b579ff1-5985-4ccf-9e6a-4af23b482be4" path="/var/lib/kubelet/pods/5b579ff1-5985-4ccf-9e6a-4af23b482be4/volumes" Dec 05 13:29:11 crc kubenswrapper[4763]: I1205 13:29:11.785274 4763 scope.go:117] "RemoveContainer" containerID="4fbfcf01e5e6efe52192419ee523afb90c3fb0f297785aa0b70f01b32f38e5b9" Dec 05 13:29:11 crc kubenswrapper[4763]: E1205 13:29:11.787007 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:29:19 crc kubenswrapper[4763]: I1205 13:29:19.979150 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-dcg55_96780413-b18b-4d1d-a6c4-2bebb60c99c1/cert-manager-controller/0.log" Dec 05 13:29:20 crc kubenswrapper[4763]: I1205 13:29:20.187104 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-5x8rx_6f795519-6cee-426c-8dda-7f96ef62a9a1/cert-manager-webhook/0.log" Dec 05 13:29:20 crc kubenswrapper[4763]: I1205 13:29:20.240091 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-d5g4s_c47efa3a-fd06-4193-921d-11f8f5fb0eff/cert-manager-cainjector/0.log" Dec 05 13:29:26 crc kubenswrapper[4763]: I1205 13:29:26.784181 4763 scope.go:117] "RemoveContainer" containerID="4fbfcf01e5e6efe52192419ee523afb90c3fb0f297785aa0b70f01b32f38e5b9" Dec 05 13:29:26 crc kubenswrapper[4763]: E1205 13:29:26.784986 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:29:33 crc kubenswrapper[4763]: I1205 13:29:33.921444 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-76lxj_a3cf5928-0003-41e3-baf7-670a1f186bde/nmstate-console-plugin/0.log" Dec 05 13:29:34 crc kubenswrapper[4763]: I1205 13:29:34.108638 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-t4n5p_99db9e57-5946-4f9b-8664-d9a7fbff7042/kube-rbac-proxy/0.log" Dec 05 13:29:34 crc kubenswrapper[4763]: I1205 13:29:34.126752 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-t4n5p_99db9e57-5946-4f9b-8664-d9a7fbff7042/nmstate-metrics/0.log" Dec 05 13:29:34 crc kubenswrapper[4763]: I1205 13:29:34.158528 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-p7k9h_3d3db32e-6ad0-4e60-828f-74bdcc4cf6df/nmstate-handler/0.log" Dec 05 13:29:34 crc kubenswrapper[4763]: I1205 13:29:34.327466 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-72lb2_a5db489a-42dd-46c0-825d-5dc7065c9f29/nmstate-operator/0.log" Dec 05 13:29:34 crc kubenswrapper[4763]: I1205 13:29:34.393869 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-4rx2l_73c52155-582b-4ea6-8661-c03a3804fe2e/nmstate-webhook/0.log" Dec 05 13:29:37 crc kubenswrapper[4763]: I1205 13:29:37.784340 4763 scope.go:117] "RemoveContainer" containerID="4fbfcf01e5e6efe52192419ee523afb90c3fb0f297785aa0b70f01b32f38e5b9" Dec 05 13:29:38 crc kubenswrapper[4763]: I1205 13:29:38.338698 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerStarted","Data":"3f799c2e793a1b2bd6da0778b0f00795b959c2faf61439fc3c57c8d3e913b237"} Dec 05 13:29:49 crc kubenswrapper[4763]: I1205 13:29:49.867744 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-scmzk_a34bf611-cb4c-44b4-bdf2-45a656edadc9/kube-rbac-proxy/0.log" Dec 05 13:29:49 crc kubenswrapper[4763]: I1205 13:29:49.973334 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-scmzk_a34bf611-cb4c-44b4-bdf2-45a656edadc9/controller/0.log" Dec 05 13:29:50 crc kubenswrapper[4763]: I1205 13:29:50.139454 4763 log.go:25] "Finished parsing 
log file" path="/var/log/pods/metallb-system_frr-k8s-ms42j_537c61d5-e548-4c96-b7ed-24fcc061e9ac/cp-frr-files/0.log" Dec 05 13:29:50 crc kubenswrapper[4763]: I1205 13:29:50.273948 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ms42j_537c61d5-e548-4c96-b7ed-24fcc061e9ac/cp-frr-files/0.log" Dec 05 13:29:50 crc kubenswrapper[4763]: I1205 13:29:50.280826 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ms42j_537c61d5-e548-4c96-b7ed-24fcc061e9ac/cp-reloader/0.log" Dec 05 13:29:50 crc kubenswrapper[4763]: I1205 13:29:50.293370 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ms42j_537c61d5-e548-4c96-b7ed-24fcc061e9ac/cp-metrics/0.log" Dec 05 13:29:50 crc kubenswrapper[4763]: I1205 13:29:50.326914 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ms42j_537c61d5-e548-4c96-b7ed-24fcc061e9ac/cp-reloader/0.log" Dec 05 13:29:50 crc kubenswrapper[4763]: I1205 13:29:50.568128 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ms42j_537c61d5-e548-4c96-b7ed-24fcc061e9ac/cp-frr-files/0.log" Dec 05 13:29:50 crc kubenswrapper[4763]: I1205 13:29:50.594321 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ms42j_537c61d5-e548-4c96-b7ed-24fcc061e9ac/cp-metrics/0.log" Dec 05 13:29:50 crc kubenswrapper[4763]: I1205 13:29:50.601499 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ms42j_537c61d5-e548-4c96-b7ed-24fcc061e9ac/cp-reloader/0.log" Dec 05 13:29:50 crc kubenswrapper[4763]: I1205 13:29:50.630446 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ms42j_537c61d5-e548-4c96-b7ed-24fcc061e9ac/cp-metrics/0.log" Dec 05 13:29:50 crc kubenswrapper[4763]: I1205 13:29:50.753655 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ms42j_537c61d5-e548-4c96-b7ed-24fcc061e9ac/cp-reloader/0.log" Dec 05 13:29:50 crc kubenswrapper[4763]: I1205 13:29:50.780339 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ms42j_537c61d5-e548-4c96-b7ed-24fcc061e9ac/cp-metrics/0.log" Dec 05 13:29:50 crc kubenswrapper[4763]: I1205 13:29:50.802288 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ms42j_537c61d5-e548-4c96-b7ed-24fcc061e9ac/cp-frr-files/0.log" Dec 05 13:29:50 crc kubenswrapper[4763]: I1205 13:29:50.845489 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ms42j_537c61d5-e548-4c96-b7ed-24fcc061e9ac/controller/0.log" Dec 05 13:29:50 crc kubenswrapper[4763]: I1205 13:29:50.943925 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ms42j_537c61d5-e548-4c96-b7ed-24fcc061e9ac/frr-metrics/0.log" Dec 05 13:29:51 crc kubenswrapper[4763]: I1205 13:29:51.015682 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ms42j_537c61d5-e548-4c96-b7ed-24fcc061e9ac/kube-rbac-proxy/0.log" Dec 05 13:29:51 crc kubenswrapper[4763]: I1205 13:29:51.092387 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ms42j_537c61d5-e548-4c96-b7ed-24fcc061e9ac/kube-rbac-proxy-frr/0.log" Dec 05 13:29:51 crc kubenswrapper[4763]: I1205 13:29:51.275388 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ms42j_537c61d5-e548-4c96-b7ed-24fcc061e9ac/reloader/0.log" Dec 05 13:29:51 crc 
kubenswrapper[4763]: I1205 13:29:51.486509 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-qv5hl_cfa0736f-2856-4cfd-810f-d8fcd2bea7f6/frr-k8s-webhook-server/0.log" Dec 05 13:29:51 crc kubenswrapper[4763]: I1205 13:29:51.656073 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-84d94bbc7d-rf87g_d47b8a4e-ccc5-41e4-855b-86fee8fed449/manager/0.log" Dec 05 13:29:51 crc kubenswrapper[4763]: I1205 13:29:51.867812 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-b69b886bc-52sm5_e3b7dc32-b6b1-4087-9518-da66dd2c1839/webhook-server/0.log" Dec 05 13:29:51 crc kubenswrapper[4763]: I1205 13:29:51.927953 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-2k2k4_6af0da26-fcd3-4eb1-97a2-e5beedf81d5b/kube-rbac-proxy/0.log" Dec 05 13:29:52 crc kubenswrapper[4763]: I1205 13:29:52.612338 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-2k2k4_6af0da26-fcd3-4eb1-97a2-e5beedf81d5b/speaker/0.log" Dec 05 13:29:52 crc kubenswrapper[4763]: I1205 13:29:52.694041 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ms42j_537c61d5-e548-4c96-b7ed-24fcc061e9ac/frr/0.log" Dec 05 13:30:00 crc kubenswrapper[4763]: I1205 13:30:00.181266 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415690-n6g8m"] Dec 05 13:30:00 crc kubenswrapper[4763]: E1205 13:30:00.182184 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9a7bb7d-286a-4046-a017-371a0d0d4789" containerName="extract-utilities" Dec 05 13:30:00 crc kubenswrapper[4763]: I1205 13:30:00.182198 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9a7bb7d-286a-4046-a017-371a0d0d4789" containerName="extract-utilities" Dec 05 13:30:00 crc kubenswrapper[4763]: E1205 13:30:00.182210 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b579ff1-5985-4ccf-9e6a-4af23b482be4" containerName="extract-utilities" Dec 05 13:30:00 crc kubenswrapper[4763]: I1205 13:30:00.182215 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b579ff1-5985-4ccf-9e6a-4af23b482be4" containerName="extract-utilities" Dec 05 13:30:00 crc kubenswrapper[4763]: E1205 13:30:00.182232 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f992e4d2-b6e9-4949-92d4-98ad36e37e1f" containerName="extract-utilities" Dec 05 13:30:00 crc kubenswrapper[4763]: I1205 13:30:00.182241 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f992e4d2-b6e9-4949-92d4-98ad36e37e1f" containerName="extract-utilities" Dec 05 13:30:00 crc kubenswrapper[4763]: E1205 13:30:00.182262 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b579ff1-5985-4ccf-9e6a-4af23b482be4" containerName="registry-server" Dec 05 13:30:00 crc kubenswrapper[4763]: I1205 13:30:00.182269 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b579ff1-5985-4ccf-9e6a-4af23b482be4" containerName="registry-server" Dec 05 13:30:00 crc kubenswrapper[4763]: E1205 13:30:00.182283 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f992e4d2-b6e9-4949-92d4-98ad36e37e1f" containerName="extract-content" Dec 05 13:30:00 crc kubenswrapper[4763]: I1205 13:30:00.182289 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f992e4d2-b6e9-4949-92d4-98ad36e37e1f" containerName="extract-content" Dec 05 
13:30:00 crc kubenswrapper[4763]: E1205 13:30:00.182302 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f992e4d2-b6e9-4949-92d4-98ad36e37e1f" containerName="registry-server" Dec 05 13:30:00 crc kubenswrapper[4763]: I1205 13:30:00.182308 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f992e4d2-b6e9-4949-92d4-98ad36e37e1f" containerName="registry-server" Dec 05 13:30:00 crc kubenswrapper[4763]: E1205 13:30:00.182315 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9a7bb7d-286a-4046-a017-371a0d0d4789" containerName="registry-server" Dec 05 13:30:00 crc kubenswrapper[4763]: I1205 13:30:00.182321 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9a7bb7d-286a-4046-a017-371a0d0d4789" containerName="registry-server" Dec 05 13:30:00 crc kubenswrapper[4763]: E1205 13:30:00.182335 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9a7bb7d-286a-4046-a017-371a0d0d4789" containerName="extract-content" Dec 05 13:30:00 crc kubenswrapper[4763]: I1205 13:30:00.182341 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9a7bb7d-286a-4046-a017-371a0d0d4789" containerName="extract-content" Dec 05 13:30:00 crc kubenswrapper[4763]: E1205 13:30:00.182354 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b579ff1-5985-4ccf-9e6a-4af23b482be4" containerName="extract-content" Dec 05 13:30:00 crc kubenswrapper[4763]: I1205 13:30:00.182359 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b579ff1-5985-4ccf-9e6a-4af23b482be4" containerName="extract-content" Dec 05 13:30:00 crc kubenswrapper[4763]: I1205 13:30:00.182537 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9a7bb7d-286a-4046-a017-371a0d0d4789" containerName="registry-server" Dec 05 13:30:00 crc kubenswrapper[4763]: I1205 13:30:00.182548 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b579ff1-5985-4ccf-9e6a-4af23b482be4" containerName="registry-server" Dec 05 13:30:00 crc kubenswrapper[4763]: I1205 13:30:00.182557 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f992e4d2-b6e9-4949-92d4-98ad36e37e1f" containerName="registry-server" Dec 05 13:30:00 crc kubenswrapper[4763]: I1205 13:30:00.183266 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415690-n6g8m" Dec 05 13:30:00 crc kubenswrapper[4763]: I1205 13:30:00.187194 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 13:30:00 crc kubenswrapper[4763]: I1205 13:30:00.187439 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 13:30:00 crc kubenswrapper[4763]: I1205 13:30:00.197942 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415690-n6g8m"] Dec 05 13:30:00 crc kubenswrapper[4763]: I1205 13:30:00.308218 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4e153e9f-4a34-4270-9e7e-88bd8e402ea5-secret-volume\") pod \"collect-profiles-29415690-n6g8m\" (UID: \"4e153e9f-4a34-4270-9e7e-88bd8e402ea5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415690-n6g8m" Dec 05 13:30:00 crc kubenswrapper[4763]: I1205 13:30:00.308259 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7tbp\" (UniqueName: \"kubernetes.io/projected/4e153e9f-4a34-4270-9e7e-88bd8e402ea5-kube-api-access-p7tbp\") pod \"collect-profiles-29415690-n6g8m\" (UID: \"4e153e9f-4a34-4270-9e7e-88bd8e402ea5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415690-n6g8m" Dec 05 13:30:00 crc kubenswrapper[4763]: I1205 13:30:00.308295 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4e153e9f-4a34-4270-9e7e-88bd8e402ea5-config-volume\") pod \"collect-profiles-29415690-n6g8m\" (UID: \"4e153e9f-4a34-4270-9e7e-88bd8e402ea5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415690-n6g8m" Dec 05 13:30:00 crc kubenswrapper[4763]: I1205 13:30:00.411058 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4e153e9f-4a34-4270-9e7e-88bd8e402ea5-secret-volume\") pod \"collect-profiles-29415690-n6g8m\" (UID: \"4e153e9f-4a34-4270-9e7e-88bd8e402ea5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415690-n6g8m" Dec 05 13:30:00 crc kubenswrapper[4763]: I1205 13:30:00.411098 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7tbp\" (UniqueName: \"kubernetes.io/projected/4e153e9f-4a34-4270-9e7e-88bd8e402ea5-kube-api-access-p7tbp\") pod \"collect-profiles-29415690-n6g8m\" (UID: \"4e153e9f-4a34-4270-9e7e-88bd8e402ea5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415690-n6g8m" Dec 05 13:30:00 crc kubenswrapper[4763]: I1205 13:30:00.411140 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4e153e9f-4a34-4270-9e7e-88bd8e402ea5-config-volume\") pod \"collect-profiles-29415690-n6g8m\" (UID: \"4e153e9f-4a34-4270-9e7e-88bd8e402ea5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415690-n6g8m" Dec 05 13:30:00 crc kubenswrapper[4763]: I1205 13:30:00.412079 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4e153e9f-4a34-4270-9e7e-88bd8e402ea5-config-volume\") pod 
\"collect-profiles-29415690-n6g8m\" (UID: \"4e153e9f-4a34-4270-9e7e-88bd8e402ea5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415690-n6g8m" Dec 05 13:30:00 crc kubenswrapper[4763]: I1205 13:30:00.430493 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4e153e9f-4a34-4270-9e7e-88bd8e402ea5-secret-volume\") pod \"collect-profiles-29415690-n6g8m\" (UID: \"4e153e9f-4a34-4270-9e7e-88bd8e402ea5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415690-n6g8m" Dec 05 13:30:00 crc kubenswrapper[4763]: I1205 13:30:00.451710 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7tbp\" (UniqueName: \"kubernetes.io/projected/4e153e9f-4a34-4270-9e7e-88bd8e402ea5-kube-api-access-p7tbp\") pod \"collect-profiles-29415690-n6g8m\" (UID: \"4e153e9f-4a34-4270-9e7e-88bd8e402ea5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415690-n6g8m" Dec 05 13:30:00 crc kubenswrapper[4763]: I1205 13:30:00.505866 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415690-n6g8m" Dec 05 13:30:00 crc kubenswrapper[4763]: I1205 13:30:00.979435 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415690-n6g8m"] Dec 05 13:30:00 crc kubenswrapper[4763]: W1205 13:30:00.986169 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e153e9f_4a34_4270_9e7e_88bd8e402ea5.slice/crio-612f135ce1c5210574fb09afa2f9c56d6e17c2703e0614763aa7a68e9a0e7cea WatchSource:0}: Error finding container 612f135ce1c5210574fb09afa2f9c56d6e17c2703e0614763aa7a68e9a0e7cea: Status 404 returned error can't find the container with id 612f135ce1c5210574fb09afa2f9c56d6e17c2703e0614763aa7a68e9a0e7cea Dec 05 13:30:01 crc kubenswrapper[4763]: I1205 13:30:01.558501 4763 generic.go:334] "Generic (PLEG): container finished" podID="4e153e9f-4a34-4270-9e7e-88bd8e402ea5" containerID="ae8c5e448e69ffe4aa3d187fa752e1b726dde5a9358409c600e203f0aebb3333" exitCode=0 Dec 05 13:30:01 crc kubenswrapper[4763]: I1205 13:30:01.558718 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415690-n6g8m" event={"ID":"4e153e9f-4a34-4270-9e7e-88bd8e402ea5","Type":"ContainerDied","Data":"ae8c5e448e69ffe4aa3d187fa752e1b726dde5a9358409c600e203f0aebb3333"} Dec 05 13:30:01 crc kubenswrapper[4763]: I1205 13:30:01.558799 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415690-n6g8m" event={"ID":"4e153e9f-4a34-4270-9e7e-88bd8e402ea5","Type":"ContainerStarted","Data":"612f135ce1c5210574fb09afa2f9c56d6e17c2703e0614763aa7a68e9a0e7cea"} Dec 05 13:30:02 crc kubenswrapper[4763]: I1205 13:30:02.937408 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415690-n6g8m" Dec 05 13:30:03 crc kubenswrapper[4763]: I1205 13:30:03.061466 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7tbp\" (UniqueName: \"kubernetes.io/projected/4e153e9f-4a34-4270-9e7e-88bd8e402ea5-kube-api-access-p7tbp\") pod \"4e153e9f-4a34-4270-9e7e-88bd8e402ea5\" (UID: \"4e153e9f-4a34-4270-9e7e-88bd8e402ea5\") " Dec 05 13:30:03 crc kubenswrapper[4763]: I1205 13:30:03.061792 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4e153e9f-4a34-4270-9e7e-88bd8e402ea5-secret-volume\") pod \"4e153e9f-4a34-4270-9e7e-88bd8e402ea5\" (UID: \"4e153e9f-4a34-4270-9e7e-88bd8e402ea5\") " Dec 05 13:30:03 crc kubenswrapper[4763]: I1205 13:30:03.062517 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4e153e9f-4a34-4270-9e7e-88bd8e402ea5-config-volume\") pod \"4e153e9f-4a34-4270-9e7e-88bd8e402ea5\" (UID: \"4e153e9f-4a34-4270-9e7e-88bd8e402ea5\") " Dec 05 13:30:03 crc kubenswrapper[4763]: I1205 13:30:03.063140 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e153e9f-4a34-4270-9e7e-88bd8e402ea5-config-volume" (OuterVolumeSpecName: "config-volume") pod "4e153e9f-4a34-4270-9e7e-88bd8e402ea5" (UID: "4e153e9f-4a34-4270-9e7e-88bd8e402ea5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:30:03 crc kubenswrapper[4763]: I1205 13:30:03.068216 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e153e9f-4a34-4270-9e7e-88bd8e402ea5-kube-api-access-p7tbp" (OuterVolumeSpecName: "kube-api-access-p7tbp") pod "4e153e9f-4a34-4270-9e7e-88bd8e402ea5" (UID: "4e153e9f-4a34-4270-9e7e-88bd8e402ea5"). InnerVolumeSpecName "kube-api-access-p7tbp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:30:03 crc kubenswrapper[4763]: I1205 13:30:03.068384 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e153e9f-4a34-4270-9e7e-88bd8e402ea5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4e153e9f-4a34-4270-9e7e-88bd8e402ea5" (UID: "4e153e9f-4a34-4270-9e7e-88bd8e402ea5"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:30:03 crc kubenswrapper[4763]: I1205 13:30:03.164903 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7tbp\" (UniqueName: \"kubernetes.io/projected/4e153e9f-4a34-4270-9e7e-88bd8e402ea5-kube-api-access-p7tbp\") on node \"crc\" DevicePath \"\"" Dec 05 13:30:03 crc kubenswrapper[4763]: I1205 13:30:03.164938 4763 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4e153e9f-4a34-4270-9e7e-88bd8e402ea5-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 13:30:03 crc kubenswrapper[4763]: I1205 13:30:03.164947 4763 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4e153e9f-4a34-4270-9e7e-88bd8e402ea5-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 13:30:03 crc kubenswrapper[4763]: I1205 13:30:03.576930 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415690-n6g8m" event={"ID":"4e153e9f-4a34-4270-9e7e-88bd8e402ea5","Type":"ContainerDied","Data":"612f135ce1c5210574fb09afa2f9c56d6e17c2703e0614763aa7a68e9a0e7cea"} Dec 05 13:30:03 crc kubenswrapper[4763]: I1205 13:30:03.576995 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="612f135ce1c5210574fb09afa2f9c56d6e17c2703e0614763aa7a68e9a0e7cea" Dec 05 13:30:03 crc kubenswrapper[4763]: I1205 13:30:03.577301 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415690-n6g8m" Dec 05 13:30:04 crc kubenswrapper[4763]: I1205 13:30:04.015883 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415645-77hjz"] Dec 05 13:30:04 crc kubenswrapper[4763]: I1205 13:30:04.027007 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415645-77hjz"] Dec 05 13:30:05 crc kubenswrapper[4763]: I1205 13:30:05.796891 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830" path="/var/lib/kubelet/pods/e2ba7f9f-19e8-4e9f-882f-eb42ac7f1830/volumes" Dec 05 13:30:05 crc kubenswrapper[4763]: I1205 13:30:05.919157 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr_b7d19c02-ed03-4e76-951f-2032e0f23c7a/util/0.log" Dec 05 13:30:06 crc kubenswrapper[4763]: I1205 13:30:06.126013 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr_b7d19c02-ed03-4e76-951f-2032e0f23c7a/util/0.log" Dec 05 13:30:06 crc kubenswrapper[4763]: I1205 13:30:06.201685 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr_b7d19c02-ed03-4e76-951f-2032e0f23c7a/pull/0.log" Dec 05 13:30:06 crc kubenswrapper[4763]: I1205 13:30:06.215080 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr_b7d19c02-ed03-4e76-951f-2032e0f23c7a/pull/0.log" Dec 05 13:30:06 crc kubenswrapper[4763]: I1205 13:30:06.411430 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr_b7d19c02-ed03-4e76-951f-2032e0f23c7a/pull/0.log" Dec 05 13:30:06 crc kubenswrapper[4763]: I1205 13:30:06.438635 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr_b7d19c02-ed03-4e76-951f-2032e0f23c7a/extract/0.log" Dec 05 13:30:06 crc kubenswrapper[4763]: I1205 13:30:06.472047 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkftkr_b7d19c02-ed03-4e76-951f-2032e0f23c7a/util/0.log" Dec 05 13:30:06 crc kubenswrapper[4763]: I1205 13:30:06.804338 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9_552de8ea-aa26-40d4-a360-1eda3664ae62/util/0.log" Dec 05 13:30:06 crc kubenswrapper[4763]: I1205 13:30:06.954133 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9_552de8ea-aa26-40d4-a360-1eda3664ae62/pull/0.log" Dec 05 13:30:06 crc kubenswrapper[4763]: I1205 13:30:06.979805 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9_552de8ea-aa26-40d4-a360-1eda3664ae62/util/0.log" Dec 05 13:30:06 crc kubenswrapper[4763]: I1205 13:30:06.987170 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9_552de8ea-aa26-40d4-a360-1eda3664ae62/pull/0.log" Dec 05 13:30:07 crc kubenswrapper[4763]: I1205 13:30:07.170426 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9_552de8ea-aa26-40d4-a360-1eda3664ae62/pull/0.log" Dec 05 13:30:07 crc kubenswrapper[4763]: I1205 13:30:07.190263 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9_552de8ea-aa26-40d4-a360-1eda3664ae62/extract/0.log" Dec 05 13:30:07 crc kubenswrapper[4763]: I1205 13:30:07.213910 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210r9mh9_552de8ea-aa26-40d4-a360-1eda3664ae62/util/0.log" Dec 05 13:30:07 crc kubenswrapper[4763]: I1205 13:30:07.384808 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw_a6028549-2dc7-44a5-b84c-fb74585f3b85/util/0.log" Dec 05 13:30:07 crc kubenswrapper[4763]: I1205 13:30:07.567265 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw_a6028549-2dc7-44a5-b84c-fb74585f3b85/util/0.log" Dec 05 13:30:07 crc kubenswrapper[4763]: I1205 13:30:07.569702 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw_a6028549-2dc7-44a5-b84c-fb74585f3b85/pull/0.log" Dec 05 13:30:07 crc kubenswrapper[4763]: I1205 13:30:07.583084 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw_a6028549-2dc7-44a5-b84c-fb74585f3b85/pull/0.log" Dec 05 13:30:07 crc kubenswrapper[4763]: I1205 13:30:07.803934 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw_a6028549-2dc7-44a5-b84c-fb74585f3b85/util/0.log" Dec 05 13:30:07 crc kubenswrapper[4763]: I1205 13:30:07.824590 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw_a6028549-2dc7-44a5-b84c-fb74585f3b85/pull/0.log" Dec 05 13:30:07 crc kubenswrapper[4763]: I1205 13:30:07.902685 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83h7jsw_a6028549-2dc7-44a5-b84c-fb74585f3b85/extract/0.log" Dec 05 13:30:08 crc kubenswrapper[4763]: I1205 13:30:08.007142 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-n95pv_9b919307-32f9-4abf-807f-86ef3b67ff55/extract-utilities/0.log" Dec 05 13:30:08 crc kubenswrapper[4763]: I1205 13:30:08.159262 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-n95pv_9b919307-32f9-4abf-807f-86ef3b67ff55/extract-content/0.log" Dec 05 13:30:08 crc kubenswrapper[4763]: I1205 13:30:08.202346 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-n95pv_9b919307-32f9-4abf-807f-86ef3b67ff55/extract-utilities/0.log" Dec 05 13:30:08 crc kubenswrapper[4763]: I1205 13:30:08.230564 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-n95pv_9b919307-32f9-4abf-807f-86ef3b67ff55/extract-content/0.log" Dec 05 13:30:08 crc kubenswrapper[4763]: I1205 13:30:08.343345 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-n95pv_9b919307-32f9-4abf-807f-86ef3b67ff55/extract-content/0.log" Dec 05 13:30:08 crc kubenswrapper[4763]: I1205 13:30:08.375329 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-n95pv_9b919307-32f9-4abf-807f-86ef3b67ff55/extract-utilities/0.log" Dec 05 13:30:08 crc kubenswrapper[4763]: I1205 13:30:08.593955 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m9dpb_cdc48b79-aeea-4cb0-a97f-4d265bb401f6/extract-utilities/0.log" Dec 05 13:30:08 crc kubenswrapper[4763]: I1205 13:30:08.875179 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m9dpb_cdc48b79-aeea-4cb0-a97f-4d265bb401f6/extract-content/0.log" Dec 05 13:30:08 crc kubenswrapper[4763]: I1205 13:30:08.890525 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m9dpb_cdc48b79-aeea-4cb0-a97f-4d265bb401f6/extract-utilities/0.log" Dec 05 13:30:08 crc kubenswrapper[4763]: I1205 13:30:08.927811 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m9dpb_cdc48b79-aeea-4cb0-a97f-4d265bb401f6/extract-content/0.log" Dec 05 13:30:09 crc kubenswrapper[4763]: I1205 13:30:09.143799 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m9dpb_cdc48b79-aeea-4cb0-a97f-4d265bb401f6/extract-utilities/0.log" Dec 05 
13:30:09 crc kubenswrapper[4763]: I1205 13:30:09.171375 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m9dpb_cdc48b79-aeea-4cb0-a97f-4d265bb401f6/extract-content/0.log" Dec 05 13:30:09 crc kubenswrapper[4763]: I1205 13:30:09.184336 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-n95pv_9b919307-32f9-4abf-807f-86ef3b67ff55/registry-server/0.log" Dec 05 13:30:09 crc kubenswrapper[4763]: I1205 13:30:09.439948 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-cqpn4_8fc5438b-109a-4bf8-97a6-d5c49edbc395/marketplace-operator/0.log" Dec 05 13:30:09 crc kubenswrapper[4763]: I1205 13:30:09.699330 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6nvlh_1131b919-ad3a-4a36-a38b-de089ae44458/extract-utilities/0.log" Dec 05 13:30:09 crc kubenswrapper[4763]: I1205 13:30:09.933011 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6nvlh_1131b919-ad3a-4a36-a38b-de089ae44458/extract-content/0.log" Dec 05 13:30:09 crc kubenswrapper[4763]: I1205 13:30:09.963740 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6nvlh_1131b919-ad3a-4a36-a38b-de089ae44458/extract-utilities/0.log" Dec 05 13:30:09 crc kubenswrapper[4763]: I1205 13:30:09.992387 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6nvlh_1131b919-ad3a-4a36-a38b-de089ae44458/extract-content/0.log" Dec 05 13:30:09 crc kubenswrapper[4763]: I1205 13:30:09.992999 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m9dpb_cdc48b79-aeea-4cb0-a97f-4d265bb401f6/registry-server/0.log" Dec 05 13:30:10 crc kubenswrapper[4763]: I1205 13:30:10.141449 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6nvlh_1131b919-ad3a-4a36-a38b-de089ae44458/extract-utilities/0.log" Dec 05 13:30:10 crc kubenswrapper[4763]: I1205 13:30:10.201459 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qr5ln_91b8e8ef-dab2-4d38-aeaa-7659945ef17e/extract-utilities/0.log" Dec 05 13:30:10 crc kubenswrapper[4763]: I1205 13:30:10.221654 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6nvlh_1131b919-ad3a-4a36-a38b-de089ae44458/extract-content/0.log" Dec 05 13:30:10 crc kubenswrapper[4763]: I1205 13:30:10.344085 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6nvlh_1131b919-ad3a-4a36-a38b-de089ae44458/registry-server/0.log" Dec 05 13:30:10 crc kubenswrapper[4763]: I1205 13:30:10.491821 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qr5ln_91b8e8ef-dab2-4d38-aeaa-7659945ef17e/extract-content/0.log" Dec 05 13:30:10 crc kubenswrapper[4763]: I1205 13:30:10.492720 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qr5ln_91b8e8ef-dab2-4d38-aeaa-7659945ef17e/extract-utilities/0.log" Dec 05 13:30:10 crc kubenswrapper[4763]: I1205 13:30:10.492857 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qr5ln_91b8e8ef-dab2-4d38-aeaa-7659945ef17e/extract-content/0.log" Dec 05 13:30:10 crc 
kubenswrapper[4763]: I1205 13:30:10.818655 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qr5ln_91b8e8ef-dab2-4d38-aeaa-7659945ef17e/extract-utilities/0.log" Dec 05 13:30:10 crc kubenswrapper[4763]: I1205 13:30:10.869008 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qr5ln_91b8e8ef-dab2-4d38-aeaa-7659945ef17e/extract-content/0.log" Dec 05 13:30:12 crc kubenswrapper[4763]: I1205 13:30:12.293743 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qr5ln_91b8e8ef-dab2-4d38-aeaa-7659945ef17e/registry-server/0.log" Dec 05 13:30:22 crc kubenswrapper[4763]: I1205 13:30:22.924526 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-zsfpr_e1ccdc7d-9781-4086-b0a7-7a777c943bcb/prometheus-operator/0.log" Dec 05 13:30:23 crc kubenswrapper[4763]: I1205 13:30:23.115968 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-747bcf56cd-56v8c_e723fc3f-3161-40a0-becd-a17210dbd266/prometheus-operator-admission-webhook/0.log" Dec 05 13:30:23 crc kubenswrapper[4763]: I1205 13:30:23.133126 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-747bcf56cd-c6mzm_e25a7208-54a3-4a23-a355-8bbd34b81ace/prometheus-operator-admission-webhook/0.log" Dec 05 13:30:23 crc kubenswrapper[4763]: I1205 13:30:23.328688 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-djkw9_f8052a23-847b-4419-af86-e56c327c367b/operator/0.log" Dec 05 13:30:23 crc kubenswrapper[4763]: I1205 13:30:23.335629 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-zx5r7_1603ef68-55d9-49dc-bbe4-93b129fe1b29/perses-operator/0.log" Dec 05 13:30:54 crc kubenswrapper[4763]: I1205 13:30:54.740263 4763 scope.go:117] "RemoveContainer" containerID="1f49393cf8dd15fb9a0ef3c2f07928ae7b982f41d3ca42b9521d10ecd102d23d" Dec 05 13:32:07 crc kubenswrapper[4763]: I1205 13:32:07.544250 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 13:32:07 crc kubenswrapper[4763]: I1205 13:32:07.544728 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 13:32:21 crc kubenswrapper[4763]: I1205 13:32:21.138278 4763 generic.go:334] "Generic (PLEG): container finished" podID="ffffebf6-452b-4be3-bfbf-8b1d1a6c4533" containerID="07fda1d92b4d191beb7a0fc53089db00bb92eaa09ba7c94bc8db6b73d809dbfc" exitCode=0 Dec 05 13:32:21 crc kubenswrapper[4763]: I1205 13:32:21.138384 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hksp6/must-gather-ncx8z" event={"ID":"ffffebf6-452b-4be3-bfbf-8b1d1a6c4533","Type":"ContainerDied","Data":"07fda1d92b4d191beb7a0fc53089db00bb92eaa09ba7c94bc8db6b73d809dbfc"} Dec 05 13:32:21 crc kubenswrapper[4763]: I1205 13:32:21.139541 4763 
scope.go:117] "RemoveContainer" containerID="07fda1d92b4d191beb7a0fc53089db00bb92eaa09ba7c94bc8db6b73d809dbfc" Dec 05 13:32:21 crc kubenswrapper[4763]: I1205 13:32:21.289071 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hksp6_must-gather-ncx8z_ffffebf6-452b-4be3-bfbf-8b1d1a6c4533/gather/0.log" Dec 05 13:32:32 crc kubenswrapper[4763]: I1205 13:32:32.861324 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hksp6/must-gather-ncx8z"] Dec 05 13:32:32 crc kubenswrapper[4763]: I1205 13:32:32.862201 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-hksp6/must-gather-ncx8z" podUID="ffffebf6-452b-4be3-bfbf-8b1d1a6c4533" containerName="copy" containerID="cri-o://f5fc28e02ad48f2b95fc4e8158479b1fb8d1cadfd4d9803f6d39432f9aafd761" gracePeriod=2 Dec 05 13:32:32 crc kubenswrapper[4763]: I1205 13:32:32.873291 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hksp6/must-gather-ncx8z"] Dec 05 13:32:33 crc kubenswrapper[4763]: I1205 13:32:33.269879 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hksp6_must-gather-ncx8z_ffffebf6-452b-4be3-bfbf-8b1d1a6c4533/copy/0.log" Dec 05 13:32:33 crc kubenswrapper[4763]: I1205 13:32:33.270749 4763 generic.go:334] "Generic (PLEG): container finished" podID="ffffebf6-452b-4be3-bfbf-8b1d1a6c4533" containerID="f5fc28e02ad48f2b95fc4e8158479b1fb8d1cadfd4d9803f6d39432f9aafd761" exitCode=143 Dec 05 13:32:33 crc kubenswrapper[4763]: I1205 13:32:33.270846 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e96f69543577c842b0bab40c02e67d083c0beeeada751c42eecb08e358c1b4b3" Dec 05 13:32:33 crc kubenswrapper[4763]: I1205 13:32:33.307628 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hksp6_must-gather-ncx8z_ffffebf6-452b-4be3-bfbf-8b1d1a6c4533/copy/0.log" Dec 05 13:32:33 crc kubenswrapper[4763]: I1205 13:32:33.308621 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hksp6/must-gather-ncx8z" Dec 05 13:32:33 crc kubenswrapper[4763]: I1205 13:32:33.437059 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zchqh\" (UniqueName: \"kubernetes.io/projected/ffffebf6-452b-4be3-bfbf-8b1d1a6c4533-kube-api-access-zchqh\") pod \"ffffebf6-452b-4be3-bfbf-8b1d1a6c4533\" (UID: \"ffffebf6-452b-4be3-bfbf-8b1d1a6c4533\") " Dec 05 13:32:33 crc kubenswrapper[4763]: I1205 13:32:33.437124 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ffffebf6-452b-4be3-bfbf-8b1d1a6c4533-must-gather-output\") pod \"ffffebf6-452b-4be3-bfbf-8b1d1a6c4533\" (UID: \"ffffebf6-452b-4be3-bfbf-8b1d1a6c4533\") " Dec 05 13:32:33 crc kubenswrapper[4763]: I1205 13:32:33.458069 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffffebf6-452b-4be3-bfbf-8b1d1a6c4533-kube-api-access-zchqh" (OuterVolumeSpecName: "kube-api-access-zchqh") pod "ffffebf6-452b-4be3-bfbf-8b1d1a6c4533" (UID: "ffffebf6-452b-4be3-bfbf-8b1d1a6c4533"). InnerVolumeSpecName "kube-api-access-zchqh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:32:33 crc kubenswrapper[4763]: I1205 13:32:33.539333 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zchqh\" (UniqueName: \"kubernetes.io/projected/ffffebf6-452b-4be3-bfbf-8b1d1a6c4533-kube-api-access-zchqh\") on node \"crc\" DevicePath \"\"" Dec 05 13:32:33 crc kubenswrapper[4763]: I1205 13:32:33.628161 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffffebf6-452b-4be3-bfbf-8b1d1a6c4533-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "ffffebf6-452b-4be3-bfbf-8b1d1a6c4533" (UID: "ffffebf6-452b-4be3-bfbf-8b1d1a6c4533"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:32:33 crc kubenswrapper[4763]: I1205 13:32:33.640688 4763 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ffffebf6-452b-4be3-bfbf-8b1d1a6c4533-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 05 13:32:33 crc kubenswrapper[4763]: I1205 13:32:33.797750 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffffebf6-452b-4be3-bfbf-8b1d1a6c4533" path="/var/lib/kubelet/pods/ffffebf6-452b-4be3-bfbf-8b1d1a6c4533/volumes" Dec 05 13:32:34 crc kubenswrapper[4763]: I1205 13:32:34.283185 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hksp6/must-gather-ncx8z" Dec 05 13:32:37 crc kubenswrapper[4763]: I1205 13:32:37.544407 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 13:32:37 crc kubenswrapper[4763]: I1205 13:32:37.545281 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 13:32:54 crc kubenswrapper[4763]: I1205 13:32:54.852877 4763 scope.go:117] "RemoveContainer" containerID="d06054db88ee6e91310ac385615ba56df10973b9e7f567c9da4f85cdcc0c2299" Dec 05 13:32:54 crc kubenswrapper[4763]: I1205 13:32:54.873723 4763 scope.go:117] "RemoveContainer" containerID="07fda1d92b4d191beb7a0fc53089db00bb92eaa09ba7c94bc8db6b73d809dbfc" Dec 05 13:32:54 crc kubenswrapper[4763]: I1205 13:32:54.998347 4763 scope.go:117] "RemoveContainer" containerID="f5fc28e02ad48f2b95fc4e8158479b1fb8d1cadfd4d9803f6d39432f9aafd761" Dec 05 13:33:07 crc kubenswrapper[4763]: I1205 13:33:07.544389 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 13:33:07 crc kubenswrapper[4763]: I1205 13:33:07.544959 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 
13:33:07 crc kubenswrapper[4763]: I1205 13:33:07.545039 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" Dec 05 13:33:07 crc kubenswrapper[4763]: I1205 13:33:07.545807 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3f799c2e793a1b2bd6da0778b0f00795b959c2faf61439fc3c57c8d3e913b237"} pod="openshift-machine-config-operator/machine-config-daemon-xpgln" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 13:33:07 crc kubenswrapper[4763]: I1205 13:33:07.545862 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" containerID="cri-o://3f799c2e793a1b2bd6da0778b0f00795b959c2faf61439fc3c57c8d3e913b237" gracePeriod=600 Dec 05 13:33:08 crc kubenswrapper[4763]: I1205 13:33:08.621413 4763 generic.go:334] "Generic (PLEG): container finished" podID="96338136-6831-49d0-9eb9-77d1205c6afb" containerID="3f799c2e793a1b2bd6da0778b0f00795b959c2faf61439fc3c57c8d3e913b237" exitCode=0 Dec 05 13:33:08 crc kubenswrapper[4763]: I1205 13:33:08.621516 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerDied","Data":"3f799c2e793a1b2bd6da0778b0f00795b959c2faf61439fc3c57c8d3e913b237"} Dec 05 13:33:08 crc kubenswrapper[4763]: I1205 13:33:08.622095 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerStarted","Data":"d96fa2b30107c04606b664ce368114e48099c46078e23fe103ebec19c39ed773"} Dec 05 13:33:08 crc kubenswrapper[4763]: I1205 13:33:08.622125 4763 scope.go:117] "RemoveContainer" containerID="4fbfcf01e5e6efe52192419ee523afb90c3fb0f297785aa0b70f01b32f38e5b9" Dec 05 13:33:55 crc kubenswrapper[4763]: I1205 13:33:55.057495 4763 scope.go:117] "RemoveContainer" containerID="69cc92063efa3216ef9d633331d008ac9a9b441475d3ba941292c828e1948c6b" Dec 05 13:34:32 crc kubenswrapper[4763]: I1205 13:34:32.180074 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" podUID="43f191ee-e0a3-4d9e-a63a-c9b7a626806f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.98:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 13:34:32 crc kubenswrapper[4763]: I1205 13:34:32.180100 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-controller-manager-7d6cc4d8dc-wf9nm" podUID="43f191ee-e0a3-4d9e-a63a-c9b7a626806f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.98:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 13:35:07 crc kubenswrapper[4763]: I1205 13:35:07.543922 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 13:35:07 crc kubenswrapper[4763]: I1205 
13:35:07.544605 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 13:35:37 crc kubenswrapper[4763]: I1205 13:35:37.544643 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 13:35:37 crc kubenswrapper[4763]: I1205 13:35:37.545219 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 13:36:07 crc kubenswrapper[4763]: I1205 13:36:07.544095 4763 patch_prober.go:28] interesting pod/machine-config-daemon-xpgln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 13:36:07 crc kubenswrapper[4763]: I1205 13:36:07.544855 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 13:36:07 crc kubenswrapper[4763]: I1205 13:36:07.544921 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" Dec 05 13:36:07 crc kubenswrapper[4763]: I1205 13:36:07.546393 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d96fa2b30107c04606b664ce368114e48099c46078e23fe103ebec19c39ed773"} pod="openshift-machine-config-operator/machine-config-daemon-xpgln" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 13:36:07 crc kubenswrapper[4763]: I1205 13:36:07.546494 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" containerName="machine-config-daemon" containerID="cri-o://d96fa2b30107c04606b664ce368114e48099c46078e23fe103ebec19c39ed773" gracePeriod=600 Dec 05 13:36:08 crc kubenswrapper[4763]: E1205 13:36:08.388643 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:36:08 crc kubenswrapper[4763]: I1205 13:36:08.425496 4763 generic.go:334] "Generic (PLEG): container finished" podID="96338136-6831-49d0-9eb9-77d1205c6afb" 
containerID="d96fa2b30107c04606b664ce368114e48099c46078e23fe103ebec19c39ed773" exitCode=0 Dec 05 13:36:08 crc kubenswrapper[4763]: I1205 13:36:08.425563 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" event={"ID":"96338136-6831-49d0-9eb9-77d1205c6afb","Type":"ContainerDied","Data":"d96fa2b30107c04606b664ce368114e48099c46078e23fe103ebec19c39ed773"} Dec 05 13:36:08 crc kubenswrapper[4763]: I1205 13:36:08.425619 4763 scope.go:117] "RemoveContainer" containerID="3f799c2e793a1b2bd6da0778b0f00795b959c2faf61439fc3c57c8d3e913b237" Dec 05 13:36:08 crc kubenswrapper[4763]: I1205 13:36:08.426537 4763 scope.go:117] "RemoveContainer" containerID="d96fa2b30107c04606b664ce368114e48099c46078e23fe103ebec19c39ed773" Dec 05 13:36:08 crc kubenswrapper[4763]: E1205 13:36:08.426917 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:36:18 crc kubenswrapper[4763]: I1205 13:36:18.785115 4763 scope.go:117] "RemoveContainer" containerID="d96fa2b30107c04606b664ce368114e48099c46078e23fe103ebec19c39ed773" Dec 05 13:36:18 crc kubenswrapper[4763]: E1205 13:36:18.785872 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:36:29 crc kubenswrapper[4763]: I1205 13:36:29.791547 4763 scope.go:117] "RemoveContainer" containerID="d96fa2b30107c04606b664ce368114e48099c46078e23fe103ebec19c39ed773" Dec 05 13:36:29 crc kubenswrapper[4763]: E1205 13:36:29.793293 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:36:43 crc kubenswrapper[4763]: I1205 13:36:43.787952 4763 scope.go:117] "RemoveContainer" containerID="d96fa2b30107c04606b664ce368114e48099c46078e23fe103ebec19c39ed773" Dec 05 13:36:43 crc kubenswrapper[4763]: E1205 13:36:43.788682 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:36:58 crc kubenswrapper[4763]: I1205 13:36:58.784834 4763 scope.go:117] "RemoveContainer" containerID="d96fa2b30107c04606b664ce368114e48099c46078e23fe103ebec19c39ed773" Dec 05 13:36:58 crc kubenswrapper[4763]: E1205 13:36:58.785693 4763 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb" Dec 05 13:37:11 crc kubenswrapper[4763]: I1205 13:37:11.784671 4763 scope.go:117] "RemoveContainer" containerID="d96fa2b30107c04606b664ce368114e48099c46078e23fe103ebec19c39ed773" Dec 05 13:37:11 crc kubenswrapper[4763]: E1205 13:37:11.785590 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpgln_openshift-machine-config-operator(96338136-6831-49d0-9eb9-77d1205c6afb)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpgln" podUID="96338136-6831-49d0-9eb9-77d1205c6afb"